Extremists, such as far-right nationalists, ISIS, and QAnon conspiracy theorists, are often considered to hold irrational beliefs. Drawing on recent but underexplored ideas from academic philosophy, Finlay Malcolm identifies three ways in which extremist beliefs might be irrational, and suggests how some of these ideas might be useful for research and policy on deradicalisation.
It is often said that extremists are irrational. Perhaps they are drawn to unfounded conspiratorial thinking or prone to view the world in terms of simplistic racist narratives. This failure of rationality is also thought to be one of the contributing factors that leads people into extremism in the first place.
But is it right to say that extremists are irrational? And if they are, in what ways might they be irrational?
One way to begin addressing these questions – as I have done in my research on religious fundamentalism – is by drawing on current ideas in contemporary epistemology. For the most part, these ideas have not yet been used to explore and critique extremism, so turning to this work can open up new ways of understanding extremist actions and ideologies, and can contribute to the discussion on deradicalisation.
Belief and Evidence
There are two ways of thinking about rationality: first, what it is rational to do; and second, what it is rational to believe. With respect to the first kind of rationality, there is a highly pertinent and interesting question about whether the use of extreme methods to achieve political goals – what Quassim Cassam calls ‘methods extremism’ – is successful. As Richard English says with respect to terrorism, there seems to be no simple answer to the question. But those are questions about what it is rational to do. Here, I will talk about the second kind of rationality: the rationality of extremist beliefs.
Whilst there is also no simple answer as to whether extremist beliefs are rational, one place to begin is with the thought that extremist beliefs might be irrational if they are not based on adequate evidence. Brenton Tarrant, the shooter in the Christchurch Mosque attacks in 2019, appears to have believed in, and been motivated by, “The Great Replacement” conspiracy theory. But the theory is unsupported by evidence, so Tarrant’s beliefs were not supported by evidence, and in this respect they were irrational. A similar point might apply to any extremist who believes in the QAnon conspiracy theory, since there is no real evidential support for it.
But there may yet be something rational about someone who holds beliefs without sufficient evidence to back them up. What if someone lives in an echo chamber – either digitally, in their community, or a combination of both – where everyone believes the same thing, where leaders within the community advocate the beliefs, and where opposing views are filtered out? Even if what is believed is as radical as QAnon or The Great Replacement, you could be thought of as rational in your beliefs in this kind of situation. After all, you have been given no reason to doubt what you believe, and the community authorities laud the views as those that should be believed. Aren’t these people believing rationally? It’s worth exploring exactly how people can be rational in such circumstances.
Echo Chambers and Insular Communities
An extreme example of an insular, echo chamber-like context is an isolated commune such as the Lyman family, where the children within the community were raised to believe they were going to live on Venus, or the family of Tara Westover, whose father told his children that civilisation would end at the turn of the Millennium. The young children raised in these isolated communities were not irrational for believing without evidence. All the evidence they had pointed to the truth of what they believed – everyone around them believed the same, the supposed authority figures endorsed the views, and no opposing views were present. So, there are ways of rationally believing an extreme view even when it is not supported by good evidence.
We might think that some extremist groups structurally resemble these kinds of communes. For instance, some communities in the Middle East where ISIS proliferated may have been informationally isolated, and so unable to access different views and opinions. Perhaps this is part of the explanation for the rise of the Taliban, whose support grew in largely rural communities in Afghanistan. In a comparable way, people who spend most of their time communicating online can curate the information they consume so that it agrees with their pre-existing beliefs. In this way, they create a kind of insular echo chamber.
What is different, though, between children raised in an insular commune or isolated community, and someone in a QAnon online echo chamber, is that in the first case, the people are doing the best with what they have and are following reasonable norms of belief: they trust the authorities they have and believe according to the evidence at their disposal. But in the case of the online community, the people intentionally screen out relevant alternative viewpoints, and in so doing, engage in vicious kinds of closed-mindedness and wishful thinking.
So, as some have argued, echo chambers and other insular communities are not themselves essentially irrational contexts – it is how they are used in closed-minded ways that is problematic. These are issues of process rather than evidence: what matters is how evidence and information for or against a view is treated – in this case, by failing to challenge or critique it, and by seeking out only information that justifies a preferred position. In this sense, an extremist’s beliefs can also be irrational if the extremist screens out relevant views that oppose those they want to believe, and only imbibes those that agree with their pre-existing opinions. In this way, they fail to follow processes that would lead them to gather reasonable evidence and, ultimately, to form rational beliefs. So, this is another way of being irrational.
But the idea that people structure the information they receive in accordance with wishful thinking and closed-mindedness suggests a third way of critiquing extremist beliefs: in terms of the intellectual character traits with which people form and hold on to their beliefs. Extremists often have distinct psychologies, including preoccupations with purity, strong grievances and a sense of victimhood, and salient emotions of anger and resentment. But they can also be closed-minded, inclined to wishful thinking, prejudicial, negligent, gullible, or intellectually insouciant. Forming beliefs on the basis of these vicious intellectual character traits can also generate irrational beliefs. For instance, someone who is careless or inattentive to detail may fail to check the facts properly and, as a consequence, form false beliefs. Bad or vicious intellectual character is usually understood as the set of psychological traits that lead people to form or hold on to false beliefs, and so prevent them from acquiring knowledge. This can include biases, which lead people to make prejudicial and incorrect judgments about people and situations, as well as a general carelessness for facts and truth, which heightens someone’s inclination to believe wishfully or conspiratorially.
How this plays out in the thinking of extremists will vary from case to case. But if we consider some of the far-right nationalism in the UK and other European countries in recent years, we can see a strong tendency towards grievance and anger. For instance, the protests and political movements established by Tommy Robinson in the UK in recent decades have been motivated by a deep and abiding Islamophobia. People joining this political movement harbour strong emotions of hatred and preoccupations with racial purity. In turn, these emotions and preoccupations can dispose people toward biased thinking – believing what confirms their pre-existing views – toward closed-mindedness about the viewpoints of others, toward rigidity in how they view the world, and toward conformity to the views of authority figures who have limited credibility. Each of these is a problematic intellectual character trait that acts as a conduit through which beliefs are formed and sustained, and in each case, can make those beliefs irrational. Charity groups have been set up in the UK to present counternarratives that can lead people away from such vicious thinking and support people in leaving the far right.
There is an interesting potential connection between insular communities and poor intellectual character: the more time someone spends in an insular, echo chamber-like environment, the more likely it seems that they will develop bad character traits, such as prejudice, closed-mindedness and negligence – and so the two can form a vicious circle.
Intellectual character can play a role in bringing people to radical extremism, and so preventing bad intellectual character is also pertinent to preventing radicalisation. In the UK, the government’s Prevent strategy aims to stop people from becoming radicalised into terrorist extremism. Highlighting the role of influential leaders and their use of powerful narratives, the strategy acknowledges that ‘Many people who have been radicalised here have been significantly influenced by propagandists for terrorism’ (§5.33). Some of the widely recognised features of a person’s psychology that make them susceptible to such terrorist propagandising include a sense of disaffection or victimhood, or emotions of grievance and anger, as well as the ‘search for identity, meaning and community’ (§5.22). However, this leaves aside the intellectual component of one’s susceptibility to propaganda, including gullibility, carelessness and conformity. For example, the ‘underwear bomber’ Umar Farouk Abdulmutallab had been radicalised by listening to the online sermons of Anwar al-Awlaki. But whilst he may have felt disaffected and in need of belonging, he was also too easily convinced by al-Awlaki – too intellectually gullible. Many terrorists operate within groups that conform to the views of the leader; others won’t consider the possibility that their views might be wrong; and some don’t think to check or question the claims presented to them. All of these are matters of intellectual vice, and so can help make sense of the radicalisation process, but they have not yet been applied to our understanding of terrorism and extremism. Focusing on some of these issues of intellectual character may open up new ways, in research and policy, of preventing extremism.
People often assume that extremists are not rational in what they believe. But there are different ways of understanding this idea. Someone can be irrational by believing against the evidence, by screening out relevant alternative views in closed-minded ways (as in echo chambers), or by forming beliefs through bad intellectual character. Yet some extremists may be completely rational if they have lived within an insular community where everyone believes the same and alternative views have been filtered out. These differing positions, taken from recent work in philosophy, provide a helpful framework for considering the rationality of extremism. They also open up new ways of thinking about how to prevent radicalisation.
About this blog:
Welcome to the “Right Now!” blog where you will find commentary, analysis and reflection by C-REX’s researchers and affiliates on topics related to contemporary far right politics, including party politics, subcultural trends, militancy, violence, and terrorism.
The Center for Research on Extremism, C-REX, is a cross-disciplinary center for the study of right-wing extremism, hate crime and political violence. It is a joint collaboration with five of the leading Norwegian institutions on extremism research, hosted by the Faculty of Social Sciences at the University of Oslo.