How young people become extreme online

"Even within extremist circles, some were surprised at how certain platforms spread their content," says a journalist who followed extremist groups on the internet.

For seven years, My Vindgren was part of online extremist communities. The experience turned into a film about the radicalisation of youth.

In the early hours of August 24, Tamima Nibras Juhar, aged 34, was killed while working at an institution for children and youth in Oslo. 

Both the Norwegian Police Security Service (PST) and the Oslo police had previously received warnings about the 18-year-old man now charged with the murder, according to the Norwegian Broadcasting Corporation (NRK). He is said to have expressed far-right views.

He told police the killing was politically motivated. The accused was also a member of a chat group run by the far-right party Alliansen, reports Filter Nyheter (link in Norwegian).

In recent decades, far-right extremists have carried out several attacks in the West. Many of them were radicalised through social media and online platforms, according to Folk og forsvar.

Why and how does this happen? And what can be done to prevent online extremism?


Searching for meaning in a chaotic world

Professor Jonas Kunst at BI Norwegian Business School researches conspiracy theories, extremism, and digital influence.

He highlights two main factors behind radicalisation. The first relates to the individual.

Those drawn to extremism often experience strong emotions such as anger, fear, mistrust, and a sense of injustice, according to Kunst.

"They feel a strong urge to find meaning in a chaotic world that they don't understand. Extremist groups and conspiracy theories offer simple answers to complex problems. They also provide powerful enemy images," Kunst said at a public event in Arendal recently.

Belonging to such groups can also create a sense of being special, of knowing something others don't.

Shown content that reinforces your attitudes

The second factor has to do with social media. Algorithms are designed to ensure we spend as much time there as possible.

"They're optimised for engagement, not truth. And we engage most with content we agree with. Content with strong emotions and moralism is prioritised and spread with the help of algorithms, which amplifies their effect," said Kunst.

When a young person goes onto social media with strong feelings – for instance about immigration – algorithms begin reinforcing those emotions from day one. The content shown gradually becomes more radical.

"A person’s beliefs and algorithms together create a self-reinforcing spiral," said Kunst.

Young people are the most vulnerable

Algorithms work the same way for every social media user; the difference is that young people spend far more time on the platforms. They use the internet to find information and make sense of the world.

"Young people are especially vulnerable because they're still searching for an identity and a place in society," said Kunst.

Swedish journalist My Vindgren infiltrated extremist online communities using fake profiles.

She spent seven years making the documentary Hacking Hate, which explores how social media is used to spread hate and radicalise youth.

The youths radicalised themselves

One of Vindgren's fake profiles ended up in a TikTok group with boys aged 12-13. They talked about how paedophiles were bad people who should be exposed. Before long, they moved the conversation to Discord, a platform for gaming and chatting.

"It took only a few days before they began planning violence against paedophiles. And their definition of paedophiles was very flexible. Suddenly, it wasn't just older men abusing young girls, but also Muslims, transgender people, and gay men," Vindgren said at the event.

The young boys were not lured in or recruited by extremists. Everything happened on their own initiative, according to Vindgren.

"I was shocked by how fast it went from a simple chat to a group with terrorist ideas. And these were not vulnerable youths. The boys came from middle-class backgrounds, and several had parents in the Swedish armed forces," said Vindgren.

Here's the trailer for the film My Vindgren made about extremist online communities.

Move from platform to platform

Øyvind Strømmen is a journalist and author of the new book Hat. Fortellinger om ytre høyre (Hate. Stories of the Far Right).

He reminded the audience that far-right use of the internet is not new. It began with bulletin boards – a kind of digital notice board – in the 1980s. Since then, extremists have moved between smaller, niche spaces and mainstream platforms with many users.

"Whenever one platform introduces restrictions against hate speech, they simply move on to another," said Strømmen.

Extremist groups tap into discontent and opposition to causes ranging from wind power to the EU in order to spread their message. 

Extremists were shocked

My Vindgren also gained access to a group of far-right leaders.

"They were surprised they hadn't been banned, and that they could get hundreds of thousands of views in just a few hours. Even within extremist circles, they were a bit shocked at how certain platforms spread their content," she said.

Øyvind Strømmen believes many, especially adults over 40, still treat the internet as if it were separate from daily life.

"But the internet is part of our reality. When hate against queer people emerges online, that hate also exists on the street," he said.


Building resistance through understanding

Jonas Kunst argues that resilience should be strengthened through digital literacy.

"How do algorithms actually work? What does that mean for the content I see? And above all, who profits from them? These are things we need to learn," he said.

There are also strategies that can make people less vulnerable online. One of them is a kind of 'mental vaccination,' known in research as psychological inoculation.

People are shown small amounts of false information. They are then told why the information is wrong and which tricks were used to mislead them. The goal is that when they later encounter similar claims online, they will recognise the manipulation.

"Research shows this works. But it's not enough to do it just once. We need to practice this regularly, just like we need booster vaccines. Otherwise, the protection disappears," said Kunst.

Big Tech must be held accountable

But educating individuals is not enough.

Tech companies must be held accountable and regulated, Vindgren argued.

Kunst also believes the focus should be directed at the platforms.

"We should demand that they explain their algorithms," he said.

He had several suggestions for measures that could make people more aware of what they're seeing online and why. 

Simple steps

Users could, for example, be given the option to choose whether they want to see content based on what they've previously viewed, or sorted by other criteria such as date. AI-generated content should be clearly labelled. 

Social media could also prompt users before sharing something, asking whether they have actually read or watched it. Twitter once had such a feature, before the platform was bought by Elon Musk.

"The most effective solution is to change the algorithms so that the most extreme and polarising content is deprioritised. But this is a hard sell to tech companies, since that's how they make money," said Kunst.

The power of human connection

Still, the strongest safeguard against radicalisation is human contact, according to Strømmen.

"We need to build strong communities and preserve meeting places. The best antidote to extremism is meeting people who are different from yourself and engaging with perspectives that are different from your own," he said.

Strømmen believes we have to acknowledge an uncomfortable truth.

"People in Norway often forget that liberal democracy has enemies. And those enemies exploit every opportunity to spread their ideas. We need to take that seriously," he said.

———

Translated by Alette Bjordal Gjellesvik

Read the Norwegian version of this article on forskning.no

