'A perfect platform': internet's abyss becomes a far-right breeding ground

No depth goes unplumbed on the far-right forum 8chan. Its threads reveal a seething, toxic mass of rabid antisemitism, neo-Nazism, Islamophobia, gratuitous violence, coded inside jokes and conspiratorial ravings published by anonymous users.

Nothing has changed in the days since the alleged gunman, the 28-year-old Australian Brenton Tarrant, came to 8chan boasting of the imminent massacre in Christchurch. Posts have since praised Tarrant as a “hero” and called for copycat attacks, or else denounced him as a pawn in a false flag conspiracy.

8chan, which describes itself as the “darkest reaches of the internet”, is just one of a series of online forums populated by the extremist right. Studies in the UK suggest far-right forums are growing in number, giving a bigger platform to violent, racist messaging.

It raises the question: why are users flocking to the internet’s abyss? And what role do such forums play in radicalising those on the far right, alongside newspapers, broadcasters, YouTube, Twitter and Facebook?


There are no simple answers to such questions. The Griffith University terrorism expert Geoff Dean has spent a large chunk of his career attempting to understand these forums, their role in radicalisation and the ways security agencies can identify high-risk individuals – those most likely to move from rhetoric to action. “It’s about all I think about,” he told Guardian Australia.

The starting point for coming to far-right online groups, whether fringe forums or those on Facebook, is an attraction to extreme views, he says: a tendency to see the world in black and white, and an inability to consider opposing views.

That could be encouraged by exposure to material in traditional media or from the divisive rhetoric of governments.

“A lot of people have extreme views; they’re black-and-white thinkers,” Dean said. “It’s very simplistic. It’s either right or wrong. They don’t have any sophisticated way of knowing it, they believe the fake news.

“They get exposure in the mainstream media. It’s very easy for them to become attracted to a Facebook group, or a forum, or a page that reinforces that. People love to have their own ideas validated because it gives them a sense of self-worth and self-identity.”

The forums are given potency by their ability to provide a sense of social connectedness among like-minded individuals, their legitimisation of otherwise repugnant beliefs, and their tendency to become “echo chambers”, places where contrary views are not expressed in any form.

“All you’re doing is breathing the same stale air as the other people,” Dean said. “That deepens their commitment, because if people spend a lot of time on forums like that, what happens is the repetitive going over things hardens their neural pathways to only think in extreme ways.

“People become attracted to it, they become obsessed by it, then they get fixated. At that point of fixation, that’s the most worrying point, because they get enough incitement from these forums to say ‘well now I’m going to do something about it’.”

There can be little doubt that Facebook, Twitter and YouTube have a problem with extreme far-right messaging. Studies have shown, time and again, that the algorithms employed by YouTube and Facebook push conspiracy theories and far-right propaganda into users’ feeds.


A report from the research institute Data & Society last year found YouTube had become a breeding ground for far-right radicalisation. Users already interested in conservative or libertarian ideas were exposed to white supremacist or extreme nationalist content, the report found, with the messaging promulgated by the algorithm YouTube uses to recommend videos to individual users. The report also described academics, media commentators and celebrities using the platform to promote a wide range of rightwing political positions.

“Discussing images of the ‘alt-right’ or white supremacism often conjures a sense of the ‘dark corners of the internet’,” the report said. “In fact, much extremist content is happening front and centre, easily accessible on platforms like YouTube, publicly endorsed by well-resourced individuals and interfacing directly with mainstream culture.”

Extreme far-right content is similarly pervasive on Facebook. A 2016 study by Griffith University researchers, including Dean, examined the online presence of “new-style” radical far-right groups in Australia and found their Facebook messaging was reaching tens of thousands of Australians. Reclaim Australia, for example, had 63,593 Facebook subscribers at the time, and the United Patriots Front had 27,348.

In January, YouTube announced it would “begin reducing recommendations of borderline content that could misinform users in harmful ways”. It cited flat-earth theories, 9/11 conspiracies and phoney miracle cures as examples.

But the Christchurch terror attack shows conspiratorial and far-right content is still spreading across YouTube and Facebook in considerable volume.

The Australian National University radicalisation expert, Clarke Jones, has extensively studied extremism in its various forms. He says the process of radicalisation is complex, and cannot be attributed to any single platform.

“To find an individual pathway or to blame one particular aspect is not really effective,” Jones told Guardian Australia. “You’ve got to be able to take a step back and look at what’s taken place from many different fronts.”

Jones says far-right radicalisation is caused by a “melting pot” of factors, including exposure to far-right messaging online. Radicalisation, he says, is as much about individual psychology and the mind’s resilience as it is about exposure to extremist content.

“Most people would have resilience to work through that or say what they feel on social media, as we’ve seen over the last three days on both sides since the attacks,” he said. “But there are those who are less resilient, and who act it out.”


In a speech to the Australia-Israel Chamber of Commerce on Monday, the prime minister, Scott Morrison, reminded social media companies that they “carry social responsibilities”.

The government has already signalled it will seek to find ways to prevent Facebook from transmitting video of the kind livestreamed in the Christchurch terror attack.

Morrison is also reportedly planning to push for the G20 to take urgent action to clamp down on platforms that facilitate or allow hate speech.

Joe Burton, a senior lecturer at the University of Waikato, said reining in such platforms was a critical lesson to be learned in the wake of the Christchurch attack. But the task is difficult: when such forums are shut down, he said, they simply “go dark”, moving behind encrypted platforms, out of the reach of security agencies and tech companies.

Nonetheless, Burton sees it as a problem that must be addressed to combat far-right extremism. “The spread of fake news and propaganda on the internet creates a perfect platform to increase fear, anger and anxiety,” he wrote in a piece for the Conversation. “These are the psychological conditions from which acts of violence are committed.”