How should online hate and misinformation be dealt with? When Wiley launched a tirade of anti-Semitism, his social media accounts were removed, as were those of hate actors like Katie Hopkins and Alex Jones in recent years.
In response to the growing following of the QAnon conspiracy theory, Twitter has deleted over 7,000 accounts dedicated to it. Some deride this as a sign of an emerging “cancel culture” and an attack on free speech, but we should also ask whether deplatforming actually works.
Earlier this year, David Icke was the king of a profitable conspiracy empire. Within weeks of the pandemic reaching the US and UK, he had become the single greatest producer of coronavirus misinformation anywhere in the world.
As an organisation focussed on tackling the growing online myths about coronavirus, we could see Icke was a huge problem, but targeting him for deplatforming was not an easy decision. We asked ourselves in advance: “Is this going to work, or might it backfire?”
After all, publicly challenging those who spread hate and lies comes with the risk that you might just give them fresh exposure, making the problem worse, not better. The Center for Countering Digital Hate’s own research on online trolling shows that engaging with a claim in order to refute it can often help amplify and entrench it, both as a result of the technology and human psychology.
That said, there are consequences to inaction, too. Icke’s poisonous misinformation about coronavirus had already been viewed 30 million times.
Every week his social media accounts attracted another 22,000 followers. And all of this was helping make Icke and the tech giants who both powered and profited from him millions of dollars in revenue.
Obviously David Icke has the right to spout whatever nonsense he likes, but he doesn’t have the right to an audience of millions. Freedom of speech does not mean freedom of reach.
The key was to aim our call at the tech giants themselves, highlighting their moral failing in enabling Icke and demanding that they deplatform him and take away the megaphone they had given him.
We knew this approach could work from the research that our friends at Hope Not Hate conducted following the removal of Tommy Robinson’s anti-Muslim hatred from mainstream platforms, as well as research from the academics J.M. Berger and Jonathon Morgan showing that the suspension of extremist Twitter accounts “limited the ISIS network’s ability to grow and spread”.
Now our own investigation has laid bare the devastating effect deplatforming has had on Icke’s ability to spread his coronavirus misinformation and anti-Semitic hate.
Icke is now having to rely on BitChute, a YouTube alternative for the far right, where he has just 42,000 subscribers compared to the 890,000 he had on YouTube before his ban.
His use of BitChute makes for a case study in why deplatforming works. Since 2017, Icke has backed up all of his YouTube videos to BitChute, allowing for a direct comparison of how hate actors fare when they are robbed of the vast reach that tech giants can give them.
Our report examined 64 of Icke’s YouTube videos that had been viewed 9.6 million times – the same videos on BitChute have been viewed just 430,000 times. On average, just 6,711 people watch Icke’s BitChute videos, compared to 150,000 before his ban from YouTube.
Importantly, another nine videos that had been viewed on YouTube 3.9 million times were not backed up to BitChute. It means that the spread of these videos, including one that was YouTube’s most popular video promoting the “Agenda 21” conspiracy that the UN is intentionally depopulating the world, has been halted completely.
There are encouraging signs that Icke is reaching fewer people through his network of collaborators too. When our report launched in April, most people who had viewed Icke’s coronavirus misinformation had done so on the London Real YouTube channel, where his videos attracted 15 million views. While that channel still exists, all of its videos of Icke have been removed.
It shows that deplatforming works – even when it is partial. Unfortunately, Instagram and Twitter have yet to follow Facebook and YouTube in removing Icke and his content.
Icke, of course, has complained that our campaign was just another part of the global Zionist conspiracy arrayed against him.
The truth is that professional conspiracy theorists like him will always come up with new conspiracy theories.
The difference is that there are now 1.6 million fewer subscribers listening to him, and you can’t believe a conspiracy theory if you’ve never heard it.
Imran Ahmed is CEO of the Center for Countering Digital Hate
This article originally appeared on HuffPost UK and has been updated.