Experts have seen a huge increase in Islamist and far-right propaganda on social media as extremists try to exploit a captive audience during lockdown.
Milo Comerford, from the Institute for Strategic Dialogue (ISD), said zealots have tried to take advantage of the “chaos and uncertainty” caused by the global pandemic to spread extremist messages.
In a paper published as part of the annual Global Terrorism Index report, he said one pro-Islamic State (IS) network on Facebook used “a web of several hundred accounts” to “expand” its spread of propaganda between April and July.
It used tactics to avoid the content being removed, including disguising videos by opening with legitimate news footage before cutting to extremist material.
Covid-related hashtags were used on Islamist messaging to draw in unsuspecting users, and one anti-vaccine Facebook page with thousands of followers was hijacked by IS supporters.
Mr Comerford told the PA news agency: “It shouldn’t be a surprise that these bad actors are trying to use the chaos and the uncertainty around Covid to spread their extremist messaging.
“It’s pretty fertile ground for people spreading really divisive, violent, supremacist messaging. We’ve seen this across the ideological spectrum.
“We’ve seen a major proliferation of extremist content during the lockdown and that’s partly because there’s a lot more of a captive audience.”
Researchers from the ISD tracked the hijacking of popular coronavirus hashtags in Arabic by extremists, finding more than 500,000 views of pandemic-related IS video content on Twitter.
Last week the head of UK counter-terrorism policing, Neil Basu, raised concerns about the increased risk of teenagers being radicalised while spending more time online during lockdown.
Figures showed the vast majority of under-18s arrested for terrorist offences in the UK last year – 10 out of 12 – had expressed extreme far-right views.
While Islamist terrorists remain the greatest threat to Britain, Mr Basu said the far right is growing fastest.
The Global Terrorism Index report also said that while deaths from terrorism globally fell 15% to 13,826 in 2019, the number of far-right attacks in Western countries had increased since 2014.
Mr Comerford said the following of far-right channels on encrypted messaging platform Telegram rose sharply during the pandemic.
One white supremacist channel grew by more than 6,000 users in March, while another specifically focused on Covid-19 grew from 300 users to 2,700.
Mr Comerford said: “You’ve certainly got a hardcore of white supremacists, violent activists, on Telegram. We saw a huge expansion of those during lockdown. We saw huge user growth across some of the key channels used to spread terrorist content.
“We saw these channels explode with thousands of extra users during this period. We saw big spikes in groups that were specifically tailored towards exploiting the pandemic.
“Telegram is a big area for mobilisation. That’s notable because Isis used the platform for a long time and then was de-platformed last year by Telegram. But they certainly haven’t got a handle yet on the white supremacist far-right problem.”
While social media companies have boosted efforts to remove extremist content, he believes simply taking down content will never solve the problem.
“There has been a considerable response by social media companies to malign actors seeking to exploit the pandemic, with platforms recognising their policies and terms of service aren’t necessarily fit for purpose, especially in dealing with the health disinformation aspect of all this.
“They’ve had to massively tighten up their policies and their enforcement mechanisms, with this period really showing those vulnerabilities. It’s not going to be enough to just take down content, you really have to understand more holistically the way that these platforms are being abused by extremists.
“It’s really cast a light on that and hopefully we will see more movement around addressing the more systemic issues rather than just taking down individual bits of harmful content which won’t ever solve this problem, ultimately.”
A spokesman for Facebook said: “Violent extremist groups and dangerous organisations have no place on our platforms. In the last quarter we removed 9.7 million pieces of content for violating our terrorism policies, 99% before it was reported to us.
“However, we recognise content removal alone is not enough, which is why we have a dedicated team of more than 350 people focused on working with experts in law enforcement, counter-terrorism intelligence and academic studies in radicalisation to keep people safe on our platforms.
“We also study new trends in speech and adversarial behaviour related to violence, to ensure we stay on the pulse of how different groups evolve over time.
“We help fund The Global Network on Extremism and Technology which provides the latest intelligence and we have commissioned independent research into the relationship between Covid-19 and extremism.”
Telegram and Twitter have also been approached for comment.