
Facebook’s QAnon debunk initiative glitches, accidentally sends users to conspiracy theory information

Supporters of Donald Trump wear QAnon-branded clothes at a rally (Getty Images)

Facebook’s initiative to warn people about the dangers of the QAnon conspiracy theory instead directed users to QAnon information when they were not looking for it.

A “glitch” caused people to see information about the conspiracy theory when they were searching for “unrelated terms”.

“When we first launched the Redirect Initiative for QAnon today there was a glitch that caused people to see information about this topic when they searched for unrelated terms. We’ve paused this Redirect while we fix the issue”, Facebook announced in a tweet.

It is unclear when the initiative will be reinstated.

Facebook’s ‘Redirect Initiative’ directs users to information from the Global Network on Extremism and Technology (GNET), the academic research network of the Global Internet Forum to Counter Terrorism, when they search for QAnon terms.

The QAnon conspiracy theory claims that President Donald Trump is fighting a shadowy cabal of Satanist pedophiles in the US government, and that this information is being leaked in coded posts on online forums by a mysterious ‘Q’.

Security researcher Mark Burnett, who examined QAnon “codes”, has said that they are “not actual codes, just random typing”, based on an analysis of the letters and numbers used in the posts.

The FBI recently described “conspiracy theory-driven domestic extremists” as a growing threat, citing the violent actions of some of their believers.

The conspiracy theorists have risen in prominence because of their support for President Trump, who has refused to condemn the fringe group.

Facebook has made numerous efforts to remove QAnon content from its platform, with varying levels of success.

In August, it removed over 790 Groups, 100 Pages and 1,500 ads tied to QAnon. Additional restrictions were placed on over 1,950 Groups and 440 Pages on Facebook, as well as over 10,000 Instagram accounts.

At the start of October, Facebook announced that it was banning all adverts in support of QAnon; previously, such content had been down-ranked by the algorithm but the adverts themselves remained on the site.

At the time, Facebook said it would direct users to “credible” child safety resources should they come across QAnon hashtags, because the conspiracy theorists use phrases such as “#savethechildren” to recruit and organise.

One week later, Facebook banned all QAnon accounts, Pages, and Groups from its platforms, which include Instagram.

Facebook is not the only social media company that has had to take action against QAnon. Twitter launched a crackdown on the QAnon conspiracy theory, citing its "potential to lead to offline harm", taking down 7,000 accounts and reducing the visibility of 150,000 more.

YouTube made similar decisions, saying it would prohibit material targeting a person or group with conspiracy theories that have been used to justify violence.

TikTok also put in place new policies against QAnon, as well as the antisemitism and misinformation peddled by the conspiracy theory, but The Independent found that many accounts and hashtags remained operational even after the restrictions were introduced.

Online shopping site eBay is also hosting QAnon-related merchandise, including neck warmers, banners, and earrings.
