Voices: I used to love conspiracy theories – here’s why we get seduced by them

Theories that one would associate with cranks and devotees of David Icke were formative to my early years (Getty Images/iStockphoto)

In my teens, I, like many people who lacked a social life and had access to the internet, became obsessed with conspiracy theories. Whether it was the existence of shape-shifting reptilians, or the insistence that the third tower of the World Trade Center fell because of a controlled demolition (after all, jet fuel doesn’t melt steel beams) – these theories that one would associate with cranks and devotees of David Icke were formative to my early years.

At its most intense, my obsession became a fixation on “proving” that the 2005 London bombings were a hoax – one staged by the British government to keep justifying its wars in Iraq and Afghanistan and to build a system of control and surveillance. I was convinced of this, in part, because I had come across a documentary on a not particularly well known site called “YouTube”. While the video had all the hallmarks of a conspiracy theory video (including references to shape-shifting lizards), I hadn’t seen anything quite like it.

The documentary masterfully spliced together clips from BBC documentaries with animated graphics, all graded with a futuristic aesthetic and set to dark, dramatic techno. Indeed, it was far removed from anything I’d ever watched on Panorama. It was more like the MTV music videos I’d only ever see once my parents had gone to bed. Which is to say that its appeal had far less to do with the loose arguments it advanced than with its value as niche entertainment.

I’ve been thinking about the documentary again, after the Biden administration launched the “Disinformation Governance Board” – a body that aims to combat online disinformation considered harmful. The move raised legitimate fears of granting more surveillance powers to unelected big technology companies, and added to ongoing debates about free speech on the internet.

But, to my mind, this professionalisation of disinformation monitoring misunderstands its appeal.

As the technology writer Joe Bernstein pointed out in a recent essay, while the involvement of government agencies, NGOs and think tanks in disinformation reporting may be a response to a very real problem, it frames that problem as a technological one. They assume that people accidentally fall down rabbit holes by being exposed to the wrong kind of information. In this analysis, “fixing” the problem requires more regulation, more government oversight and, of course, a greater ability to integrate surveillance and security technology into everyday internet usage.

Yet this analysis of disinformation is flawed on a more basic level. By assuming that disinformation can be fought with the right balance of fact-checks and regulation, it misunderstands its appeal to most of its creators and consumers. A recent study looked at this problem by examining the search terms used by Facebook users who were temporarily cut off from the platform. It showed that those users went elsewhere to actively seek out similar kinds of misinformation – in this case, relating to Covid-19. While the study sampled a relatively small number of users within a limited time frame, its findings match those of similar studies on search patterns, and challenge conventional assumptions about disinformation.

In simpler terms, it suggests that more regulation might, at best, slow the spread of misinformation and propaganda. But it’s unlikely that even the harshest regulations would diminish misinformation’s appeal – in fact, a supply shortage might mean that the websites and forums still peddling it end up accruing more power and value online.

More recent studies also suggest that while humans tend to be good at identifying “misinformation”, that doesn’t necessarily mean they won’t share and spread it. Since misinformation often exploits people’s fears and vulnerabilities, its effectiveness lies not in its accuracy but in how well it stirs up their emotions. It is important to understand the proliferation of disinformation not just as a regulatory problem, but as one that exists because there is a demand for it.


When I think back to the 7/7 documentary, its appeal to me lay not in the veracity of its claims but in the fact that it told an entertaining story – one that felt coherent. And it was a story that assuaged my feelings of insecurity and fear as a young Muslim male in the aftermath of the attacks. It helped me make sense of a world that I didn’t feel I was part of.

When I talk to friends and colleagues who share misinformation, I have found it more effective to appeal to their feelings of insecurity and fear than to attempt to fact-check them. Granted, this doesn’t always work. Getting people out of destructive rabbit holes requires a great deal of compassion and patience. But I have found it more productive to counter the allure of conspiracy theories with stories that express compassion, love and genuine care towards them.

It’s likely that, had this approach been taken with me in my teens, I’d have been far less angry and confused. With that in mind – and knowing that growing economic and social instability will create ever more fertile soil for attractive disinformation and conspiracies to flourish – it’s not enough simply to impose sterner forms of moderation.

We will need to tell far better, more convincing and – most importantly – hopeful stories too.