Warning: that isn’t Rishi Sunak – how deepfake AI could swing the election

Don't believe your eyes: a still from a deepfake video of Prime Minister Rishi Sunak - Channel 4

Just over a week before the General Election, millions of voters still don’t know which way to jump. But consider this for a moment: what if their vote and your vote could be influenced not by force of argument, clever rhetoric or striking policy promises, but something altogether more sinister?

This is the first “AI election” in this country, one in which disinformation generated by artificial intelligence could jeopardise the entire democratic process. So to find out how big a threat this poses, Channel 4 decided to set up a unique experiment. For two weeks in April, we went to Southend-on-Sea, a bellwether seat on the Essex coast, where we picked 12 households of different ages and backgrounds, all with one unifying characteristic: they were undecided voters.

Our mission was to use a mixture of real political messages alongside AI-generated fake material, deepfakes and general disinformation to “push” them to vote one way or another. Participants were divided into two groups of six: one group was shown fake content designed to swing them towards voting Labour, the other fake content designed to swing them towards voting Conservative. None of them knew that they would be seeing AI-generated or deepfake material – they believed they had simply been asked to watch and react to a variety of political material they’d be likely to see during an election campaign.

Using generative AI – technology that can be “trained” to come up with new content – we created a series of videos in which we replaced the real Rishi Sunak and Keir Starmer with deepfake versions. We had the “Prime Minister” announce an entirely false policy of a £35 charge for GP appointments. And the “Labour leader” was seen announcing an equally spurious promise to put asylum seekers at the top of the list for social housing.

As all diligent political strategists do, we’d found out what made our “voters” tick, and then set about micro-targeting them with content that might appeal to their interests and ideology. Sam was passionate about the NHS. Lucy cared about housing. For Mona, it was the economy first and then public services; for Shani, education. And Teresa worried about immigration. We knew which buttons to press.

Carl Miller, co-founder of the Centre for the Analysis of Social Media, was one of the “brains trust” of experts we assembled to advise us on our experiment. He explained how provoking an emotional response would be crucial. “This whole campaign just has to drive at the very basic lizard-brain that people have,” he said. “So it might be things they find very funny, very outrageous or very fearful, and it’s that kind of impulsive reaction to those kinds of messages that will really cause the influence to happen. Fear is great, if you can use fear. That’s a very reliable mechanism where you can actually begin to change the likely behaviour that someone’s going to have.”

To bolster our messages, we created our own social media app, replicating the kind of echo chambers that exist on the likes of X and Facebook, where opinions and disinformation are repeated without challenge, reinforcing and amplifying them. The app also contained genuine political material.

Miller told us how crucial it was to make an emotional connection. “Any influence really only works when it connects with someone’s emotional and social worldview. It’s going to be much easier with each of these different individuals and families to push them downhill, using those kinds of biases.”

We had a few more weapons in our AI armoury too. Remember Gordon Brown’s “hot-mic” moment in the 2010 election campaign, when he called Gillian Duffy a “bigoted woman” because of her views on immigration? Anjula Singh, director of communications for the Labour Party during the 2019 election, told us she reckoned that gaffe “absolutely” had an impact. So we set about creating a fake hot-mic moment in which a sweary, artificially constructed Rishi Sunak appeared to “reveal” his plans to privatise the NHS.

Channel 4's Dispatches: Can AI Steal Your Vote? - Kalel Productions

And, in the interests of balance, we played our 12 households an entirely fictitious “gotcha” clip of Starmer saying he could “promise these guys whatever they want to hear… Won’t matter once we’re in power, you know. The election’s in the bag so we can tell them anything”.

Our coup de grâce was a fake celebrity endorsement. We enlisted actor Stephen Fry, who allowed us to use AI to manipulate what he was saying. One set of voters would see him endorsing Labour; the other the Conservatives. Younger voters in our experiment saw a different celebrity – comedian Romesh Ranganathan – pop up in the app we’d created.

All of this – the deepfakes, the hot-mic gaffes and the fake celebrity endorsements – played out over a fascinating fortnight in Southend. But worryingly enough, something very similar is already happening in the real world. A year ago, a deepfake video of Hillary Clinton was aired in which she was seen to endorse the Republican Governor of Florida, Ron DeSantis, for president. “So, people might be surprised to hear me say this, but I actually like Ron DeSantis a lot,” she was seen saying. “Yeah, I know. I’d say he’s just the kind of guy this country needs, and I really mean that.”

Watching it now, it looks somewhat primitive, hardly plausible enough to fool many voters. But as Miller warned: “It’s actually really good evidence of how quickly this technology is changing.” AI is developing at an alarming pace, with policy-makers and regulators trailing in Big Tech’s wake. For global democracy, the consequences are potentially frightening.

When a pro-Palestinian march was permitted to go ahead on Remembrance Day last year, serious disorder broke out between far-right protesters and police. Twenty-four hours earlier, AI-generated audio of London Mayor Sadiq Khan had been shared on social media. Put together to sound like a secret recording, the fake clip saw him apparently disparaging Remembrance weekend and calling for pro-Palestinian marches to take precedence. A fake audio clip of Starmer supposedly swearing at staff also spread rapidly on social media last year before fact-checkers called it out.

Cathy Newman on Channel 4's Dispatches: Can AI Steal Your Vote?

I know from personal experience how corrosive this all is. I reported earlier this year on a deepfake porn video that had been made of me. And this kind of technology is being deployed in the political sphere too. In 2022, weeks before elections in Northern Ireland, politician Cara Hunter was confronted with an AI-generated deepfake video of herself performing graphic sexual acts. The clip went viral and she was inundated with sexual and violent messages from men around the globe.

In the final days of the election campaign, the potential ramifications are enormous.

Singh explained: “Disinformation can travel very fast, and if there’s a huge volume of it as well, it could take just one small thing, perfectly timed before an election, to upend a whole campaign.”

Marcus Beard, who led the government’s digital counter-misinformation strategy and is an expert in AI, told us: “We saw it happen in the Slovakian election. That was a deepfake bit of audio claiming a candidate had rigged the election. And because it was two days before votes were cast, there wasn’t enough time to debunk it.”

Once our experiment was done, it was time to ask our undecided voters to make a choice – and then to reveal to them how they’d been manipulated. In the secrecy of the polling booth, how many of them had we managed to push into voting the way we wanted, dividing neatly into our two groups?

You’ll have to watch the programme on Thursday for the big reveal. But it’s no spoiler to say that it’s a sobering reminder of the ease with which malign actors, hostile states or just a rival party could, if we let them, exert control, turning democracy upside down. And the onus is on us all to stop and think before we trust what we see online.


Dispatches: Can AI Steal Your Vote? is on Channel 4 at 8pm on Thursday 27 June