It’s hard to know if the misinformation affected the election, but whether it did or not, it got ugly. The candidate was subjected to a slanderous campaign — anonymous authors accused him of fathering an illegitimate child; of paying women for sex; of being controlled by a hostile foreign power. His rival’s team denied all involvement, but they certainly knew it was happening, and did little or nothing to stop it.
We worry an awful lot about misinformation in the social media age: lies spread through a combination of shadowy puppetmasters in smoke-filled rooms and clueless Boomers on Facebook.
As the US election looms, tech companies have started taking note: cracking down on QAnon conspiracy theories about paedophile rings; prompting users to read stories before sharing them; flagging questionable material and partisan sources.
On Wednesday, the chief executives of Twitter, Facebook and Google all appeared before the US Senate to face questions about their companies’ role in spreading fake news, and their responsibility for suppressing it. A complicating factor has been that one of the biggest sources of misinformation is the US President’s own Twitter account.
But it is worth remembering that misinformation is not new. The story I told in the first paragraph is not a tale of the social media age — in fact it predates the social media era altogether. The slandered candidate was Senator John McCain, running for the Republican nomination for the 2000 presidential election. Anonymous leaflets supporting his opponent, George W Bush, accused McCain of fathering a lovechild during his time in a Vietnamese PoW camp, and of having been brainwashed while there into being a Manchurian candidate. McCain lost; Bush went on to be president.
As long as there have been elections, there have been attempts to subvert them with malicious lies. Hunter S Thompson told a (possibly apocryphal) story about Lyndon B Johnson, in a Senate race in Texas, telling his campaign manager to spread a rumour that his opponent had had sex with a pig. No one’s going to believe that, the manager replied. “I know,” said Johnson, according to Thompson. “But let’s make the son of a bitch deny it.”
The key difference in the social media age, says Rob Ford, a professor of political science at the University of Manchester, is that we can see it happening. The fact that it isn’t new doesn’t mean that it isn’t important, of course. But there are limits to how much it can affect things, in this election at least. That’s because almost everyone has already decided how they will vote.
In the US, there just aren’t many swing voters. The politics analysis site FiveThirtyEight estimates that just seven per cent of voters changed who they voted for between the 2012 and 2016 elections; a larger percentage are theoretically persuadable, but Ford thinks it’s no more than 20 per cent or so. The rest, by and large, will ignore negative news about their preferred candidate.
There’s just not much room for misinformation to affect the outcome. Yes, if the margins are tight it could theoretically make a difference, but this year that looks less likely. The people who are sharing Hunter Biden memes are almost all going to vote for Trump anyway.
These fears about misinformation are pervasive, though.
The Cambridge Analytica stories from 2016’s election and the Brexit vote are a case in point — the company seems to have been a bit grubby, but it doesn’t appear to have done very much that, say, Obama’s 2012 campaign didn’t, in terms of data gathering and analysis. But for the losing side, it offers an excuse: it’s easier to believe that you lost because the election was rigged than because lots of people voted for something you don’t like. In the increasingly likely event of a Trump defeat next week, his supporters will probably use postal voting conspiracy theories as a similar crutch.
Most importantly, while we worry about misinformation on social media, the traditional media is still quite capable of misinforming people all on its own.
In 2016, the piece of information that had the biggest impact on the outcome was not some shady targeted Facebook post, but a front-page story in the New York Times overblowing the importance of Hillary Clinton’s use of a non-governmental email account.
That might have changed the whole outcome. Social media misinformation may be shiny and new, but the old kind works just as well.
Tom Chivers is science editor at UnHerd