‘Bot holiday’: Covid disinformation down as social media pivot to Ukraine

When David Fisman tweets, he often receives a deluge of hate within moments of posting. Fisman, an epidemiologist and physician, has been outspoken about Covid and public health.

Even when he tweets something innocuous – once, to test his theory, he wrote the banal statement “kids are remarkable” – he still receives a flood of angry pushback.


But in recent days, Fisman noticed an “astounding” trend, he said. He posted about topics like requiring vaccination and improving ventilation to prevent the spread of Covid – and the nasty responses never came. No support for the trucker convoy, no calls to try the Canadian prime minister, Justin Trudeau, for treason.

Others have observed the same phenomenon; those who frequently encounter bots or angry responses are now seeing a significant drop-off. Covid misinformation, which has often trended on social media over the past two years, seems to be taking a nosedive.

The reasons for this “bot holiday”, as Fisman calls it, are probably varied – but many of them point to the Russian invasion of Ukraine.

Russia’s information war with western nations seems to be pivoting to new fronts, from vaccines to geopolitics.

And while social media has proven a powerful tool for Ukraine – with images of Zelenskiy striding through the streets of Kyiv and tractors pulling abandoned Russian tanks – growing campaigns of misinformation around the world could change the conflict’s narrative, and the ways the world reacts.

The likely reasons for the shift in online chatter are many. Russia began limiting access to Twitter on Saturday, sanctions have been levied against those who could be financing disinformation sites and bot farms, and social media companies are more attuned to banning bots and accounts spreading misinformation during the conflict.

But something more coordinated may also be at play.

Conspiracy theories around the so-called “New World Order” – a loosely defined belief that shadowy global elites run the world – have converged narrowly on Ukraine, according to emerging research.

“There’s actually been a doubling of New World Order conspiracies on Twitter since the invasion,” said Joel Finkelstein, the chief science officer and co-founder of the Network Contagion Research Institute, which maps online campaigns around public health, economic issues and geopolitics.

At the same time, “whereas before the topics were very diverse – it was Ukraine and Canada and the virus and the global economy – now the entire conversation is about Ukraine,” he said. “We’re seeing a seismic shift in the disinformation sphere towards Ukraine entirely.”

Online activity has surged overall by 20% since the invasion, and new hashtags have cropped up around Ukraine that seem to be coordinated with bot-like activity, Finkelstein said. Users pushing new campaigns frequently tweet hundreds of times a day and can catch the eye of prominent authentic accounts.

“We can’t say for certain that Russia is behind this or that it contributes directly to the propagation of these messages. But it’s pretty difficult to believe that it’s not involved,” Finkelstein said. The topics are strikingly similar to Russian talking points: that the Ukrainian president, Volodymyr Zelenskiy, is controlled by the west, and that Nato should be dissolved.

A Russian bot farm produced 7,000 accounts to post fake information about Ukraine on social media and messaging apps including Telegram, WhatsApp and Viber, according to the Security Service of Ukraine.

And influencers who previously demonstrated against vaccines are now turning their support to Russia.

Social media users may see a topic trending and not realize its connection to conspiracy theories or disinformation campaigns, said Esther Chan, Australia bureau editor for First Draft, an organization that researches misinformation.

“A lot of social media users may just use these terms because they’re trending, they sound good,” she said. “It’s a very clever sort of astroturfing strategy that we’ve seen in the past few years.”

The topics pushed by troll farms and Russian state media are often dictated by Russian officials, said Mitchell Orenstein, a professor of Russian and east European studies at the University of Pennsylvania and a senior fellow of the Foreign Policy Research Institute.

In this case, it seems “their orders got changed because priorities shifted”, he said.

Russia has coordinated significant misinformation campaigns to destabilize western countries on topics including the 2016 US election and the pandemic, according to several reports.

Inauthentic accounts do not create genuine hesitancy and belief on their own. But they amplify harmful messages and make pushback seem more widespread than it is.

“They’ve had tremendous success with social media platforms,” Orenstein said. “They play a pretty substantial role and they do shift people’s perception about what opinion is.”

Fake accounts will frequently link to “pink slime” or low-credibility sites that once carried false stories about the pandemic and are now shifting focus to Ukraine, said Kathleen Carley, a professor at Carnegie Mellon University.

“The bots themselves don’t create news – they’re more used for amplification,” she said.

These sites frequently sow division on controversial issues, research finds, and they make it more difficult to spot disinformation online.

The escalation of narratives like these could have wide-ranging consequences for policy.

“Right now, we’re in the beginning of a war that has a consensus, right? It’s clear that what Russia’s doing is against the moral order of the modern world. But as the war becomes prolonged, and people become exhausted, that may change,” Finkelstein said.

As “we enter into more unknown territory, these narratives will have a chance to grow … it gives us a window into what these themes are going to be like.”

The research around these changing campaigns is limited, looking at thousands of tweets in the early days of the invasion, Carley cautioned. It’s too early to know what direction the misinformation is taking and who is behind it – and conspiracy theories tend to follow current events even when there are no coordinated campaigns.

And “that does not mean that all the disinformation, all the conspiracy theories about Covid are not still there,” she said. “I would not say the bots are on holiday. They have been re-targeted at different stories now, but they’ll be back.”

Misinformation campaigns around the New World Order can quickly morph depending on the target, giving them more longevity than some other conspiracy theories. “They probably will still exist for a long time,” Chan said. “The question for us is whether they would have an impact on people – on real life and also on policymaking.”

It may be too soon to say what narratives will take hold during the invasion of Ukraine, but leaders should understand what terms are emerging in conspiracy theories and disinformation campaigns so they don’t inadvertently signal support for the theories in their public statements, she said.

“They need to take note of what terms are commonly used and try to avoid them,” Chan said.

A global agreement on how to address misinformation or disinformation would be key, Carley said.

“Each country does it separately. And the thing is, because we’re all connected very tightly throughout the world in social media, it doesn’t matter that one country has some strong reactions because it’ll still go from another country’s machines on to your machines,” she said.

Such rules would also need to have teeth to prevent further campaigns, she said. And educating the public about how to parse misinformation and disinformation is also important. “We need to start investing better in critical thinking and digital media literacy.”