How could Russia and China fuel conspiracy theories about Kate?

Kate, the Princess of Wales, has announced she's been diagnosed with cancer. (Sipa US/Alamy Live News)

The government has reportedly said that the UK's adversaries – including Russia, China and Iran – might be deliberately spreading misinformation and conspiracy theories about the Princess of Wales.

A significant amount of unfounded rumour, gossip and speculation has circulated on platforms such as Facebook, X and TikTok since Kate underwent abdominal surgery in January, which required her to step back from royal engagements while she recovered.

Quoting an unnamed Whitehall source, The Telegraph said there were growing concerns that China, Russia and Iran might be involved in spreading bizarre conspiracy theories about the princess in an effort to destabilise Britain.

The source said: "Part of the modus operandi of hostile states is to destabilise things – whether that is undermining the legitimacy of our elections or other institutions."

Rumours around Kate intensified after her husband, William, pulled out of attending a memorial service at Windsor Castle for his godfather, the late King Constantine of Greece, for an unspecified "personal matter". Kensington Palace confirmed on Saturday that William's absence from the service was due to Kate's diagnosis.

The rumour mill reached fever pitch earlier this month when Kate apologised after her office released a photograph of her with her three children to mark Mother's Day on 10 March – an image she had edited herself.

How could state actors be involved?

Spreading conspiracy theories and other disinformation is a familiar tactic of Russian 'troll farms', which have pushed falsehoods about issues such as the war in Ukraine and COVID-19 vaccines.

A government report into a ‘troll farm’ unearthed in 2022 said that Russian operatives worked across social media platforms including Telegram, Twitter, Facebook and TikTok.

The operatives use VPNs to make it appear they are posting from different countries. They often comment on stories rather than posting them in order to drive reach, and they amplify 'organic' content – such as news stories or YouTube videos made by other people – that happens to align with the trolls' objectives.

Kate was last seen at the Christmas service at Sandringham. (PA/Alamy)

These methods enable trolls to operate effectively without being detected, the UK government said.

James McQuiggan, security awareness advocate at KnowBe4, told Yahoo News that the use of social media by cybercriminals, state-backed attackers and conspiracy theorists has been an ongoing issue, particularly over the past decade.

He said: "With Brexit and the 2016 US election, social media was used to spread the message with inaccurate information, and often unchecked, that opposing sides of the misinformation campaigners spent more time refuting their claims than proposing their own.

"Historically, state actors have used conspiracy theories to divert attention from domestic issues, discredit opposition, or manipulate public opinion for their gains. They may fabricate or amplify narratives that align with their strategic interests, exploiting existing societal tensions or crises to make these theories more plausible to the public.

McQuiggan said that state actors commonly use social media to spread conspiracy theories widely, "employing bots, trolls, and fake accounts to create the illusion of grassroots support for these narratives".

Prince William is known to be fiercely protective of his family's privacy – for example, he was furious when a French magazine published topless photographs of Kate, taken from a long distance, in 2012.

Royal commentators believe the manner of Kate's video message was both a reflection of the recent speculation and an attempt to shut it down. But conspiracy theories have continued in the wake of her announcement, with some commentators falsely claiming the video was artificial intelligence (AI)-generated, or disputing the accuracy of what she said.

What can tech platforms do?

Elon Musk has waved goodbye to thousands of X employees since taking over, with some arguing this has harmed the company's ability to combat disinformation. (Getty Images)

Imran Ahmed, chief executive of the US Center for Countering Digital Hate, told the BBC that the conspiracy theories highlight the 'inhumanity' of social media.

He said: “I think it’s the inhumanity of the way that social media has made us behave, forcing people to talk about things that can be very deeply personal.

“And also seeing of course the impact of that on our society, how quickly it was picked up by millions of people, and how much it’s done damage to the Royal Family themselves.”

In September last year, the Washington Post reported that, after Twitter laid off thousands of staff and rolled back misinformation rules following Elon Musk's takeover in October 2022, social networks such as YouTube and Facebook also stopped labelling posts repeating the unfounded claim that the 2020 US election was 'stolen' – a sign that the companies are abandoning their most aggressive measures for combating misinformation.

McQuiggan said: "Social media companies have been scrutinised for their role in spreading conspiracy theories, leading to increased pressure to address the issue. For example, platforms like Facebook, Twitter, and YouTube have taken steps to remove or restrict content related to specific conspiracy theories, such as QAnon, and have banned accounts that consistently promote disinformation.

"Some technology efforts include deploying algorithms designed to reduce such content's visibility and introducing fact-checking features and partnerships with third-party fact-checkers to verify information and label or remove false content."

But in a year when dozens of countries – including the UK and US – are holding elections, concerns will doubtless mount over whether these social media companies are doing enough.
