Government will ‘look more broadly at social media’ after riots – Starmer
The Government will have to “look more broadly at social media” after recent rioting, the Prime Minister has said in an apparent hint that further regulation could be considered.
False information spread on social media about the identity of the alleged Southport knife attacker has been seen as playing a role in sparking the recent violence, leading some to call for tougher regulation.
Mayor of London Sadiq Khan has been among those calling for reform, saying recent events have shown regulations due to come into force under the Online Safety Act are “not fit for purpose”.
Asked on Friday if he agreed with Mr Khan, Sir Keir Starmer told broadcasters: “I do think that we’re going to have to look more broadly at social media after this disorder.”
He added that social media executives should be “mindful of the first priority, which is to ensure that our communities are safe and secure”.
The role of social media in the riots has been a major theme of the response to the disorder, with police already taking action against some individuals for posts online.
A 28-year-old man from Leeds, Jordan Parlour, was jailed for 20 months on Friday after pleading guilty to stirring up racial hatred in a series of Facebook posts. Separately, Cheshire Police arrested a 55-year-old woman on Thursday over posts containing false information about the Southport suspect.
Meanwhile, experts have highlighted a number of high-profile online figures sharing false information about the suspect, now named as 18-year-old Axel Rudakubana.
Sunder Katwala, director of the think tank British Future, pointed to posts by former actor Laurence Fox and former kickboxer Andrew Tate. Others said misinformation had been used by a “vocal minority” to sow division and “fuel their own agenda and trigger a summer of thrill-seeking impulsive insurrection”.
On Friday, Scottish First Minister John Swinney wrote to social media executives to express concern about the length of time it takes to remove “problematic” content and urge them to take “immediate action” to crack down on disinformation and racism.
He told social media companies: “Everyone has a role in stopping the spread of misinformation.
“You and your platform have a specific responsibility to do so.”
Commentators have been particularly concerned about the impact of content on X, formerly Twitter, whose owner Elon Musk has been heavily criticised for his own posts about the disorder.
The billionaire has engaged with posts by far-right figure Stephen Yaxley-Lennon, known as Tommy Robinson, suggested that civil war in the UK was “inevitable”, and retweeted a fake Daily Telegraph headline suggesting rioters would be sent to detention camps in the Falkland Islands.
The Online Safety Act will, for the first time, make firms legally responsible for keeping users safe on their services.
It will require platforms to put in place clear and proportionate safety measures to prevent illegal and other harmful content from appearing and spreading on their sites.
The biggest platforms could face billions of pounds in fines if they do not comply.
Named managers could be held criminally liable in some instances, and access to sites could be restricted in the most severe cases.
Ofcom, which will oversee the new laws, on Wednesday urged social media companies to do more to deal with content stirring up hatred or provoking violence on Britain’s streets.
The watchdog said: “In a few months, new safety duties under the Online Safety Act will be in place, but you can act now – there is no need to wait to make your sites and apps safer for users.”