The Online Safety Bill in its current form is a “mess” and needs a “total rewrite”, a leading digital human rights lawyer has said.
Dr Susie Alegre has written a legal opinion warning that the proposed internet safety laws fail to address the root causes of online harm, such as the systems which recommend posts to users, and that the laws would also threaten free speech.
The Government has pledged to reintroduce the much-delayed Online Safety Bill to Parliament shortly, with Culture Secretary Michelle Donelan saying this week that ministers were working “flat out” to deliver it, having previously suggested some tweaks would be made to the legislation.
But Dr Alegre said the entire Bill needed work, writing that it was poorly drafted with “broad and opaque” definitions that will make it difficult for companies to know what they are required to do.
She also said the Bill was too focused on policing types of content rather than tackling the “algorithmic systems and design features” that underpin the biggest platforms.
“The Online Safety Bill is a mess and needs a total rewrite. As it stands, it offers the worst of both worlds,” she said.
“It not only threatens free speech, freedom of expression and privacy, but fails to do enough to tackle the real drivers of online harm, such as social media companies actively recommending posts about self-harm, which contributed to the tragic suicide of teenager Molly Russell.”
In her opinion, which was commissioned by consumer campaign group SumOfUs, Dr Alegre said the Bill “focuses too heavily on moderating content and the impact of user-generated content on individuals”, meaning that it “fails to put serious checks on the systems that drive online harms, in particular the many ways that personal data is used for profiling, targeting and curation of content”.
She adds that this focus on content moderation in turn creates new threats to freedom of expression and privacy online.
Dr Alegre also said that attempts to satisfy competing demands from campaigners on all sides meant that the Bill was “simultaneously too narrow and too broad to address the underlying issues that lead to online harms”.
While child safety campaigners have repeatedly called for more protections and tougher content restrictions to be included in the Bill, free speech advocates have raised concerns over online censorship being a side effect of the proposed rules.
More than five years in the making, the Online Safety Bill in its current form would require social media and other platforms to protect their users from harmful content, with large fines and the threat of having their sites blocked if they were found to breach the new rules, which would be overseen by Ofcom.
The largest platforms would also be required to combat specific types of content which have been identified as legal but harmful, with companies expected to set out in their terms of service how they will tackle these harms.
A Department for Digital, Culture, Media and Sport (DCMS) spokesperson said: “This is a completely inaccurate characterisation of the Bill, which will put much-needed responsibility on tech firms to protect children and tackle criminal activity, including for the way their algorithms drive online harm.
“The law’s focus is on tech firms’ systems and processes, not on the regulation of individual pieces of content. However ministers are considering how to further strengthen the Bill’s protections for free speech.”