Ofcom social media safety proposals lack ambition, say bereaved parents

Esther Ghey, the mother of Brianna, says it is time for social media firms to step up and 'make sure that change happens' - Paul Grover for The Telegraph

Ofcom’s plans to protect children from online harms lack ambition, the parents of Molly Russell and Brianna Ghey have said.

Ian Russell, chairman of the Molly Rose Foundation, a charity set up in the memory of Molly, said Ofcom needed to go further if it was to give children and families the online protection that they deserved.

Esther Ghey, the mother of Brianna, who was murdered by 15-year-olds Scarlett Jenkinson and Eddie Ratcliffe in 2023, said it was a “pivotal point” for social media firms to step up and “make sure that change happens”.

And in a letter to Rishi Sunak and Sir Keir Starmer, Bereaved Families for Online Safety, a group of parents whose children’s deaths were linked to social media, urged them to ban tech firms from the UK if they failed to design their services in a “safe and fundamentally responsible way”.

Their comments on Wednesday came after the regulator Ofcom published new draft codes that will require social media firms to introduce “highly effective” age checks, including the use of photo ID such as passports, so they can identify children on their sites and protect them from online harms.

The companies will also have to configure their algorithms in such a way that they filter out the most harmful content – covering self-harm, suicide and eating disorders – from children’s social media feeds.

Molly Russell's father Ian has been trying to hold social media companies to account after her death - Family handout/PA

Mr Russell, whose daughter Molly, 13, took her life after being bombarded with self-harm and suicide content, said Ofcom needed to be more ambitious.

“Ofcom’s task was to seize the moment and propose bold and decisive measures that can protect children from widespread but inherently preventable harm,” he said.

“The regulator has proposed some important and welcome measures, but its overall set of proposals needs to be more ambitious to prevent children encountering harmful content that cost Molly’s life.

“It’s been over six years since Molly’s death, but the reality is that very little has yet changed. In some respects, the risks for teens have actually got worse.

“That’s why it’s hugely important that the next prime minister commits to finish the job and strengthen the Online Safety Act to give children and families the protection they deserve.”

Ofcom lists ways in which social media companies could check the age of users, including requiring photo ID such as passports, using facial age estimation – where computer software is used to calculate a user’s age – or reusable digital ID services, where an external company provides age verification.

The campaigning parents believe Ofcom needs to be more precise about what “highly effective” age checks means, for example by specifying the level of accuracy, as a percentage, that social media platforms will be expected to achieve.

They are also concerned about what they see as a fault in the legislation that makes it difficult to fully block children’s access to legal but harmful content, such as dangerous stunts.

‘Platforms should not use algorithms’

They are also pushing for Ofcom to go further on restricting algorithms. Rather than “filtering out” online harms, they believe there should be a presumption that platforms should not be using algorithms unless they can be sure they are not causing harm.

Ruth Moss, whose daughter Sophie, 13, died at home in Fife in 2014 after she had viewed suicide and self-harm posts on social media, told the BBC that she was concerned about how Ofcom would enforce its codes. “It’s that I still have questions about,” she said.

She said social media firms should also be barred from using algorithms with children. “It’s inappropriate for children to have any type of algorithm in their feed that is going to promote anything harmful,” she said.

Ms Moss also argued that end-to-end encryption on messaging services should be reversed, warning that it puts children at risk of grooming because the platforms have no access to messages and so cannot spot and prevent such harms.

“That means that if something goes wrong and a child is groomed in seconds or, in Sophie’s case, a 31-year-old starts sending semi-naked pictures of themselves to your daughter, there is no evidence. That is lost,” she said. She urged Ofcom to campaign for a change in the law to tackle the risks from encryption.

In their joint letter to the Prime Minister and the Labour leader, Bereaved Families for Online Safety said: “We encourage you to make clear to tech companies that they must start to design and build their services in a safe and fundamentally responsible way.

“If companies are not prepared to do so, they should be made to understand there is no longer a place for them in the UK.

“We urge you to commit to a strong, determined and evidence-based course of action, including a commitment to strengthen the Online Safety Act in the first half of the next parliament. We strongly implore you to embed safety-by-design principles into the development of generative AI – in doing so, acting now to ensure that history is unable to repeat itself.

“Tech companies must be subject to proper and meaningful accountability, with children and parents able to have confidence that platform terms and conditions are enforced, and that when the companies receive user reports they will respond quickly and take them seriously.”