A coroner’s conclusion that Molly Russell died after suffering from “negative effects of online content” should “send shockwaves through Silicon Valley”, online safety campaigners have said.
In his ruling, senior coroner Andrew Walker said online material viewed by the 14-year-old on sites such as Instagram and Pinterest “was not safe” and “shouldn’t have been available for a child to see”.
In response, the Russell family said it is “time the toxic corporate culture at the heart of the world’s biggest social media platform changed”.
The children’s charity the NSPCC, which accused social media sites of putting “the safety of children second to commercial decisions”, said the conclusions of the inquest “must be a turning point” and that tech companies must now be held to account through the proposed Online Safety Bill.
In their statement, the Russell family said: “It’s time for the Government’s Online Safety Bill to urgently deliver its long-promised legislation.
“It’s time to protect our innocent young people, instead of allowing platforms to prioritise their profits by monetising their misery.
“For the first time today, tech platforms have been formally held responsible for the death of a child.
“In the future, we as a family hope that any other social media companies called upon to assist an inquest follow the example of Pinterest, who have taken steps to learn lessons and have engaged sincerely and respectfully with the inquest process.
“For Molly’s sake… let’s make the online world a place that prioritises the safety and wellbeing of young people over the money that can be made from them.”
Prime Minister Liz Truss has previously confirmed that the Online Safety Bill, which has been in the making for more than five years, will soon return to Parliament following a number of delays.
The Bill aims to introduce rules for social media and other user-generated content-based sites that compel them to remove illegal material from their platforms, with a particular emphasis on protecting children from harmful content.
All platforms in scope will have a duty to find and remove illegal content, with the largest also expected to tackle named “legal but harmful” forms of content too, as well as have clear and robust terms and conditions in place.
First and foremost, this is about Molly Russell. Following the inquest, tech companies must be held accountable when they don’t make children’s safety a priority.
— NSPCC (@NSPCC) September 30, 2022
In the wake of the inquest into Molly’s death, the NSPCC urged the Government to act quickly to get the Bill in place.
Sir Peter Wanless, NSPCC chief executive, said: “The ruling should send shockwaves through Silicon Valley – tech companies must expect to be held to account when they put the safety of children second to commercial decisions. The magnitude of this moment for children everywhere cannot be understated.
“Molly’s family will forever pay the price of Meta and Pinterest’s abject failure to protect her from content no child should see, but the Online Safety Bill is a once-in-a-generation opportunity to reverse this imbalance between families and big tech.
“This must be a turning point, and further delay or watering down of the legislation that addresses preventable abuse of our children would be inconceivable to parents across the UK.”
Online safety campaigners have long called for enhanced regulation of social media and wider platforms, arguing the current system of self-regulation led by each site’s own terms of service and content moderation processes is inadequate.
Earlier this month, new Culture Secretary Michelle Donelan said the Government was planning “tweaks” to the Online Safety Bill, but that the protections included, particularly for children, would not be watered down.
Following the coroner’s ruling, pressure on the Government to bring the Bill back to Parliament swiftly is now likely to increase.