Social media content viewed by Molly Russell contributed to her death, coroner rules

Molly Russell - Family handout/PA

Social media material viewed by Molly Russell contributed to her death, a coroner has concluded in a damning blow to the tech giants.

The 14-year-old took her own life in November 2017 after months of "bingeing" on suicide, self-harm and depression-related content on platforms including Instagram and Pinterest.

After her family’s five-year wait for answers, Andrew Walker, a senior coroner, set out his findings following a fortnight-long inquest.

He told North London Coroner’s Court that some of the sites viewed by Molly were “not safe” for children to use.

The coroner said it was “likely” that the material viewed by the teenager “affected her mental health” and “contributed to her death in a more than minimal way”.

Mr Walker said he did not feel comfortable recording her death as suicide, concluding instead: “Molly Rose Russell died from an act of self-harm while suffering from depression and the negative effects of online content.”

The coroner’s findings represent a significant blow to the social media giants, which have sought to minimise their role in the schoolgirl’s death, and will reignite calls for duty of care legislation to protect children online.

The inquest heard that Molly viewed thousands of videos and images - including suicide depictions and self-harm guides - on social media in the months before her death.

Molly Russell - Beresford Hodge/PA

Two senior executives, one from Meta, which owns Instagram, and one from Pinterest, the picture-sharing website, flew from the US to give evidence and apologised for the harrowing material Molly saw.

Judson Hoffman, from Pinterest, admitted that, at the time the teenager was using the site, it was unsafe for children.

The Russell family discovered that, after Molly had died, Pinterest continued to send her emails with headlines such as "new ideas for you in depression" above a gallery of ghoulish images.

She had been suffering from depression and anxiety at the time.

Her father, Ian Russell, has become a dedicated campaigner for protecting children online and accused Instagram of "helping to kill" his daughter, who died just six days before her 15th birthday.

Instagram was the final app she had accessed, at 12.45am on Nov 21, 2017. She was dead within an hour or two, her family believes.

Mr Russell told the inquest he had started combing through his daughter’s social media accounts in a desperate attempt to understand what had driven her to suicide.

Jud Hoffman, Pinterest's Global Head of Community Operations - Jamie Lorriman

He would go on to uncover a "horrifying" cache of images she had saved, including self-harm injuries and quotes reinforcing feelings of worthlessness or portraying suicide as inevitable.

"It’s just the bleakest of worlds," he said last week. "It’s a ghetto of the online world that, once you fall into it, the algorithm means you cannot escape."

Meta was granted "interested person" status at the inquest, making it a legal participant in the proceedings in what is thought to have been a first for the tech company in the UK.

Pinterest also confirmed it was the first time it had acted as an interested person in a UK inquest.

There will now inevitably be renewed impetus for the Government to bring forward the much-delayed duty of care laws to protect children from harmful content online.

In May, Mr Russell accused ministers of backsliding over the Online Harms Bill, which could see tech companies face multi-billion pound fines of up to 10 per cent of their turnover if they fail to protect users, particularly children, from suicide material, sexual abuse and terrorism.