Social media firms have been accused of a “despicable” failure to act on harmful content after new research showed that material related to suicide, self-harm and depression is still prevalent on Instagram, TikTok and Pinterest six years after Molly Russell’s death.
Michelle Donelan, the technology secretary, spoke after the Molly Rose Foundation, a suicide prevention charity established by Molly’s family, published a report showing a “clear and persistent problem” with readily available and discoverable harmful content on the platforms.
“It is despicable and indefensible that social media companies are still turning a blind eye to the scale of horrendous suicide and self-harm content on their platforms,” said Donelan.
The technology secretary added that she would be raising the issue at a meeting with tech executives soon.
Molly’s father, Ian Russell, said the report showed “little has changed” on social media platforms since his daughter’s death.
He said that social media companies must share “internal worries” over harms their platforms cause to teenagers, as court documents in the US revealed that Instagram’s owner, Meta, was concerned “similar incidents” to Molly’s death would occur because the platform’s algorithms were “[l]eading users to distressing content”.
Russell said: “What I would like to see is these companies being honest about their internal worries about the negative effects of their platforms.
“This is not to say that the platforms are wholly bad … But they have to focus on problems related to this harmful content and take measures that really separate vulnerable children from harmful material that might be on their platforms.”
Molly Russell, from Harrow, north-west London, took her own life at the age of 14 in November 2017 after viewing harmful content related to suicide, self-harm, depression and anxiety on Instagram and Pinterest. In a landmark ruling last year, an inquest into her death found that Molly “died from an act of self-harm while suffering from depression and the negative effects of online content”.
The MRF study analysed 1,181 of the most engaged-with posts on TikTok and Instagram that used well-known hashtags relating to suicide, self-harm and depression, and found a significant proportion of the material was harmful.
The research, produced in collaboration with the non-profit organisation Bright Data, also searched for content under hashtags that Molly had used. It found that 48% of the most engaged-with posts on Instagram were harmful – defined as promoting or glorifying suicide or self-harm, or containing “relentless” themes of depression – as were 49% of the posts examined on TikTok.
The report also found that Pinterest “actively recommended” suicide and self-harm content including a video with the caption “one day I will leave and never come back any more”.
The Online Safety Act became law last month and protecting children from harmful content is a key focus of the legislation. Its provisions include requiring that social media companies prevent children from encountering harmful content such as suicide-related material.
A company spokesperson for Meta said the company wanted teenagers to have “safe” experiences on Instagram and works with experts to develop guidelines on suicide and self-harm content in a bid to strike a balance “between preventing people seeing sensitive content while giving people space to talk about their own experiences and find support.”
A spokesperson for TikTok, which launched in the UK in 2018, said: “Content that promotes self-harm or suicide is prohibited on TikTok and, as the report highlights, we strictly enforce these rules by removing 98% of suicide content before it is reported to us.”
A spokesperson for Pinterest said the company was “constantly updating our policies and enforcement practices around self-harm content.”
In the UK and Ireland, Samaritans can be contacted on freephone 116 123 or by email. In the US, you can call or text the National Suicide Prevention Lifeline on 988, chat on 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org