The father of a teenage girl who took her own life says he has "no doubt that social media helped kill my daughter" - as he backed a report calling for companies to share data and pay towards research into the technology's potential harms.
Molly Russell was 14 when she died days before her birthday in 2017 after viewing graphic self-harm and suicide material on Instagram.
In a new report by the Royal College of Psychiatrists, her father Ian said there was an urgent need for greater action after blaming "pushy algorithms" for the "wrecking ball of suicide" that "smashed brutally" into his family.
He said of Molly's social media accounts: "Among the usual schoolfriends, pop groups and celebrities followed by 14-year-olds, we found bleak depressive material, graphic self-harm content and suicide-encouraging memes.
"I have no doubt that social media helped kill my daughter."
He added: "Without research using data from social media companies, we will never know how content can lead our children and young people to self-harm or, in the most tragic cases, take their own lives."
In the report, Molly's father also detailed one of her final notes which described how she felt "with heart-breaking clarity".
She had written: "I'm the weird sister, quiet daughter, depressed friend, lonely classmate.
"I'm nothing, I'm worthless, I'm numb, I'm lost, I'm weak, I'm gone. I'm sorry. I'll see you in a little while. I love you all so much. Have a happy life. Stay strong xxx."
In its report, the Royal College of Psychiatrists says a proposed 2% levy on the UK revenues of major tech companies does not go far enough.
Instead, it wants the so-called "turnover tax" to apply to international turnover, and for some of the money from it to be used for mental health research.
The report's co-author has now called on the government to step up its action by empowering an independent regulator to compel companies to pass data gathered from smartphones and devices on to researchers to study.
Dr Bernadka Dubicka told Sky News: "I am seeing more and more children self-harming and attempting suicide as a result of their social media use and online discussions.
"Social media companies have an awful lot of knowledge, they've got very sophisticated algorithms."
The report has been backed by the children's safeguarding charity the NSPCC and the mental health charity YoungMinds.
Andy Burrows, the NSPCC's head of child safety online policy, said: "Big tech cannot continue shirking accountability by hiding the harms children are facing on their sites.
"Only a regulator has the power to force them to share this information which will help protect young people."
Following Mr Russell's campaigning in the wake of his daughter's death, Instagram pledged to remove images, drawings and cartoons showing methods of self-harm or suicide.
But last March - a month after Instagram's promise - Sky News found numerous disturbing videos and pictures on the social media website, before they were later removed.
In response to the Royal College of Psychiatrists' report, Facebook said it is "already taking a number of the steps recommended".
A Facebook company spokesperson said: "Making sure people are safe on our platforms is the most important work we do.
"We already take a number of the steps recommended in this report, prioritising the removal of harmful content and providing support for those who search for it.
"We are working closely with organisations such as the Samaritans, and the government, to develop industry guidelines in this area.
"We're already working with independent academic researchers on how we can share data from our platforms across areas including suicide and self-harm.
"We will continue to look at ways we can provide this data, as long as it doesn't compromise people's privacy."
A government spokesman said: "We are developing world-leading plans to make the UK a safer place to be online. This includes a duty of care on online companies, overseen by an independent regulator with tough enforcement powers, to hold them to account.
"The regulator will have the power to require transparency reports from companies outlining what they are doing to protect people online. These reports will be published so parents and children can make informed decisions about their internet use."
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email email@example.com in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK