A report into the impact of social media and screen use on young people's health says that regulation is the only way to address the risks the web poses to children.
It also follows a demand by the family of 14-year-old Molly Russell that social media companies "purge" their platforms of content that promotes self-harm and suicide.
According to MPs on Parliament's Science and Technology Committee, social media damages children's sleep patterns and body image.
It also exposes children to bullying, grooming, and sexting, warned the committee.
"Although these risks existed before social media, its rise has helped to facilitate [them] - especially child abuse," said the MPs, citing figures from the National Crime Agency.
The report recommends that the government establish a social media regulator which would provide guidance on how to spot and minimise the harms that social media presents.
Such a regulator would also take enforcement action, potentially including fines, against firms that fail to safeguard children.
Although the committee's inquiry heard much evidence of social media's harms, it also found that there had not been enough scientific investigation into the topic.
Norman Lamb MP, the chair of the committee, said: "It is frustrating that there is not yet a well-established body of research examining the effects of social media on younger users.
"More worryingly, social media companies - who have a clear responsibility towards particularly young users - seem to be in no rush to share vital data with academics that could help tackle the very real harms our young people face in the virtual world."
"Self-regulation will no longer suffice," added Mr Lamb. "We must see an independent, statutory regulator established as soon as possible, one which has the full support of the government to take strong and effective actions against companies who do not comply."
In response to the report, industry body techUK, which counts Facebook and Google among its members, said there was a "clear need to develop better solutions to tackle online harms".
"Tech companies are committed to working constructively with government to find the best way forward," it said.
"There are some good suggestions in this report. However, some proposals, such as a broad duty of care, are not yet fully developed or understood.
"In its widest form, a duty of care could require platforms to monitor all speech on their platforms, in breach of other fundamental rights.
"Solutions must be found that are effective and proportionate, taking into account the very real differences between content that is illegal and content that is legal but might be harmful to some people in some contexts.
"These are difficult issues that impact everyone and it is vital that we all work together constructively to get the solutions right."