Social media firms will face new laws by the end of the year requiring them to protect users, a government minister has announced.
Margot James, the minister for digital and creative industries, announced the measures during her speech at the Safer Internet Day conference.
"For too long the response from many of the large platforms has fallen short. There have been no fewer than fifteen voluntary codes of conduct agreed with platforms since 2008."
"Where we are now is an absolute indictment of a system that has relied far too little on the rule of law," Ms James added, stressing that the government would be bringing forward laws to tackle social media giants.
The criticism of the web giants has been especially prominent following reports that Facebook paid children as young as 13 to install software on their phones which allowed the company to monitor virtually all of their activity.
It also follows a demand that social media companies "purge" their platforms of content that promotes self-harm and suicide, made by the family of 14-year-old Molly Russell.
A spokesperson for the Department for Digital, Culture, Media and Sport (DCMS) said: "We have heard calls for an internet regulator and to place a statutory 'duty of care' on platforms, and are seriously considering all options.
"Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people."
The spokesperson added: "Our forthcoming white paper will set out their responsibilities, how they should be met and what should happen if they are not."
Ms James said the white paper would be followed by a consultation over the summer and would set out new laws to "ensure that the platforms remove illegal content, and prioritise the protection of users, especially children, young people and vulnerable adults".
At the time of MPs' initial call for a "duty of care" to be placed on social media businesses, industry body techUK - which counts Facebook and Google among its members - said there was a "clear need to develop better solutions to tackle online harms".
"Tech companies are committed to working constructively with government to find the best way forward," it said.
"There are some good suggestions in this report. However, some proposals, such as a broad duty of care, are not yet fully developed or understood.
"In its widest form a duty of care could require platforms to monitor all speech on their platforms, in breach of other fundamental rights.
"Solutions must be found that are effective and proportionate, taking into account the very real differences between content that is illegal and content that is legal but might be harmful to some people in some contexts.
"These are difficult issues that impact everyone and it is vital that we all work together constructively to get the solutions right."