Social media regulation could include deadlines for removing harmful content

By Jamie Harris, Press Association Science Technology Reporter

A deadline to speed up the removal of harmful content on social media is among the legislative measures being considered by the government in a bid to protect young people.

Speaking at the final evidence session for the Science and Technology Committee’s inquiry into the impact of social media, Minister for Digital Margot James said she is interested in Germany’s recent approach.

It compels social networking sites to remove illegal content within 24 hours of it being reported, or face fines.

When asked by Labour MP Liz Kendall whether the government was considering such a move, Ms James responded: “Yes, I mean that would be one of the aspects of what may require new law as opposed to the many non-legislative measures that we’re still developing.”

The minister warned that there are grey areas over what is and is not deemed illegal content, which are expected to be addressed in a white paper this winter.

Ms James sat alongside Jackie Doyle-Price MP, Parliamentary Under Secretary of State for Mental Health, discussing how government and social media companies could tackle a range of issues affecting children online, particularly cyber-bullying.

“I think the social media companies could do a lot more, I think they’re being very slow in this area actually, and I have no doubt that it is the stick of regulation that has focused the mind to do more,” Ms Doyle-Price said.

Prior to the pair’s appearance, Children’s Commissioner for England Anne Longfield told the committee that children moving from primary school up to secondary school felt an “avalanche of pressure” to be more socially aware on social media.

She said: “There are of course huge positives, support, friendships, the learning you can get online is just unimaginable compared to ten years ago, but some of the negatives about the content, around things like self-harming, are clearly there.

“A lot of my emphasis is about getting children the emotional resilience to be able to manage their time online positively, but then also look at all those other people, schools, government, social media companies, who have a role to play in being able to provide that positive platform for children to be able to get the most out of it rather than be drawn and sink into that negativity.”

In October, technology giants including Google, Facebook and Twitter signalled they would not be opposed to the introduction of a regulator to monitor their platforms.

Karim Palant, Facebook’s public policy manager in the UK, told the committee last month: “Our chief executives have talked about the fact that some regulations are going to be inevitable.”