More children must blow the whistle on harmful content online, says Ofcom in its first campaign as the incoming regulator of social media firms.
Its call for children and young people to put pressure on social media firms to take down harmful material follows its research showing that only around one in six who encountered such material lodged a complaint about it.
This was despite more than two thirds of those aged 13 to 24 encountering potentially harmful content that ranged from bullying, abusive behaviour and threats to scams, misinformation and trolling.
It is the first move by Ofcom to get social media firms to clean up their act before the Government’s online safety bill gives it the powers to start investigating all social media companies for potential breaches. Its regulatory reach currently extends only to video-sharing sites.
It will have powers to fine social media firms up to 10 per cent of their global turnover and shut down services if they fail to remove harmful content. It will also be able to prosecute social media executives who fail to hand over information or cooperate with investigations.
Evidence of failure to remove content even after being alerted by users is likely to strengthen the case for Ofcom to take tough action.
Campaign designed to empower young people
Anna-Sophie Harling, online safety principal at Ofcom, said: “As we prepare to take on our new role as online safety regulator, we’re already working with video sites and apps to make sure they’re taking steps to protect their users from harmful content.
“Our campaign is designed to empower young people to report harmful content when they see it, and we stand ready to hold tech firms to account on how effectively they respond.”
To help galvanise more young internet users to report potentially harmful content, Ofcom has joined forces with social media influencer Lewis Leigh, and behavioural psychologist Jo Hemmings, to launch the campaign – “Only Nans”.
The campaign aims to reach young people on the sites and apps they use regularly to highlight the importance of reporting posts they may find harmful.
While 67 per cent reported encountering harmful content online, just 17 per cent took action to report it.
More than one in five (21 per cent) said they did not think reporting it would make a difference, while 29 per cent said they did not see the need to do anything. Around one in eight (12 per cent) said they did not know what to do or who to inform.
Exposure to harmful content can desensitise children
Ms Hemmings said: “With young people spending so much of their time online, the exposure to harmful content can unknowingly desensitise them to its hurtful impact.
“People react very differently when they see something harmful in real life – reporting it to the police or asking for help from a friend, parent or guardian – but often take very little action when they see the same thing in the virtual world.
“What is clear from the research is that while a potential harm experienced just once may have little negative impact, when experienced time and time again, these experiences can cause significant damage.
“Worryingly, nearly a third of 13- to 17-year-olds didn’t report potentially harmful content because they didn’t consider it bad enough to do something about. This risks a potentially serious issue going unchallenged.”
Mr Leigh rose to national prominence as a social influencer during lockdown with his viral TikTok videos showing him teaching his grandmother, Phyllis, dance moves.
“My generation has grown up using social media and it’s how I make a living. So although it’s mainly a positive experience and a place to bring people together and build communities, harmful content is something I come across all the time too,” he said.
“That’s why it was important to team up with my lovely Nan for this campaign to raise awareness of what we can do to protect each other online.”