Ofcom has unveiled new guidance for video-sharing platforms (VSPs) to keep users safe from harmful videos, as part of its new powers to regulate such services.
The regulator’s guidance is intended to help sites and apps comply with rules, introduced last year, designed to protect users.
Under the laws, VSPs established in the UK must take measures to protect under-18s from potentially harmful content and all users from videos likely to incite violence or hatred, as well as certain types of criminal content.
Those rules apply to sites and apps that let members of the public upload and share videos and engage with a range of content and social features.
In its new guidance for firms, Ofcom says platforms must now provide clear rules around uploading content and have “clear, visible” terms and conditions which prohibit content relating to terrorism, child abuse material or racism.
It also says sites should have easy-to-use reporting and complaints processes in place. For platforms that host pornographic material, age-verification measures should be in place to prevent children from accessing such material.
The regulator said it was working with companies to understand and find better technical solutions around age verification.
High-profile platforms such as TikTok, Snapchat, Twitch, OnlyFans and Vimeo are among those included under the jurisdiction of the new regime, while YouTube and Facebook are set to fall under Irish jurisdiction.
Ofcom said the long-term aim is for the regime to be superseded by the Online Safety Bill, the draft of which is currently being examined by parliamentary committees.
That Bill is set to introduce widespread and robust regulation across the tech sector, requiring social media firms and internet companies to sign up to a duty of care to protect users, with Ofcom set to be the regulator.
But until then, Ofcom chief executive Dame Melanie Dawes said the new guidance for VSPs was an important step in better protecting people, particularly children, from online harms.
“Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them,” she said.
“The platforms where these videos are shared now have a legal duty to take steps to protect their users.
“So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”
Ofcom research from last year found that a third of video-sharing platform users had witnessed or experienced hateful content, a quarter said they had been exposed to violent or disturbing content, and one in five said they had seen content that encouraged racism.