Children will be exposed to terror content on social media ‘unless Ofcom takes tougher action’

Person uses mobile phone

Children will be exposed to terrorist content on social media unless the online regulator takes a tougher approach with tech firms, the Government’s counter-terrorism adviser has said.

Jonathan Hall KC, the independent reviewer of terrorism legislation, said terrorist content was a “blind spot” in Ofcom’s proposed code to protect children from online harms.

He said the proposals by Ofcom – which has been given the duty of protecting the public from “illegal harms” under the Online Safety Act – failed to treat young people as being at particular danger of being lured into terrorism.

Ofcom published the draft code at the beginning of May, saying it set out 40 practical measures that tech firms must take to keep children safe.

It comes amid growing concerns among police chiefs and counter-terrorism experts about the increasing number of teenagers being radicalised online and arrested for terror offences.

The Telegraph revealed on Thursday that a record 40 children under 18 were arrested for terror-related offences in the year to March, up 67 per cent. The Home Office figures also revealed that children were more likely to be convicted of terror-related offences than any other age group.

‘Terrorism content not identified as risk’

Mr Hall has set out his concerns in response to Ofcom’s consultation on its children’s code which aims to protect children from online harms. “Terrorism content is not identified as presenting a particular risk to children,” he said.

The code focused on protecting children from content with particular characteristics, such as abuse, hatred against people and the promotion of serious violence, but not terrorism, which was ideological in nature. This let firms off the hook, he argued, because it would be difficult for them to judge whether material was ideologically driven.

He added: “There is merit in placing greater obligations on providers to remove terrorism or priority harmful content for children, who are particularly susceptible, than for adults, who are better able to make up their own minds.”

The code also failed to include terrorism as a specific risk – or “relevant priority content” – for children on the very platforms where it was particularly prevalent. Instead, it prioritised only content relating to suicide, self-harm and eating disorders, which was a “deficiency”, said Mr Hall.

These included discussion forums such as 8Chan and 8Kun, which Mr Hall said were notorious for hateful violent content, and live-streaming platforms such as those that promoted the Christchurch mosque shootings in New Zealand, in which 51 people died, and which were a “key vector” for radicalisation.

These also included reposted or forwarded content. “Forwarding violent and hateful material is a key risk with these services,” he said.

‘Spell out more fully the risks’

Mr Hall warned that unless Ofcom makes significant changes, “children will not get the special protection which they so clearly need”.

“I urge Ofcom to spell out more fully the risks of children encountering relevant priority content. This is to avoid the risk posed by terrorism content to children being neither adequately catered for under the illegal content regime nor the child protection regime,” he concluded.

Under the Online Safety Act, Ofcom has the power to fine companies up to £18 million or 10 per cent of turnover, whichever is larger, if they breach the new rules. Additional penalties can also be imposed for failing to comply with the requirement to remove unsafe material. The fines can be imposed on companies based overseas as well as those in the UK.

An Ofcom spokesman said: “As online safety regulator, protecting children is our number one priority. We’ve been inviting views on strong measures to protect all users – including children – from terrorism, hate speech and radicalising content, including better oversight and making it easier for people to report. These are just the first of many protections against terrorist content that we’ll introduce as new evidence, technology and harms emerge.”