Children being groomed on webcams 'fuelling rise in child sex abuse images'

Children being groomed into filming themselves on webcams could be fuelling a surge in sex abuse images being found online in the UK, a watchdog has warned.

The Internet Watch Foundation (IWF) took action last year against a record 105,000 pages that contained millions of images – an increase of one third on the previous year.

The charity, which acts as the UK’s main watchdog for online child abuse images, said every photo or video represented a victim being abused or coerced.

“One of the key trends is the increase in self-generated content,” chief executive Susie Hargreaves told The Independent.

“About half of all content features children under 10 years old, and one in three originated on webcams – from children in their bedrooms.”

Ms Hargreaves said 96 per cent of self-generated images featured girls, and 81 per cent of those were aged between 11 and 13.

“It’s a very vulnerable age group,” she added. “Clearly given the age and what the children are doing they are in some way being groomed and coerced, whether it’s someone pretending to be their boyfriend or someone else.”

Almost a fifth of self-generated videos found by the IWF are in the most severe category, which includes rape and sexual torture.

“While we see more content of older children, the younger the child the more likely it is that the abuse is the most severe form,” Ms Hargreaves said, adding that victims included babies.

Analysts said that once a video has gone online, it can be split up, converted into images and spread in different forms that must be individually removed.

The report detailed the story of one victim, a girl called Olivia who had been sexually abused from the age of three.

Police rescued her from her abuser in 2013, when she was eight, but the IWF still finds images of her being raped and tortured online every day.

The IWF said some of the images were found on commercial sites, which were profiting from her abuse.

“We simply don’t know if Olivia was aware that images of her abuse were being shared online,” the report added.

“It’s difficult to imagine how traumatic that knowledge must be, particularly for someone so young … knowing an image of your suffering is being shared or sold online is hard enough. But for survivors, fearing that they could be identified or even recognised as an adult is terrifying.”

The IWF is currently assessing a webpage every two minutes on average, and finding a child abuse image every five minutes.

The charity receives reports of indecent images and uses technology to actively search for the material, removing it and passing information to international law enforcement.

It believes the increase in reports and takedowns resulted both from improved detection using new technology and from more images being shared.

Police forces in England and Wales are also recording rising numbers of crimes relating to indecent images of children, and the National Crime Agency has called for technology firms to prevent the material from being uploaded.

“The scale has so fundamentally changed that we need a fundamentally recalibrated approach,” the NCA’s director for vulnerabilities, Will Kerr, warned last year.

“It is not sustainable for companies to simply identify indecent images on their servers and report it to law enforcement, when we know that technologically you can prevent it at source.”

Matthew Falder, one of Britain’s most prolific paedophiles, blackmailed victims into sending him images of obscene acts (PA)

Ms Hargreaves said the cause of rising offences was demand from people wanting to watch children being abused.

“Unfortunately, and as the police tell us often, there are 100,000 people sitting in the UK right now demanding images of the abuse of children,” she added.

“This is a global challenge and no doubt every country’s police force will have their own estimations of this criminality.”

The IWF found the images on file hosting “cyberlockers”, blogs, websites, forums, video channels and social networks, with the Netherlands hosting the highest proportion of child abuse images, followed by the US and Russia.

Ms Hargreaves said that while technology is making it easier to detect the images, society needs to “step up”.

“We can fight it with technology, through working with law enforcement and governments but collectively we’ve all got to step up as a society and have a zero tolerance approach,” she added.

The government’s recently published white paper on online harms laid out plans to create a new independent regulator for internet companies and to hold them to a new, mandatory duty of care to their users.

Victoria Atkins, the minister for crime, safeguarding and vulnerability, said: “Tackling this sickening crime is a top priority of the government.

"The IWF do incredible work in removing this content from the web, but we need to stop this material from appearing in the first place.

"The online harms white paper, launched this month, will ensure that tech companies have a legal responsibility to remove this vile material from their platforms with severe sanctions for those that do not."