Children as young as seven coaxed into 'streaming' explicit videos

Paedophiles are targeting children online using streaming apps -  Dominic Lipinski/PA

Children as young as seven are being groomed by paedophiles and coaxed into ‘streaming’ sexually explicit pictures and footage of themselves online in exchange for ‘likes’ and ‘friends’ on social media sites, an internet watchdog has warned.

The youngsters, often unaware that what they are filming on a webcam, mobile phone or tablet is explicit, are encouraged by perverts to film themselves to boost their online status and ranking on new social media platforms.

The alarming development has been identified by the Internet Watch Foundation (IWF) as it reveals that the number of ‘self-generated’ child sexual abuse images and videos has soared nearly 400 per cent in just one year.

In a four-month period from November 2016 to February 2017, the charity received reports of 1,227 indecent pictures and clips on websites and forums that were found to have been taken by the children themselves, usually aged between seven and 13.

Over the same period up to February this year, that figure soared to 6,011, representing a 389 per cent increase.

The charity said that for the first time it had found that one in four images were “self-generated”, meaning children were taking pictures of themselves invariably after being coerced, blackmailed or duped by paedophiles.

In some cases the pictures were sent by the youngsters to children of a similar age. However, once an image is posted online the child loses control of it and the footage is shared across the internet.

In January, 1,717 images or clips were found to have been taken by children, representing 26 per cent of such material that the IWF investigated. That compares with just 349 such images found in January last year, which was six per cent of obscene images of children investigated.

In some footage disseminated across the internet, IWF researchers even heard parents talking in a nearby room. In one clip a mother is heard calling to her child, ‘dinner’s ready’.

In another example, a 14-year-old boy who shared images of himself with his girlfriend of the same age discovered the recorded footage had been uploaded to adult pornography sites. By the time the teenager contacted the IWF asking for help to remove it, the clip had appeared on 40 different websites. Despite efforts to eradicate it, the footage continues to be shared online.

Social media apps on a mobile phone - Credit: Yui Mok/PA

Fred Langford, the IWF deputy chief executive, urged parents to consider what children can do with mobile phones, tablets and webcams, and how ‘groomers’ can target them. He also urged companies that develop apps which stream images live to get involved with the IWF at an early stage of their product development, to understand how well-meaning software can be abused by paedophiles.

Mr Langford said: “Some of these videos were streamed because the children were effectively receiving a form of kudos through ‘likes’, which helps them move up in social rankings on some social media apps.

“When these children are being directed and appear to receive more likes when taking their clothes off, they do not realise they are being sexually exploited, or the devastating long-term impact of what has happened.

“It is only when the live-streamed video is shared online that the IWF, along with law enforcement, realise what has happened and can try to track a victim down.

“It’s harrowing. These victims are being groomed, coerced, manipulated, and sometimes blackmailed through live streaming apps, and this has contributed to the rise in reports we are receiving where the child has actually taken the images or videos themselves.”

The sharp increase in ‘self-generated’ images is also a result of the IWF’s improved techniques in tracing such material. The charity is one of the only organisations, apart from law enforcement agencies, that can search for and remove child sexual abuse images.