MP Jess Phillips says 'alarming it wasn't already illegal' as she vows crackdown on AI child abusers
Safeguarding Minister and Yardley MP Jess Phillips has warned Britain is on a "dangerous trajectory" in the fight to protect children from sick online predators. The UK has today, Sunday, become the first country in the world to announce tough laws to crush the vile trade in AI-generated child sex abuse material.
Ms Phillips described the harrowing impact on victims who discovered the abuse they suffered as children was being shared with paedophiles to make money. She told The Mirror she was "shocked" that little had been done for years despite police pleas for action.
She said the case of Alexander McCartney, from Northern Ireland, whose sick online abuse led to 12-year-old Cimarron Thomas taking her own life in the USA, shows the urgent need for action. "That's the real world impact of this," she said.
The Home Office minister continued: "We are on a dangerous trajectory where the perpetration against children of sexual crimes has been growing." Alarming new figures from charity the Internet Watch Foundation (IWF) show the number of AI-generated child sexual abuse image cases rose by 380 per cent in 2024 compared to the previous year. These images are fuelling a new wave of sex abuse against children.
New laws will outlaw AI tools designed to generate child sexual abuse material (CSAM). Those convicted face sentences of up to five years in jail - a world first. Possession of AI 'paedophile manuals' - which outline how technology can be used to abuse youngsters - will carry terms of up to three years behind bars.
And a new offence will be created targeting predators who run websites where paedophiles can share material and get advice. Those convicted will be locked up for up to ten years. On top of this, Border Force officers will be given new powers to unlock phones and scan them for sick content.
Ms Phillips said: "I'm stunned that there isn't more attention on it. I think your readers would also be alarmed to hear that these things weren't already illegal."
Police have unearthed harrowing cases where AI tools have been used to turn innocuous photos of children - often shared by their families - into fake sexual images. This can include putting their faces onto the real-life sexual abuse of other children, who then face the trauma of reliving their ordeal.
Some of these are so lifelike they have been treated as though they were photographs, the Home Office said. Last year IWF analysts found 3,512 AI CSAM images on a single dark web site.
The amount of Category A material - the most severe category - rose by 10 per cent compared to 2023. And the charity found confirmed reports of AI-generated CSAM rose by 380 per cent, from 51 in 2023 to 245 in 2024.
Each report can contain thousands of images, officials said. The charity pointed to the case of Olivia - not her real name - a young girl who suffered years of horrendous sexual abuse from the age of three. Videos of her ordeal have been shared for years.
The IWF said those who shared or paid to view the content contributed to her torment. Ms Phillips said: "I have spoken with women who had images of their child abuse - the worst thing that ever happened to them - shared again and again, over a million times.
"It hits home when you hear those stories. It's the worst thing that ever happened to them becoming a thing that makes money for people, and that encourages and potentially causes further child abuse."