The Government will make it a crime to take and share pornographic “deepfake” images and “downblouse” photographs without consent.
As well as giving police and prosecutors more power to bring abusers to justice, the Ministry of Justice will bring forward laws to tackle the installation of equipment such as hidden cameras to take or record images of someone without consent. The legislation was recommended by the Law Commission.
The proposals will form part of the Online Safety Bill, which is going back to Parliament next month. The changes will follow “upskirting” being made a specific criminal offence in England and Wales in 2019.
But what is a deepfake?
What is a deepfake?
A deepfake is a fake image or video created using a form of artificial intelligence called deep learning.
Deepfakes are created by superimposing a person’s likeness onto an existing image or video, making it appear as though they are doing or saying something they never did.
Deepfake technology has been used for sinister purposes, such as revenge porn, fake news and fraud. Actresses Margot Robbie and Scarlett Johansson are among those who have been victims of deepfake pornography.
In 2018, Johansson spoke out about the futility of trying to fight back against those who create deepfakes.
She said: “I think it’s a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself. It’s a fruitless pursuit for me but a different situation than someone who loses a job over their image being used like that.”
She added: “The internet is just another place where sex sells and vulnerable people are preyed upon. And any low-level hacker can steal a password and steal an identity. It’s just a matter of time before any one person is targeted.”
Last year, videos were posted to social media of what appeared to be actor Tom Cruise, under the username deeptomcruise. But the videos were actually made by visual effects expert Chris Ume, working with actor and Tom Cruise impressionist Miles Fisher.
Deepfakes are typically more convincing when the performer whose footage is used bears a resemblance to the person being impersonated.
In 2020, Channel 4 created a deepfake of the Queen for an “alternative Christmas message” as a warning about misinformation and fake news.
In 2019, a deepfake video of Boris Johnson endorsing Jeremy Corbyn circulated online. It was made by Future Advocacy, an artificial intelligence think tank, in an effort to pressure MPs into addressing the spread of deepfakes online.