What is a deepfake? Proposal to criminalise fake pornographic images
The term deepfake has made the headlines again. Popular online-content streamers, including QTCinderella and Sweet Anita, have discovered their likenesses were used in deepfake pornographic videos, access to which was sold online.
These videos had likely been online long before they were discovered. And, while the creator of these deepfake videos has reportedly deleted them from the platform they were using, you can be sure countless other videos are still up, and that this trend is only on the rise.
“Deepfake” does not refer solely to pornography, nor only to video. Pornography is, however, one of the most damaging forms of deepfake content, though far from the only one. So how do deepfakes work?
What is a deepfake?
A deepfake is typically a piece of media in which the face or body of a person is digitally altered to make them look like someone else. In deepfaked audio, the voice of a specific person is synthesised to produce a clip that appears to capture them saying something they never actually said. Both are powered by machine learning and artificial intelligence.
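At the heart of many classic face-swap tools is a simple idea: one shared encoder learns features common to all faces (pose, expression, lighting), while a separate decoder is trained for each person. The sketch below illustrates that structure only; it uses untrained random matrices in place of real neural networks, so it shows the data flow of a swap, not a working deepfake generator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "faces": flattened 8x8 grayscale images for two people, A and B.
faces_a = rng.random((100, 64))
faces_b = rng.random((100, 64))

# One SHARED encoder (captures pose/expression), TWO person-specific
# decoders. These are random linear maps standing in for trained networks.
encoder = rng.standard_normal((64, 16)) * 0.1    # face -> shared latent code
decoder_a = rng.standard_normal((16, 64)) * 0.1  # latent -> person A's face
decoder_b = rng.standard_normal((16, 64)) * 0.1  # latent -> person B's face

def swap_face(face_of_b):
    """Encode B's face, then decode with A's decoder.
    In a trained model, this renders A's face with B's pose and expression."""
    latent = face_of_b @ encoder
    return latent @ decoder_a

fake = swap_face(faces_b[0])
print(fake.shape)  # same shape as an input face
```

The key design point is the shared encoder: because both people's faces pass through the same bottleneck during training, the latent code ends up describing pose and expression rather than identity, which is what makes decoding it with the other person's decoder produce a convincing swap.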
A 2019 study by Deeptrace Labs found 96 per cent of deepfake videos were pornographic. Since then, however, there has been an explosion of less obviously objectionable deepfake content shared on social media, as the tools used to create it have become far more readily available.
For instance, you may have stumbled upon Miles Fisher online. He posts deepfake videos of Tom Cruise doing silly things on TikTok. In these videos, you’ll hear Fisher’s own voice and see his real body, but the face pasted onto his own is Cruise’s. It helps that he doesn’t look or sound a million miles removed from a younger Tom Cruise.
Why are deepfakes dangerous?
Deepfake tech is most dangerous when it is used to create pornographic content without people’s consent, or when it is used as a tool to spread misinformation or disinformation.
Even the most ostensibly harmless deepfake content presses some of the same buttons as the malicious material, nudging your brain to believe it’s real even when it is flagged as a deepfake.
A deepfake video might show a political candidate saying something incendiary ahead of an election, in an attempt to hurt their chances of winning. It could be a public figure speaking out against a vaccine, despite never having done so.
In 2019, a video of a deepfake Boris Johnson endorsing Jeremy Corbyn circulated online. It was made by Future Advocacy, an artificial-intelligence think tank, in an effort to pressure MPs to address the spread of deepfakes online.
Detection tools have been developed, and continue to be developed, in an attempt to keep pace with the rising quality of these deepfakes. However, given how quickly some of this content goes viral, the damage may already have been done by the time videos or audio clips are revealed as fakes.
What is deepfake porn?
In deepfake porn, the face of a celebrity or public figure is transplanted into a sex scene, effectively turning any piece of porn into the kind of “sex-tape” content that used to attract so much attention years ago.
Actresses Margot Robbie and Scarlett Johansson are among those who have been victims of deepfake pornography.
In 2018, Johansson spoke out about the futility of trying to fight back against those who create deepfakes.
“I think it’s a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself. It’s a fruitless pursuit for me but a different situation than someone who loses a job over their image being used like that.”
She added: “The internet is just another place where sex sells and vulnerable people are preyed upon. And any low-level hacker can steal a password and steal an identity. It’s just a matter of time before any one person is targeted.”
Are deepfakes illegal?
Deepfakes in general are not illegal. They are part of a spectrum that runs from face-swap apps on your phone to a pornographic video of Boris Johnson proclaiming support for Russia, and banning “deepfakes” outright is unfeasible.
However, in November 2022, the UK Government announced plans to make pornographic deepfakes shared online illegal.
“Explicit images or videos which have been manipulated to look like someone without their consent” will be criminalised in a planned amendment to the Online Safety Bill.
Other malicious uses of deepfake technology could be covered by existing laws surrounding fraud and defamation/libel. However, there is no blanket protection should someone create a deepfake of your likeness.