Ban on ‘deepfake’ videos with celebrity faces added to hardcore porn
Pornhub and Twitter have joined sites such as Reddit in banning so-called ‘deepfake’ porn videos – AI-created porn in which celebrity faces are mapped into pornographic scenes.
Pornhub issued a statement this week saying it would ban the videos as ‘non-consensual’ content.
Dozens of such videos are already circulating, ‘starring’ celebrities from Natalie Portman to Scarlett Johansson.
The videos are mostly generated using a tool called FakeApp, which maps celebrity faces onto pornography automatically.
The software runs on PCs with Nvidia GPUs, scanning two sets of images (one of the porn footage, one of the celebrity) before mapping the celebrity’s face onto the footage automatically.
People have also used the software to create spoofs, such as mashing Hillary Clinton and Donald Trump together.
This week, both Reddit and Gfycat announced they were cracking down on the videos – and now Pornhub says it will remove deepfake videos.
Pornhub said, ‘We do not tolerate any non-consensual content on the site and we remove all said content as soon as we are made aware of it.
‘Nonconsensual content directly violates our TOS [terms of service] and consists of content such as revenge porn, deepfakes or anything published without a person’s consent or permission.’
Motherboard reports that deepfake videos are still available on the site, suggesting that Pornhub may only be removing them when users flag them.