Ban on ‘deepfake’ videos with celebrity faces added to hardcore porn

Rob Waugh
Contributor
Fake porn videos featuring celebrities are being deleted from the internet in an attempt to stop the spread of ‘deepfake’ footage

Pornhub and Twitter have joined sites such as Reddit in banning so-called ‘deepfake’ videos – AI-generated clips in which celebrity faces are mapped onto performers’ bodies in pornographic scenes.

Pornhub issued a statement this week saying it would ban the videos as ‘non-consensual’ content.

Deepfakes are videos in which celebrity faces are mapped into real pornographic footage using AI software, and dozens of videos ‘starring’ celebrities from Natalie Portman to Scarlett Johansson have circulated online.


The videos are mostly generated using a tool called FakeApp, which maps celebrity faces onto pornography automatically.

The app runs on PCs with Nvidia graphics cards, and is trained on two sets of images – frames from a pornographic video and photos of the celebrity – before swapping the faces automatically.

People have also used the software to create spoofs, such as mashing Hillary Clinton’s and Donald Trump’s faces together.

This week, both Reddit and Gfycat announced they were cracking down on the videos – and now Pornhub says it will remove them too.

Pornhub said, ‘We do not tolerate any non-consensual content on the site and we remove all said content as soon as we are made aware of it.

‘Nonconsensual content directly violates our TOS [terms of service] and consists of content such as revenge porn, deepfakes or anything published without a person’s consent or permission.’

Motherboard reports that deepfake videos are still available on the site, and suggests that Pornhub may be removing them when notified by users.