Pornhub continued to host 'deepfake' porn with millions of views, despite promise to ban [UPDATE]

UPDATE: Feb. 12, 2018, 9:43 p.m. EST: Pornhub has removed the videos in question; a search for the same term used earlier on Monday no longer surfaces them. The company initially didn't respond to an additional request for comment. When asked again about the videos, a spokesperson reiterated that "nonconsensual content" violates its terms of service, but didn't explain why the content had been allowed to stay on the site.

Pornhub has promised to remove "deepfake" porn from its massive video platform. But 48 of these videos remained online Monday morning, some with millions of views.


"Deepfake" porn typically features actresses or other celebrity women who haven't consented to appearing in the videos. New and easily-accessible software makes it simple for anyone to stitch footage of one person's face onto another's, which has predictably resulted in hardcore porn featuring the likenesses of Scarlett Johansson, Gal Gadot, and others.

On Feb. 2, Pornhub told Mashable it would remove deepfake videos that had been flagged by users, and a few days later it told Motherboard it was banning them outright.

"We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it," a spokesperson reportedly told Motherboard. "Nonconsensual content directly violates our TOS [terms of service] and consists of content such as revenge porn, deepfakes or anything published without a person’s consent or permission.”

But a search for "deepfake" on Pornhub Monday revealed that plenty were still available — and had been since before the website issued its statements to press.

One video featuring Scarlett Johansson's face had been viewed 742,000 times:

A search on Monday morning revealed plenty of deepfake porn is still hosted on Pornhub.com

Image: Pornhub

Another, with Gal Gadot's face, had 2 million views, while yet another with Emma Watson's had 354,000. There was even a deepfake video featuring Ivanka Trump:

A "deepfake" porn video of Ivanka Trump can still be found on Pornhub, long after the company said it would scrub its site of such videos.
A "deepfake" porn video of Ivanka Trump can still be found on Pornhub, long after the company said it would scrub its site of such videos.

Image: Pornhub

Though the videos qualify as "nonconsensual" under the platform's own statement, experts believe there is little legal recourse for people who have been featured in deepfake porn.

"There's no pornographic picture of the actual individual being released," Jonathan Masur, a professor who specializes in patent and technology law at the University of Chicago Law School, told Mashable earlier this month. "It's just the individual's face on someone else's body."

Apart from the obvious concerns about non-consenting individuals being shoved into hardcore porn, deepfakes may represent a new frontier in the internet's "fake news" crisis. As the technology improves, video footage of politicians, CEOs, or any other public figure could easily be doctored, leading the public to distrust information no matter where it comes from.

Put another way, if you think your Trump-worshipping uncle is in his own reality on Facebook because of a few Infowars clips, just wait until he has "video evidence" backing up his claims. As the rift between echo chambers widens, people may figure any news could be "fake" — so why believe anything at all?

Porn's a problem, but it's only the opening act of a much more complicated play.
