How to spot an AI deepfake, from Trump’s arrest to the Pope’s puffer coat
The constant flood of information on social media often makes it difficult to discern what is true and what is not, even more so with the recent boom in artificial intelligence (AI).
New generative AI tools such as DALL-E and Midjourney have made it possible for virtually anyone to create imaginary - yet plausible - scenarios with ease.
This trend has given rise to so-called deepfakes: videos or images that convincingly depict someone in a fake scenario.
The impact of this technology ranges from the humorous and silly to political smear campaigns and non-consensual sexual content.
At a time when even visuals can be faked so easily - and so disturbingly realistically - it is crucial to know how to spot deepfakes to avoid misinformation and propaganda. Here are a few tips.
Look out for odd details
It’s important to always analyse the details in images and look out for discrepancies when something feels off.
Manipulated images typically get the lighting wrong, and hands are often digitally mangled. If an image looks more like a painting than a photograph, it is probably fake.
For example, Donald Trump recently shared an image of himself where he is seen kneeling with a spotlight shining directly on him.
However, a closer look at this image shows the AI image generator messed up his hands clasped in prayer. The right one is missing a finger and his thumbs are distorted.
The background is another giveaway: it is not very plausible that the former president would be kneeling alone under a spotlight, with no one around watching him.
Unnatural or blurred-out features
Other deepfake images that recently gained widespread attention include pictures falsely depicting Trump’s arrest - the former president was charged in April over hush money payments to two women who said they had sexual encounters with him - and a memorable (but fake) picture of the Pope wearing a puffer coat.
In the deepfake images of Trump's arrest, the faces of the people depicted often have unnatural skin tones and waxy or blurred-out features - strong indications that the images are fake.
When people’s features aren’t a clear giveaway, a closer look at the image itself can reveal more dubious details.
For example, in the AI-generated image of the pope wearing a white puffy jacket, his glasses are deformed and don’t seem to fit right. Also, a quick look at his right hand reveals that the water bottle he’s holding has a strange shape that makes it look like it has melted.
Anything strange in the background?
Details in the background can be another way of telling that an image is AI-generated.
For instance, in an image showing French President Emmanuel Macron working as a bin man at the height of France’s pension reform strikes, the indecipherable inscriptions on the rubbish bags are giveaways - beyond the surrealism of the situation - that the image is fake.
People living in France might also notice that the pedestrian traffic light in the background looks nothing like one we would find in Paris or any other city in the country.
Botched hands and writing
AI also still struggles to write words correctly. In a false image of Macron being arrested, the word "police" on the helmets and uniforms of the police officers is not written correctly - and Macron’s right hand has six fingers.
Spotting deepfake videos
While there are various ways to spot deepfake images, deepfake videos tend to share two main tells: unnatural eye movements, and audio that is very often out of sync with the person’s mouth movements.
In a 2019 video where Donald Trump is seen denouncing his impeachment proceedings before abruptly asserting that Jeffrey Epstein did not take his own life, we can clearly see that his mouth movements don't match what we hear.
Deepfake videos have also been used to spread propaganda relating to the war in Ukraine.
In one instance, a manipulated video purporting to show Ukrainian President Volodymyr Zelenskyy calling on citizens to surrender to Russia was widely circulated on social media and even briefly relayed on a hacked Ukrainian news site. The video was revealed to be a deepfake that also featured unnatural eye movements.
However, as AI technology improves, it will become harder to detect discrepancies between real and fake content online. For this reason, the most important way to spot deepfakes and avoid misinformation is to fact-check and question the reliability of the sources sharing the images or videos.
It is important to get information from official and trusted sources, such as government agencies and credible news platforms. In most cases, a quick search online, using reliable news sources or fact-checking organisations such as Full Fact or PolitiFact, can help identify whether something is true or not.