AI spots deepfake videos of Ukrainian President Volodymyr Zelenskyy

A deepfake detector designed to identify unique facial expressions and hand gestures could spot manipulated videos of world leaders such as Volodymyr Zelenskyy and Vladimir Putin


December 7, 2022

Smartphone video of a real speech by Ukrainian President Volodymyr Zelenskyy

Kristina Kokhanova/Alamy

A deepfake detector can identify fake videos of Ukrainian President Volodymyr Zelenskyy with high accuracy by analyzing a combination of his facial expressions, voice and upper body movements. The system could not only protect Zelenskyy, who was the target of a deepfake attempt during the first months of the Russian invasion of Ukraine, but also be trained to flag deepfakes of other world leaders and business tycoons.

“We don’t have to separate you from a billion people – we just have to separate you from [the deepfake made by] anyone trying to imitate you,” says Hany Farid of the University of California, Berkeley.

Farid worked with Matyáš Boháček at the Johannes Kepler Gymnasium in the Czech Republic to develop the detector, which analyzes faces, voices, hand gestures and upper body movements. Their research builds on previous work in which an AI system was trained on the faces and head movements of world leaders, such as former US President Barack Obama.

Boháček and Farid trained a computer model on more than eight hours of video featuring Zelenskyy that had already been publicly released.

The detection system examines multiple 10-second clips taken from a single video, analyzing up to 780 behavioral characteristics. If it flags multiple clips of the same video as fake, that’s the signal for human analysts to take a closer look.
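The clip-level flagging logic described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers' code: the actual behavioral classifier is not public, so `clip_scores` here stands in for whatever per-clip fake-likelihood scores such a model would produce, and the threshold values are invented for the example.

```python
from typing import List

def flag_for_review(
    clip_scores: List[float],
    fake_threshold: float = 0.5,  # a clip scoring above this is treated as suspect
    min_flagged: int = 2,         # how many suspect clips trigger human review
) -> bool:
    """Return True if enough 10-second clips look fake to warrant analyst review.

    In the system the article describes, each score would come from a model
    trained on up to 780 behavioral characteristics (facial expressions,
    voice, gestures); here the scores are just numbers in [0, 1].
    """
    flagged = sum(1 for score in clip_scores if score > fake_threshold)
    return flagged >= min_flagged

# A video split into five clips, three of which score as likely fake:
print(flag_for_review([0.9, 0.2, 0.8, 0.7, 0.1]))  # True: send to human analysts
```

Requiring several suspect clips, rather than a single one, reflects the article's point that a one-off flag is only a signal for humans to take a closer look, not a verdict.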

“We can say, ‘Ah, what we observed is that with President Zelenskyy, when he raises his left hand, his right eyebrow goes up, and we don’t see that,’” Farid says. “We always imagine there will be humans in the loop, whether they’re journalists or National Security Agency analysts, who have to be able to look at this and ask, ‘Why do you think this is fake?’”

The deepfake detector’s holistic head and upper body analysis is particularly well suited to spotting manipulated video, and could complement commercially available deepfake detectors that primarily focus on less intuitive patterns involving pixels and other image characteristics, says Siwei Lyu at the University at Buffalo in New York, who was not involved in the study.

“So far, we haven’t seen a single example of deepfake generation algorithms that can create realistic human hands and demonstrate the flexibility and gestures of a real human being,” Lyu says. That gives the latest detector an edge over today’s deepfakes, which fail to convincingly capture the links between facial expressions and other body movements when a person speaks, and could help it stay ahead of the rapid pace of progress in deepfake technology.

The deepfake detector achieved 100 per cent accuracy when tested on three deepfake videos of Zelenskyy that altered his mouth movements and speech, commissioned from the Delaware-based company Colossyan, which offers custom videos featuring AI actors. Likewise, the detector performed perfectly against the actual deepfake of Zelenskyy that was published in March 2022.

But the time-consuming training process, which requires hours of video of each person of interest, makes the approach less suited to detecting deepfakes involving ordinary people or non-consensual sexual imagery. “The more futuristic goal would be how to make these technologies work for less exposed people who don’t have as much video data,” says Boháček.

The researchers have already built another deepfake detector focused on spotting fake videos of US President Joe Biden, and they plan to create similar models for public figures such as Russia’s Vladimir Putin, China’s Xi Jinping and billionaire Elon Musk. They intend to make the detector available to select news outlets and governments.

Journal reference: PNAS, DOI: 10.1073/pnas.2216035119
