New computer program generates eerily realistic fake videos


“The camera never lies” is a thing of the past.

A new computer program can manipulate a video so that the person on screen mirrors the movements and expressions of someone in a different video. Unlike earlier face-swapping software, this program can alter more than just facial expressions. The algorithm, which will be showcased on August 16 at the SIGGRAPH 2018 meeting in Vancouver, also changes head and torso poses, eye movements and background details to create more realistic counterfeits.

These video fakes are “surprisingly realistic,” says Adam Finkelstein, a computer scientist at Princeton University not involved in the work. This system could help produce dubbed movies in which the actors’ lip movements match the voiceover, or even movies featuring dead actors resurrected from old footage, he says. But giving internet users the power to create ultra-realistic fake videos of public figures could also take fake news to the next level (NS: 08/04/18, p. 22).

The algorithm begins by digitizing two videos frame by frame, tracking 66 facial “landmarks” – such as dots along the eyes, nose and mouth – to map a person’s features, expression, head tilt and line of sight. For example, these videos could show former President Barack Obama and Russian President Vladimir Putin. To make Putin mimic Obama’s behavior, the program then distorts Putin’s image to adopt Obama’s head pose, facial expression and eye line in each frame. The program can also adjust shadows, Putin’s hair and the height of his shoulders to suit his new head pose. The result is a video of Putin doing an eerily precise imitation of Obama’s movements and expressions.
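The transfer step described above can be sketched in miniature. This is a hypothetical illustration, not the researchers’ actual implementation: it assumes each frame has already been reduced to a few parameters (head pose, expression, gaze) estimated from the facial landmarks, and shows how a source actor’s parameters could be re-applied to a target’s identity before re-rendering.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class FrameParams:
    """Hypothetical per-frame parameters estimated from facial landmarks."""
    identity: str                       # whose face is rendered
    head_pose: tuple                    # e.g. (yaw, pitch, roll)
    expression: tuple                   # expression coefficients
    gaze: tuple                         # eye-line direction

def transfer(source: FrameParams, target: FrameParams) -> FrameParams:
    """Keep the target's identity, but adopt the source's head pose,
    expression and gaze -- the core idea of the reenactment step."""
    return replace(target,
                   head_pose=source.head_pose,
                   expression=source.expression,
                   gaze=source.gaze)

# One frame: Obama's motion drives the output, Putin's face is preserved.
obama = FrameParams("Obama", (10.0, -5.0, 0.0), (0.8, 0.1), (0.2, 0.0))
putin = FrameParams("Putin", (0.0, 0.0, 0.0), (0.0, 0.0), (0.0, 0.0))
fake = transfer(obama, putin)
print(fake.identity, fake.head_pose)  # Putin (10.0, -5.0, 0.0)
```

Run over every frame of a video, this parameter swap is what would then feed a renderer that also fixes up hair, shadows and shoulders to match the new pose.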

Computer scientist Christian Theobalt of the Max Planck Institute for Informatics in Saarbrücken, Germany, and his colleagues tested their program on 135 volunteers, who watched five-second clips of real and fake videos and indicated whether they thought each clip was genuine. The fake videos fooled, on average, 50% of viewers. But people may have judged the forged footage more critically during the study than they would have if they had encountered these clips online, since they were primed to expect forgeries. Even when participants watched real clips, 20%, on average, still believed the clips were fake.

The new software still has some limitations: the program can manipulate only videos shot by a stationary camera, framed to show someone’s head and shoulders against a static background. And the algorithm can’t change a person’s pose too drastically from the original video. For example, a clip of Putin speaking directly into the camera could not be edited to turn him around, because the software wouldn’t know what the back of Putin’s head looks like.

Still, it’s easy to imagine how this type of digital puppetry could be used to spread dangerous disinformation. “The researchers who are developing this stuff are getting ahead – in the sense that now [this algorithm] exists and people are more aware of the types of manipulation possible,” says Kyle Olszewski, a computer scientist at the University of Southern California in Los Angeles. This may encourage people to treat videos on the internet with more skepticism, he says.

“Learning to do this kind of manipulation is [also] a step towards understanding how to detect them,” says Olszewski. A future computer program, for example, could study both real and fake videos to learn to spot the difference.


Gordon K. Morehouse