A demonstration of how easy it is to spoof video of a world leader recently made headlines, foretelling a future where robot-created videos cause political and financial havoc. But now comes word of an antidote. On Monday, a group of computer scientists from Carnegie Mellon University’s Software Engineering Institute published new research showing how algorithms can tell whether the person on-screen has a human heartbeat. The technique could help future intelligence analysts, journalists, or just scared television viewers tell the difference between spoofed video and the real thing.
In case you missed the original news about AI-created fake video, last month a team of researchers at the University of Washington revealed a tool that can change footage of someone’s face to match an audio clip, making it look like the person is saying things they aren’t.
Here’s a rather disconcerting demonstration:
“Combine a tool like this with technology that can recreate anyone’s voice using just a few minutes of sample audio and you’d be forgiven for thinking there are scary times ahead,” said The Verge.
“This is the future of fake news. We’ve long been told not to believe everything we read, but soon we’ll have to question everything we see and hear as well,” lamented The Guardian.
What AI giveth, AI taketh away. In a blog post, the Carnegie Mellon researchers describe how they were able to determine whether the person on screen has a real pulse: the technique magnifies thousands of frames of video and feeds them to a machine-learning program.
The researchers built on 2012 work by an MIT team that pioneered a technique called Eulerian Video Magnification. As they explain in this seminal paper on the topic, the MIT crew was able to “visualize the flow of blood as it fills the face and also to amplify and reveal small motions. Our technique can run in real time to show phenomena occurring at temporal frequencies selected by the user.”
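The core Eulerian idea can be sketched in a few lines of NumPy. This is an illustrative sketch, not the MIT or Carnegie Mellon code; the function name, the simple FFT-based bandpass, and the amplification factor are all assumptions made here for clarity:

```python
import numpy as np

def eulerian_magnify(pixels, fps, lo_hz, hi_hz, alpha=50.0):
    """Amplify subtle temporal changes in a fixed-camera clip:
    bandpass each pixel's intensity over time, then add the
    amplified bandpassed signal back onto the original frames.
    `pixels` is a (time, height, width) float array."""
    freqs = np.fft.rfftfreq(pixels.shape[0], d=1.0 / fps)
    spectrum = np.fft.rfft(pixels, axis=0)
    keep = (freqs >= lo_hz) & (freqs <= hi_hz)
    spectrum[~keep] = 0  # discard frequencies outside the band of interest
    bandpassed = np.fft.irfft(spectrum, n=pixels.shape[0], axis=0)
    return pixels + alpha * bandpassed

# A 10-second, 30 fps "clip" of a flat gray patch whose brightness
# pulses almost invisibly at 1 Hz; magnification makes the pulse obvious.
fps = 30
t = np.arange(fps * 10) / fps
clip = 100.0 + 0.01 * np.sin(2 * np.pi * 1.0 * t)[:, None, None] * np.ones((1, 4, 4))
magnified = eulerian_magnify(clip, fps, lo_hz=0.5, hi_hz=2.0, alpha=50.0)
```

The band 0.5–2.0 Hz covers resting heart rates; amplifying only that band is what lets the blood-flow flush show up without blowing up the rest of the image.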
In 2012, the trick only worked in a lab setting. The Carnegie Mellon team has made it work in real time.
They applied some of those techniques, such as magnifying blood vessels to reveal the pulse and amplifying signals in certain areas. They focused on 68 “facial landmarks,” the parts of the face most likely to yield pulse data. Chewing through thousands of frames of video, graphics processing units detect and measure a pulse.
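The frequency-analysis step at the heart of that pipeline can be shown in miniature. The landmark detection and GPU processing are out of scope here; this sketch (with a synthetic intensity trace standing in for a real face region, and a function name invented for the example) just recovers a heart rate from a region’s brightness over time:

```python
import numpy as np

def estimate_bpm(signal, fps, lo_hz=0.7, hi_hz=4.0):
    """Estimate heart rate from a region's mean pixel intensity over time.

    Detrend, take the FFT, and pick the dominant frequency inside the
    plausible human heart-rate band (roughly 42-240 beats per minute).
    """
    signal = signal - np.mean(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    peak_hz = freqs[band][np.argmax(power[band])]
    return peak_hz * 60.0

# Synthetic stand-in for real video: 10 seconds at 30 fps of a skin
# region whose brightness pulses at 1.2 Hz (72 BPM) under noise and drift.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
rng = np.random.default_rng(0)
trace = (0.5 * np.sin(2 * np.pi * 1.2 * t)
         + 0.05 * t
         + 0.2 * rng.standard_normal(t.size))
bpm = estimate_bpm(trace, fps)
```

Restricting the search to the plausible heart-rate band is what keeps lighting drift and camera noise from masquerading as a pulse.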
Result? “We tested our tool on spoofed facial videos, and we found that our tool was able to detect abnormalities that indicated spoofing—for example, in sampling different sections of a subject’s face in a spoofed video, we found widely varying heart rates. On an original video, those regions were much more consistent,” they write.
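That consistency check, stated as an algorithm, is simple: estimate a pulse independently for several face regions and flag the video if the estimates disagree too much. A minimal sketch follows; the function name and the 8-BPM spread threshold are illustrative choices for this example, not values from the researchers’ paper:

```python
import numpy as np

def looks_spoofed(region_bpms, max_spread=8.0):
    """Heuristic in the spirit of the CMU result: pulse estimates from
    different patches of a real face should roughly agree, while
    composited or synthesized faces tend to disagree region to region."""
    bpms = np.asarray(region_bpms, dtype=float)
    return float(bpms.max() - bpms.min()) > max_spread

print(looks_spoofed([71, 73, 72, 70]))   # consistent regions -> False
print(looks_spoofed([55, 96, 70, 120]))  # widely varying rates -> True
```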
Faked news footage could be a serious national security concern, as evidenced by a May incident in which hackers attacked QNA, the Qatari state-run news agency. The hackers, now suspected of links to the government of the UAE, uploaded a fake video to QNA’s website. The footage was real, but the text at the bottom of the video falsely reported that Qatar’s leader, Sheikh Tamim bin Hamad Al Thani, had boasted of Qatar’s strong ties to Iran. The video contributed to a major diplomatic row that continues to play out. The message: fake news can have real consequences. But it is stoppable.
The Carnegie Mellon authors note the obvious applications for the Pentagon: “The Department of Defense is increasingly relying on biometric data, such as iris scans, gait recognition, and heart-rate monitoring to protect against both cyber and physical attacks,” they note. “Current state-of-the-art approaches do not make it possible to gather biometric data in real-world settings, such as border and airport security checkpoints, where people are in motion.”
The Air Force Research Lab has funded similar research to detect soldier and pilot stress simply from a camera or video feed.
Of course, pulse data can also indicate stress, which may signal deceit. So not only is it possible to determine if a video of a politician is fake or real, it’s also possible, at least in theory, to tell if that politician is lying.