Detecting Deep Fakes with a Heartbeat

Researchers can detect deep fakes because they don’t convincingly mimic human blood circulation in the face:

In particular, video of a person’s face contains subtle shifts in color that result from pulses in blood circulation. You might imagine that these changes would be too minute to detect merely from a video, but viewing videos that have been enhanced to exaggerate these color shifts will quickly disabuse you of that notion. This phenomenon forms the basis of a technique called photoplethysmography, or PPG for short, which can be used, for example, to monitor newborns without having to attach anything to their very sensitive skin.
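As a toy illustration of the PPG idea (not the researchers’ actual pipeline), the sketch below simulates video frames whose green channel pulses faintly at 1.2 Hz (72 bpm), then recovers that rate from the frame-by-frame average color. All numbers here (frame rate, pulse amplitude, heart-rate band) are illustrative assumptions.

```python
# Minimal PPG sketch: the average green-channel intensity of a skin region
# fluctuates slightly with each heartbeat; a frequency analysis of that
# per-frame average recovers the pulse rate. Parameters are illustrative.
import numpy as np

FPS = 30          # frames per second (assumed)
SECONDS = 10
PULSE_HZ = 1.2    # 72 beats per minute (assumed)

def make_frames():
    """Synthesize frames whose green channel pulses faintly at PULSE_HZ."""
    t = np.arange(FPS * SECONDS) / FPS
    rng = np.random.default_rng(0)
    frames = []
    for ti in t:
        base = np.full((32, 32, 3), 120.0)                        # flat "skin" patch
        base[:, :, 1] += 0.5 * np.sin(2 * np.pi * PULSE_HZ * ti)  # tiny pulse
        base += rng.normal(0, 1.0, base.shape)                    # sensor noise
        frames.append(base)
    return frames

def estimate_pulse_hz(frames, fps=FPS):
    """Recover the dominant frequency of the mean green-channel signal."""
    signal = np.array([f[:, :, 1].mean() for f in frames])
    signal -= signal.mean()                          # remove DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / fps)
    band = (freqs > 0.7) & (freqs < 4.0)             # plausible heart rates
    return freqs[band][np.argmax(spectrum[band])]

print(round(estimate_pulse_hz(make_frames()), 1))   # recovers ~1.2
```

Note how small the pulse is relative to the per-pixel noise: averaging over the whole region is what makes the signal detectable, which matches the intuition that these shifts are invisible to the eye but recoverable statistically.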

Deep fakes don’t lack such circulation-induced shifts in color, but they don’t recreate them with high fidelity. The researchers at SUNY and Intel found that “biological signals are not coherently preserved in different synthetic facial parts” and that “synthetic content does not contain frames with stable PPG.” Translation: Deep fakes can’t convincingly mimic how your pulse shows up in your face.

The inconsistencies in PPG signals found in deep fakes provided these researchers with the basis for a deep-learning system of their own, dubbed FakeCatcher, which can categorize videos of a person’s face as either real or fake with greater than 90 percent accuracy. And these same three researchers followed this study with another demonstrating that this approach can be applied not only to revealing that a video is fake, but also to show what software was used to create it.

Of course, this is an arms race. I expect deep fake programs to become good enough to fool FakeCatcher in a few months.

Posted on October 1, 2020 at 6:19 AM • 26 Comments

Comments

Erik October 1, 2020 6:41 AM

I imagine that the deep fake software does not preserve or simulate PPG only because no one working on that software knew PPG was a thing. I don’t imagine it will be hard to fill that gap.

So yeah, couple of months and that’ll start being fixed.

Winter October 1, 2020 7:20 AM

I think we can go even further: a Captcha-like Turing Test for moving pictures.

Human pupil dilation and microsaccades mirror cognitive working load. When speaking (or behaving in general), people have to think, which is reflected in pupil dilation and eye movements. However, the cognitive load depends on the nature of what is said and done, by whom, and a lot of context.

Extracting the cognitive load from the spoken words automatically will be pretty difficult. But humans do it all the time; it is part of our theory of mind.

I think that the cognitive load related to sentences said will be fairly different between Obama, Trump, Biden and Clinton. And these are not easy to predict automatically.

Hence, this would be a way to see whether the moving pictures were actually produced by a thinking human or not.

Eye tracking cognitive load using pupil diameter and microsaccades with fixed gaze
https://journals.plos.org/plosone/article?id=10.1371%2Fjournal.pone.0203629

me October 1, 2020 7:31 AM

I didn’t know that this kind of amplification existed, and its use to monitor heartbeats is amazing!

Singular Nodals October 1, 2020 7:56 AM

It helps if you play this while training your AI

m.youtube.com/watch?v=Ig3AqgG5DRU

J.C. Checco October 1, 2020 8:57 AM

The Eulerian video method has been around for about 8 years now, and although I first thought it would be employed as a user-verification technique, I applaud its use for other purposes. As deepfake AI algorithms become more advanced, I believe so will the algorithms and use cases around Eulerian methods.

Geoffrey Smith October 1, 2020 10:19 AM

Are there deepfakes so realistic that it isn’t obvious from looking at them? I assumed the challenge in autonomously detecting deepfakes was simply finding an efficient means of doing so, not that finding them at all was a challenge. For example, the eyes and shadows in deepfakes are usually what give them away. It looks like there’s been some research into this.

I can see PPG being used in addition to other methods, but I really doubt it alone is that effective; it likely produces false deepfake positives. It might be computationally cheap and easy to do, though. Like DRM, there will always be a race between detecting deepfakes and software to defeat detection.

Winter October 1, 2020 10:37 AM

@Geoffrey Smith
“Are there deepfakes so realistic that it isn’t obvious from looking at them?”

Yes, search for “Best Deepfake Videos” and you get a very nice collection. Btw, speech synthesis and voice conversion are also pretty good nowadays.

xcv October 1, 2020 10:53 AM

@Winter

“Best Deepfake Videos”

Convince 12 stooges of the suspect’s guilt in court, or simply dispense with the video and dismiss the jury entirely, especially if it is deemed pornographic. After all, any “reasonable person” would vote to convict, and a trial by judge is sufficient to protect the rights of the accused, because the defense has not shown in court that the guilty party’s rights are violated in any way by omitting the formalities of a full jury trial on the particulars of the matter.

Winter October 1, 2020 11:11 AM

@xcv
” a trial by judge is sufficient to protect the rights of the accused, because the defense has not shown in court that the guilty party’s rights are violated in any way by omitting the formalities of a full jury trial”

Not sure what you want to say. However, jury trials are not the standard in the world; trial by jury is actually rather uncommon.
https://www.cairn.info/revue-internationale-de-droit-p%C3%A9nal-2001-1-page-603.htm#

And some of the most trustworthy judicial systems do not use juries.

Thunderbird October 1, 2020 11:59 AM

What about Sylvester Stallone as The Terminator?

I looked at it; thanks for the pointer. Video on the internet always has strange compression artifacts, but that one has a visible outline around the terminator’s head in the initial shot where he’s in silhouette against the sky.

Like Bruce says, it’s an arms race, which means that one side will constantly be leapfrogging the other. The notion of what constitutes evidence of something should vary drastically depending on the trustworthiness of the parties in the discussion. In a purely adversarial situation like court (or the “court of public opinion”) we should be much more skeptical than in situations where people of goodwill are cooperating to discover the truth.

Unfortunately, it’s hard to discern the interest of actors in most cases. We can all think of situations where it turned out that self-interested parties gamed the system while pretending disinterest. Often it doesn’t come to light for many years.

Winter October 1, 2020 12:17 PM

@Thunderbird
” The notion of what constitutes evidence of something should vary drastically depending on the trustworthiness of the parties in the discussion.”

Neither photographs nor videos are evidence without witness depositions from the one who made them.

L. Jeffers October 1, 2020 12:21 PM

El Guapo: Jefe, would you say I have a plethora of photoplethysmographs?
Jefe: A what?
El Guapo: A plethora of photoplethysmographs.
Jefe: Oh yes, El Guapo. You have a plethora of photoplethysmographs.
El Guapo: Jefe, what is a photoplethysmograph?
Jefe: Why, El Guapo?
El Guapo: Well, you just told me that I had a plethora of photoplethysmographs, and I would just like to know if you know what it means to have a plethora of photoplethysmographs. I would not like to think that someone would tell someone else he has a plethora of photoplethysmographs, and then find out that that person has no idea what it means to have a photoplethysmograph.

xcv October 1, 2020 6:31 PM

@Winter

Neither photographs nor videos are evidence without witness depositions from the one who made them.

Right. You need serious money in the bank to make bail, if you’re going to present your side of the case in court, and the cops aren’t going to confiscate all of your notes, files, videos, and photos and file additional charges against you, while you’re trying to fight the original charges in court, for which the documents on record have already been amended or altered.

Winter October 2, 2020 12:15 AM

@xcv
“You need serious money in the bank to make bail, …”

I assume it is your country, so do something about it.

It is pretty useless to complain to me, who thinks the USA is a second/third world country with a corresponding legal system and level of corruption.

Jon October 2, 2020 1:02 AM

@ xcv

Incidentally, in a bunch of experiments involving mock trials, jurors did the WORST while viewing videos. They came to more reasonable judgements by just hearing audio of testimony, and even more reasonable judgements by reading the transcripts, having never set foot in the courtroom.

There are definitely flaws in trial by jury, and video evidence is one of them.

Jon

Clive Robinson October 2, 2020 7:42 AM

@ Geoffrey Smith,

Are there deepfakes so realistic that it isn’t obvious from looking at them?

Whilst there are deepfakes that do look “realistic” that does not mean they are “real”.

What it really means is that our “perception” is faulty in some way, because if what we see is “not real” then there are “tells” that indicate that, and all we have to do is find them.

For instance, consider the “noise” in a video: even though it is “digital,” there will be jitter or other artifacts in the video if the video is real. This is because the video is simply an “analogue sampling gate” followed by a digitizer. I won’t go into the maths of it, but suffice it to say there is no perfect digitizer.

Thus the bulk of the “noise” is actually artifacts of the sampling gate and digitizer, and as such deterministic signals.

These signals will be different for different sampling gates and digitizers.

Thus, by analysing the noise, you can tell where two different video sources have been spliced together, if you make the appropriate measurements.

However, there are other artifacts: even in natural light there are rapid changes in intensity. These too appear in the video signal, and they will differ between two different video sources. Likewise, even in a deep hole underground with artificial light, there will be changes, be it from “mains hum” or from movements of objects in the room changing reflection and absorption patterns.

The problem you have as a “deepfake maker” is getting right what you know about, as well as getting right what you do not know about.

Both the generator (attacker) and the verifier (defender) can spend one heck of a lot of money to try and make the “authentication,” but at the end of the day the advantage lies with the verifier, not the generator.

This is because the generator can only cover for the test methods that are known at the “time of generation,” which is an immovable stake in the ground of the timeline. From that point forward, the verifier can make use of any new methods of verification that were not known at the time of generation.
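Clive’s splice-detection point can be sketched numerically: different cameras leave different noise statistics, so a spliced region can stand out in the high-frequency residual. Below, one half of a synthetic frame carries noticeably noisier “sensor” noise; comparing residual variance per half exposes the splice. The simple box filter and the ratio threshold are assumptions for the sketch, not a forensic-grade method.

```python
# Illustrative noise-residual sketch: subtract a local average to isolate
# high-frequency "sensor" noise, then compare its variance across regions.
import numpy as np

def residual(img, k=3):
    """High-pass residual: image minus a local k-by-k box-filter average."""
    pad = np.pad(img, k // 2, mode="edge")
    smooth = sum(
        pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        for dy in range(k) for dx in range(k)
    ) / (k * k)
    return img - smooth

rng = np.random.default_rng(2)
frame = np.full((64, 64), 128.0)
frame[:, :32] += rng.normal(0, 1.0, (64, 32))   # source A: quiet sensor
frame[:, 32:] += rng.normal(0, 4.0, (64, 32))   # source B: noisier sensor

res = residual(frame)
left_var = res[:, :32].var()
right_var = res[:, 32:].var()
print(right_var / left_var > 4)   # the spliced half stands out: True
```

Real forensic tools use far more sophisticated noise fingerprints (and compression undoes much of this), but the sketch shows why splices from mismatched sources are measurable even when they look seamless.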

marmalade October 2, 2020 10:57 AM

Interesting and completely irrelevant. Most people will believe what they see if it’s convincing. The only solution would be a repeal of Section 230 of the laughably named Communications Decency Act and a requirement for ALL video and photos to be run through a state-of-the-art deep fakes detector before posting.

The comments regarding jury trials also completely miss the point. The US justice system is obscenely expensive. Only celebrities, CEOs, and politicians will be able to afford a libel suit that proves deep fakes are involved. The average person will be forced to accept deep fakes, regardless of the harm. Deep revenge porn, deep queer porn, deep libel, and the like have only just begun.

We’re almost at the end of civilization as we know it.

Winter October 2, 2020 12:42 PM

@marmalade
“Deep revenge porn, deep queer porn, deep libel, and the like have only just begun.”

There was a time when people believed paintings and drawings. There are still people who believe a text because it was written.

A digital image, or movie, is a collection of 1s and 0s. There is no reason to believe they represent reality, or are valid evidence.

Civilization ends when we do not trust each other anymore. The validity of images, moving or not, is unrelated to the end of civilization.

Most people are decent folks, so I think the end of civilization is not near.
(Two recent books show exactly that: “Behave” and “Humankind”; the latter shows “The Lord of the Flies” is dystopian horror, and far from reality.)

marmalade October 2, 2020 1:06 PM

@winter

Whatever. Ask the people falsely accused of child molestation how decent people are. Ask the people fired from their jobs due to posts on their private social media accounts how decent people are.

By the way, I have personal knowledge of someone who was in a public place when weasels took video of him. That video was then used to create a deep queer video with him ostensibly saying that he’d give blowjobs to anyone, anywhere, anytime. I’ll give you three guesses what his life is like now. Putting a stop to it would be impossible, given the whack-a-mole nature of social media.

xcv October 2, 2020 2:37 PM

@Winter

@marmalade
“Deep revenge porn, deep queer porn, deep libel, and the like have only just begun.”

Nobody would give a f*** about any of that sh** if it didn’t always show up in a standard criminal background check: you can’t ever get a job or even purchase a firearm if there’s “deep revenge porn” out on you, because all the prosecutors and gentlemen of the court are spanking the monkey over all that crapola, poring over U.S. Code books for additional charges to file against the victim.

