The Future of Forgeries

This article argues that AI technologies will make image, audio, and video forgeries much easier in the future.

Combined, the trajectory of cheap, high-quality media forgeries is worrying. At the current pace of progress, it may be as little as two or three years before realistic audio forgeries are good enough to fool the untrained ear, and only five or 10 years before forgeries can fool at least some types of forensic analysis. When tools for producing fake video perform at higher quality than today’s CGI and are simultaneously available to untrained amateurs, these forgeries might comprise a large part of the information ecosystem. The growth in this technology will transform the meaning of evidence and truth in domains across journalism, government communications, testimony in criminal justice, and, of course, national security.

I am not worried about fooling the “untrained ear,” and more worried about fooling forensic analysis. But there’s an arms race here. Recording technologies will get more sophisticated, too, making their outputs harder to forge. Still, I agree that the advantage will go to the forgers and not the forgery detectors.

Posted on July 10, 2017 at 6:04 AM

Comments

ThaumaTechnician July 10, 2017 6:17 AM

This will make Helen Nissenbaum’s ‘Privacy as Contextual Integrity’ even more essential reading.

In a society where everything about you is known and where your voice, your handwriting, and your writing style can be easily forged, you can’t establish your identity, because anyone else can do anything you would do and know anything you would know.

By having some/most of your life private and unknown except to a very few, you have some secrets that can be used to verify who you are.

Will July 10, 2017 6:27 AM

So if it’s in the hands of amateurs tomorrow, presumably it’s within the grasp of professionals today?

For argument’s sake, if Oceania or Eastasia wants to forge a moon landing, would the footage they release as proof be indistinguishable from reality?

Me July 10, 2017 8:07 AM

@Will

I read somewhere that back in 1969 the technology to fake a moon landing didn’t exist, but the technology to actually land there did.

Today, it is the opposite.

Josh July 10, 2017 8:08 AM

I’d say that fooling the “untrained ear” is just as big a concern. If someone can easily fake a video or audio clip of [political candidate of your choice] that is convincing enough for HuffPo or Fox News to pick up, the outlet might discover that it’s a fake later and publish a retraction buried on its website, but the majority of its readers will have already internalized it.
People still believe that Palin said she could see Russia from her house, and that wasn’t even meant to be a convincing fake.

David Rudling July 10, 2017 8:21 AM

ThaumaTechnician is spot on.
At the moment, we use secure private cryptographic keys to secure what we say and, in some cases, do. The time is arriving when we will need something like a secure private “cryptographic” key to secure who we are, not just what we say or do. I mean something well beyond current digital signing or the private key of public-key cryptography. It all makes the case for secure cryptography without government (i.e., everybody else) access that much more compelling.
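
(A minimal sketch of the kind of attestation being described, assuming the pyca/cryptography library. Key generation, tamper-resistant storage, and binding keys to persons, the genuinely hard parts, are out of scope here.)

```python
# Hypothetical illustration: sign a recording with a private key so that
# anyone holding the public key can detect later tampering.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # would live in secure hardware
public_key = private_key.public_key()

recording = b"raw bytes of an audio or video capture"
signature = private_key.sign(recording)

public_key.verify(signature, recording)     # passes silently: untampered
try:
    public_key.verify(signature, recording + b"tampered")
except InvalidSignature:
    print("recording was altered after signing")
```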

Richard Cook July 10, 2017 8:36 AM

Orwellian

Under the Wikipedia entry for “Orwellian”, in the “See also” list, we find:

  • Bibliography of George Orwell
  • Mass surveillance
  • National security
  • Doublespeak Award
  • Alternative Facts

Remarkably, Orwell published 1984 in 1949, nearly seventy years ago. The book is an apt expression of the social consequences of totalitarianism, including the pervasive presence and intrusiveness of privacy-destroying technology.

Roy Jensen July 10, 2017 8:55 AM

A secure cryptographic key, as mentioned by @David Rudling, with an embedded GPS element is the next step: something all cameras must encode into their audio and video. It too would be hackable and possibly forgeable, but that seems to be the direction this is going. Many digital cameras have this now as an option. Perhaps limiting the mandatory nature of this to audio and video is too narrow a scope; we should be pushing for all data to be tattooed with a unique point-of-origin cryptographic GPS key, something beyond the theoretical bit strength we now know. A new futuristic forensic breadcrumb to tie every single piece of data, text files, audio recordings, and video, to a point in time and a point in space; perhaps a blockchain of keys to show its path from point of origin to wherever it is now. Sounds Orwellian (@Richard Cook).
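
(A toy sketch of that “blockchain of keys” idea. Each custody record hashes the previous one, so the path from the point of capture to the present can be checked link by link. The record layout and the GPS field are illustrative assumptions, not any real standard.)

```python
import hashlib
import json
import time

def make_record(data_hash: str, gps: str, prev_hash: str) -> dict:
    """Create one link in the chain of custody for a piece of media."""
    record = {"data": data_hash, "gps": gps,
              "time": time.time(), "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    return record

def verify_chain(chain: list) -> bool:
    """Recompute every hash and check each link points at its predecessor."""
    for i, rec in enumerate(chain):
        body = {k: v for k, v in rec.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["hash"] != expected:
            return False                      # record was altered
        if i > 0 and rec["prev"] != chain[i - 1]["hash"]:
            return False                      # chain was spliced
    return True

capture = make_record("sha256-of-video-bytes", "47.6N,122.3W", prev_hash="")
transfer = make_record("sha256-of-video-bytes", "47.6N,122.3W", capture["hash"])
assert verify_chain([capture, transfer])
```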

Khavren July 10, 2017 9:14 AM

What will life be like in a post-scarcity creative-arts economy, when anyone can seamlessly replicate and edit content? Would you like original Star Wars? All-Nintendo-character Star Wars? Star Wars where the Death Star is blown up by the Borg?

Jarrod Frates July 10, 2017 9:58 AM

In one of the sourcebooks for R. Talsorian’s Cyberpunk 2020 tabletop role-playing game, hacker extraordinaire Rache Bartmoss describes people’s reliance on what they see and the ability to make it look perfect, citing as an example that most Americans in 2023 believe Richard Nixon left office by committing suicide on national TV, because they’ve seen the video. We’re going to have some real problems getting around that.

The only way around it that I can see is going to be to digitally tag everything with a centrally-trusted ID from tamper-resistant hardware heretofore only available to governments (and almost certainly issued by them). We’re going to get universal signing, alright, but not because the populace demands it. It would have an enormous impact on privacy, and would cause significant clashing in the courts and possibly on the streets.

Daniel July 10, 2017 10:05 AM

@Bruce: “I am not worried about fooling the “untrained ear,” and more worried about fooling forensic analysis.”

I disagree. The use of technology to make propaganda more intimate and more convincing is more corrosive to personal autonomy than a few criminal actors are ever going to be. This is the underlying mistake of Helen Nissenbaum’s work. Integrity and identity mean nothing when someone else defines your identity. Privacy in the sense she means it effectively ingrains propaganda more deeply into the individual; privacy in her conception is not a path to freedom but the perfection of totalitarianism.

The whole way that privacy advocacy is argued today is deeply misguided. Advocates fret about how technological privacy can be exploited by the 1% of the population who are criminal actors, but ignore the fact that modern technology is an exceptionally efficient mechanism for popularizing heroes and denigrating enemies. We need look no further for evidence of that assertion than Twitter and Trump. Seriously, what harm is there truly in someone’s identity being stolen when that individual’s identity consists of nothing more than the parroting of an empty media head? I have no more interest in protecting the “contextual integrity” of such a person than I do in protecting the man in the moon. Violating such a person’s contextual integrity is an absolute prerequisite for any hope of consciousness-raising. Stealing identities is an act of public service: telling the aping emperor he has no clothes.

gwern July 10, 2017 10:10 AM

But there’s an arms race here. Recording technologies will get more sophisticated, too, making their outputs harder to forge.

One worrying aspect of GANs in this context is that they are literally built on a good forgery detector (the discriminator/critic NN). If you come up with any automated way of detecting a forgery, or of distinguishing generated outputs from higher-resolution real ones, you can simply throw it into the GAN training loop and wait for the generator to improve its outputs and win. (If you’re relatively patient, you can do the training with human forgery detectors too.)
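
(A minimal sketch of that training loop, assuming PyTorch. The toy networks and random “recordings” are placeholders, not any real forgery system; the point is only the structure: whatever detector you have becomes the discriminator, and the generator is optimized directly against it.)

```python
import torch
import torch.nn as nn

latent_dim, feature_dim = 64, 128

generator = nn.Sequential(          # maps noise -> candidate "forgery"
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, feature_dim))

detector = nn.Sequential(           # stand-in for an automated forgery detector
    nn.Linear(feature_dim, 256), nn.ReLU(),
    nn.Linear(256, 1))              # logit: real vs. forged

bce = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(detector.parameters(), lr=2e-4)

for step in range(1000):
    real = torch.randn(32, feature_dim)          # placeholder for real recordings
    fake = generator(torch.randn(32, latent_dim))

    # Detector update: learn to separate real from forged.
    d_loss = bce(detector(real), torch.ones(32, 1)) + \
             bce(detector(fake.detach()), torch.zeros(32, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator update: make forgeries the detector labels "real".
    g_loss = bce(detector(fake), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```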

Anon July 10, 2017 11:11 AM

The problem is not that video can be easily forged. The problem is if people believe video even though it can be easily forged.

Before the invention of photography, all we had were drawings and written or oral accounts. These are trivial to “forge”. Yet people managed somehow, and it really wasn’t that bad. In any case, photography and other definitive technical records have made much less difference than many suppose. People still get away with on-camera murders.

In any case we can certainly learn about the future by looking at the past.


I don’t think private keys can solve very much.

You first need to decide what identity is. If identity is knowing a key (more realistically, knowing where a key is hidden) then private keys solve the problem. And a key can’t be stolen or lost, since identity is the key.

If identity is to be something else, and most people think it should be, keys are at best a secondary id. You still need to decide what identity is, and how to establish it. This is of course nothing new, since there are many people who believe you are your documents, rather than that your documents are evidence of who you are.

Rachel July 10, 2017 12:21 PM

@ Anon
[By the way, Mr. Schneier requests people not use that as a handle; it’s far too generic.]

‘The problem is not that video can be easily forged. The problem is if people believe video even though it can be easily forged…’

Nice comments. We don’t need to look to the future and AI. Hollywood brings us advances in cinema technology; simultaneously, the television is screening videos of ‘protests’ in the Middle East, announcements by a variety of people claiming to be Osama Bin Laden, videos of ‘his’ ‘capture’, the ‘assassination’ of ‘Saddam Hussein’, etc. OpSec includes awareness of counter-propaganda.

David July 10, 2017 12:43 PM

What’s alarming is that financial institutions such as Vanguard and Fidelity have moved to offering voice recognition as a way to authenticate oneself when calling into their automated systems.

I find this a step in the wrong direction and a playground for attackers. The general public will think it’s cool and move towards adoption. There needs to be a way to get financial institutions to wake up to the ease of forging somebody’s voice and to remove these methods before it’s too late. Maybe an act of Congress to forbid it via legislation?

Ninja July 10, 2017 3:18 PM

At some point, the use of whatever means to forge ‘evidence’ of anything is going to backfire massively into people not believing anything. When you have a total collapse of trust, you have a problem. We are already seeing an erosion in trust: in the past you could mostly trust a man to follow his word; now you need signed, registered documents. This is just the next step. What happens after should be interesting to watch.

neill July 10, 2017 3:23 PM

Besides crypto-signing pics and videos, the solution might be looking at the numbers.

There are billions of imaging chips out there, and while it might be possible to forge one or two videos, you won’t be able to access and forge ALL of them.

So it’s, e.g., 1 forged vs. 5 unmodified videos of the same event.
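
(A toy numpy illustration of that corroboration idea: if several independent recordings of the same event agree and one does not, flag the outlier. A real system would compare time-aligned frames; the random feature vectors here are stand-ins.)

```python
import numpy as np

def flag_outliers(features: np.ndarray, threshold: float = 2.0) -> list:
    """features: one row of summary features per source recording."""
    consensus = np.median(features, axis=0)          # robust "majority" view
    distances = np.linalg.norm(features - consensus, axis=1)
    cutoff = threshold * np.median(distances)
    return [i for i, d in enumerate(distances) if d > cutoff]

sources = np.random.normal(0, 0.1, size=(5, 32))     # five consistent recordings
forged = sources[0] + np.random.normal(0, 2.0, 32)   # one doctored copy
suspects = flag_outliers(np.vstack([sources, forged]))
print("suspect sources:", suspects)                  # likely [5]
```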

albert July 10, 2017 3:54 PM

From the article:

“…Unless government and business leaders seriously face this challenge, we will have to live in a society where there is no ultimate arbiter of truth….”

I’d say we’re there now.

Reliance on technology isn’t the answer, and certainly our technically-challenged leaders have been proven incapable and/or unwilling to rise to the challenge.

I’ve written off the MSM as ‘arbiters’ of Truth. They only serve as an echo chamber for corporately-controlled government puppets.

I’ve written off ‘Social Media’. I view them as fads, not information sources.

That being the case, why worry about who said what? What difference does it make?

We live in a Vast Digital Wasteland. It’s a matter of opinion, but it seems to me that 90%* of the digital information available online today is useless BS. Perhaps we have reached the point of diminishing returns; where the effort to extract, vet, and digest the information exceeds the value of the information.

Nevertheless, +1 for everyone who at least tries to discuss the issue. I have other windmills to tilt at.

“Everybody’s stupid” – Chuck Mertz, thisishell.net


*Excluding reference sources.

. .. . .. — ….

Joshua Bowman July 10, 2017 4:18 PM

It’s not just that forgery will get easier; competent forgers will make the quality as good as they can, then deliberately degrade the whole thing to pass it off as a cell-phone or dash-cam recording or some other mediocre medium, then upload it to YouTube, which further degrades the quality and automatically strips out any way to authenticate anything. I’d say the technology to forge YouTube uploads of badly lit consumer-camera footage is already here for anyone motivated enough to do the grunt work, and that’s a much more pressing problem than being able to forge 4K video. It’s only going to get easier, too.
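
(A rough Pillow sketch of what that laundering step looks like, and what detectors are up against: downscale, add sensor-style noise, and recompress as a low-quality JPEG so the fine detail that forensic tools rely on is discarded. Filenames and parameters are illustrative.)

```python
from PIL import Image
import numpy as np

frame = Image.open("forged_frame.png").convert("RGB")

# Downscale and re-upscale to mimic a cheap sensor/transcode pipeline.
w, h = frame.size
frame = frame.resize((w // 4, h // 4)).resize((w, h))

# Add mild Gaussian noise, as a phone camera would in low light.
arr = np.asarray(frame).astype(np.float32)
arr += np.random.normal(0, 6, arr.shape)
frame = Image.fromarray(arr.clip(0, 255).astype(np.uint8))

# Heavy JPEG recompression discards the fine detail detectors rely on.
frame.save("degraded_frame.jpg", quality=30)
```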

Winter July 10, 2017 4:31 PM

“At some point the use of whatever means to forge ‘evidence’ of anything is going to backfire massively into people not believing anything. When you have a total collapse of trust you have a problem. ”

This is not the future; this is an apt description of last year’s elections.

As always, those who trust no one end up trusting the Pied Piper of Hamelin. People do not even need evidence to storm into a pizza joint, armed with a gun, to search out a conspiracy.

tyr July 10, 2017 5:49 PM

@Rachel

The most dangerous forgeries take place in the heads of humans. Photography made it quite apparent when Lowell’s camera saw no canals on Mars. He had been seeing them, and drawing what he saw, for years.

Naive realism tells you that what you can see is real. A quick glance at optical illusions shows you that your internal mechanisms are so faulty they cannot be trusted without verification. Society has constructed itself on the basis of false notions like a consensus-based, TV-formed reality, but that is all coming apart as those who ‘manufacture consent’ have allowed the curtain to slip, with the Wizard of Oz in full view now.

The real conspiracies do not think of themselves as conspiracies; they think of themselves as the pillars of the community, guiding us for our own good, since we couldn’t possibly stand to know the truth. The worst falsehood is the notion that a community or public exists as something real. An economist proved mathematically that demand is not an aggregate. Likewise, the idea that there is some aggregated mass called the public is also false.

If you dig back, you’ll find Bernays, Lippmann, and the Creel Committee under the rock, where they discovered how to fake the news for fun and profit.

Do not ever assume culture or ideology is true or that it has your best interests as a goal.

The VR we live in gets more interesting every day... : ^ )

Tim#3 July 11, 2017 6:30 AM

I suspect that one likely and lucrative area for this to happen very soon will be dashcam videos being modified in order to make false insurance claims. There are already many fake dashcam videos around that were created for humorous purposes; no doubt people with the appropriate skills are using them for other purposes too.

What kinds of software, hardware, and user skill does it take to, say, cleanly edit a video to add or remove a vehicle?

cine July 11, 2017 6:37 AM

Recording technologies will get more sophisticated, too, making their outputs harder to forge. Still, I agree that the advantage will go to the forgers and not the forgery detectors.

The reasons for this aren’t obvious to me. For a forgery to be undetectable, it needs to be perfect; any one mistake can be detected. And every new feature of the recording device needs to be forged perfectly to keep the forgery undetectable. This seems to give the advantage to detectors. This would be especially true if the makers of recording devices actively tried to make forging hard.

This seems like the opposite of many security cases, where attackers can exploit any flaw in the security but defenders need to make sure there are none, something that gives the advantage to the attackers.

Dan H July 11, 2017 6:44 AM

@Josh

You’re perpetuating fake news yourself almost a decade later. Sarah Palin NEVER said she could see Russia from her house, Tina Fey said that in a Saturday Night Live skit in a mockery of Palin.

What Sarah Palin actually said was “They’re our next-door neighbors. And you can actually see Russia, from land, here in Alaska, from an island in Alaska.”

W.C. Brown July 11, 2017 7:53 AM

I wrote a funny piece about this in my first book. The A.I. needs to get two people arrested, so it creates a “surveillance” video of them stealing cheese from the refrigerator of a judge.

David July 11, 2017 8:36 AM

@Dan H, I believe you and @Josh are actually on the same page. Josh was merely saying that [some] people still make that attribution error, even though Fey’s impression wasn’t meant to be a convincing fake intended to dupe the populace. I think his point was the bar is set very low to convince/fool the general public.

wumpus July 11, 2017 10:02 AM

Depends what “court” the forensics are used in.

The comments above assume media publishers are the ones using forensics to decide which “proofs” should drive the narrative of the story. Judging from what I’ve seen, forensics will be pretty much the bottom of the barrel (didn’t help Dan Rather much).

In a court of law, I’d imagine that forensics will match how they are typically used: prosecutors will hire forensics “experts” based on their conviction rate. If the defense can’t afford a similar budget, conviction is a certainty.
http://bostonreview.net/books-ideas/nathan-robinson-forensic-pseudoscience-criminal-justice

Impossibly Stupid July 11, 2017 10:57 AM

I don’t think you can just look at the end product and make a “forgery” judgement. Like other security issues, there is a whole process involved when content is produced or edited or distributed. Every step in the process introduces artifacts that can make it easier or harder to detect that something was changed. That puts a lot of power in the hands of the original producer, even in the era of perfect digital copies.

trsm.mckay July 11, 2017 4:14 PM

Nice discussion. I have thought about this a number of times, but this triggered new thoughts. I am concerned about fooling non-experts as well, though I agree with @Anon (July 10, 2017 11:11 AM) that this is not a new problem for human culture. I also need to add the (almost mandatory) reference to Brin’s “The Transparent Society”.

New to my thinking is the relationship between “intentionality” and “authority”. Assume that there is widespread surveillance and that there are excellent forgery tools. I am on the same path as @Anon and @Jarrod: we may end up with some type of official assertion mechanism (historically, signing with witnesses). I have been thinking about this since the early days of “legal” digital signatures, and about making sure that your agreement depends upon more than just being able to (mis)access the device that can “digitally sign” things.

Originally I had thought this would occur more like it did in the past: something distributed and fairly individual, like a notary public or “fair witness”. But there may be some centralized verification (the transaction was recorded by the official surveillance mechanism). Withholding official sanction from people out of favor with the government has plenty of historical precedent (bans on interracial marriage and usury laws come to mind). Does anything change when this withholding is done by denying the use of “signing keys” (or some similar crypto stamp of approval)?

Another aspect of intentionality is recognizing the difference between casual discussion and specific intent. Brin’s book brings this up in the context of lack of privacy, hoping that people become less interested in things that largely used to happen behind closed doors. The dystopian government scenario is making casual discussions and actions criminal. This already happens in authoritarian cultures, where even casual questioning of leadership can result in severe punishment. What will happen if thorough surveillance is achieved?

Chris Zweber July 11, 2017 4:28 PM

The technology already exists.

Lyrebird.io – create forged voice clips with only 1 minute of audio from your target

Face2Face (Stanford) – create forged video clips of people speaking

Combine the two.

This is going to seriously undermine the current justice system, but in the future there will be so much redundant evidence of major crimes.

Yeah, you can claim the cops forged your wiretap conversation, but good luck when it’s just one piece of evidence in tandem with pervasive, constant drone surveillance, evidence subpoenaed from your smart devices, etc.

Wael July 11, 2017 6:16 PM

@Bruce,

I am not worried about fooling the “untrained ear,” and more worried about fooling forensic analysis.

I’m not sure whether we should or shouldn’t be worried.

Still, I agree that the advantage will go to the forgers and not the forgery detectors.

Double-edged sword. Forgers could be TLAs and forgery detectors could be defense lawyers. Who has better forensic analysts, and who has better forgers? Who’s playing defense and who’s playing offense? Should we be more worried in one situation than the other?

Godel July 11, 2017 7:49 PM

Today: “University of Washington researchers have now created a system that converts audio clips into lip-synced videos of the speaker.

In order for the system to work, it needs to analyze approximately 14 hours of existing footage of the person speaking – the researchers are hoping to reduce that figure significantly, perhaps down to one hour. Utilizing a neural network, it learns which of their mouth shapes accompany which speech sounds.”

http://newatlas.com/lip-synced-videos/50442/

Nick P July 11, 2017 8:52 PM

@ Bruce Schneier

It’s highly likely that they’ll be able to forge evidence in court, maybe even using today’s deep-learning technology. Just look at the more realistic output of this program, which generates cat faces from random noise using patterns it learned by looking at cat faces.

Meow Generator

One online commenter also saw immediate value in that project:

“A dream of cat video clickfarmers came true.”

Clive Robinson July 12, 2017 6:58 AM

@ Tim#3,

I suspect that one likely and lucrative area for this to happen very soon will be for dashcam videos being modified in order to make false insurance claims.

I suspect that it will not happen for very long for a couple of reasons.

Firstly, insurance companies are getting politicians to legislate against payouts. Have a look at the UK legislation changes on whiplash claims.

Secondly, there is the other-witness problem. Making false claims often relies either on there being no contradictory witnesses or on supplying fraudulent ones.

Modern cars now have logging sensors for things like GPS, speed, mileage, and vibration; commercial vehicles have tachographs; and other drivers have dash cams. That is, there are a lot of sources available for cross-referencing that are likely to pop up.

But there is also a trend for insurance companies to settle claims “knock-for-knock” the minute there is an allegation that evidence is false. That way they outsource the issue to either the police or the courts, and most claimants will not bother to pursue it, especially when they find out the cost of legal action…

J. Peterson July 12, 2017 3:18 PM

I recall a lecture given by Hany Farid where he gave an overview of his techniques for detecting digital photo tampering. After the lecture, somebody in the audience congratulated him for “solving the problem” of detecting forged digital photos.

Farid just laughed, and said “I haven’t solved the problem – I’ve just started an arms race.”

Livin' Large July 12, 2017 8:46 PM

It’s quite possible artificial intelligence will advance to such a state of realism that you’ll see Humphrey Bogart starring with Marilyn Monroe and Paul Newman in films of their own making. The director could be Federico Fellini, with music performed by Bach, Bonham, Squire, and Hendrix. These virtual actors/musicians/directors will generate their own content. Copyright will be turned on its head.

Aidan Herbert July 13, 2017 12:06 AM

…would the forged virtual experience be better than the real virtual experience?

Aidan Herbert July 13, 2017 11:31 AM

…if the advantage goes to the forgers, are we doomed to a future of fake virtual reality?
