The Risks of Mandating Backdoors in Encryption Products

Tuesday, a group of cryptographers and security experts released a major paper outlining the risks of government-mandated back-doors in encryption products: Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications, by Hal Abelson, Ross Anderson, Steve Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter Neumann, Ron Rivest, Jeff Schiller, Bruce Schneier, Michael Specter, and Danny Weitzner.

Abstract: Twenty years ago, law enforcement organizations lobbied to require data and communication services to engineer their products to guarantee law enforcement access to all data. After lengthy debate and vigorous predictions of enforcement channels going dark, these attempts to regulate the emerging Internet were abandoned. In the intervening years, innovation on the Internet flourished, and law enforcement agencies found new and more effective means of accessing vastly larger quantities of data. Today we are again hearing calls for regulation to mandate the provision of exceptional access mechanisms. In this report, a group of computer scientists and security experts, many of whom participated in a 1997 study of these same topics, has convened to explore the likely effects of imposing extraordinary access mandates. We have found that the damage that could be caused by law enforcement exceptional access requirements would be even greater today than it would have been 20 years ago. In the wake of the growing economic and social cost of the fundamental insecurity of today’s Internet environment, any proposals that alter the security dynamics online should be approached with caution. Exceptional access would force Internet system developers to reverse forward secrecy design practices that seek to minimize the impact on user privacy when systems are breached. The complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws. Beyond these and other technical vulnerabilities, the prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law.

It’s already had a big impact on the debate. It was mentioned several times during yesterday’s Senate hearing on the issue (see here).

Three blog posts by authors. Four different news articles, and this analysis of how the New York Times article changed. Also, a New York Times editorial.

EDITED TO ADD (7/9): Peter Swire’s Senate testimony is worth reading.

EDITED TO ADD (7/10): Good article on these new crypto wars.

EDITED TO ADF (7/14): Two rebuttals, neither very convincing.

Posted on July 9, 2015 at 6:31 AM101 Comments

Comments

Sergiy July 9, 2015 8:11 AM

I keep wondering why anyone feels this is a possibility – that authorities might institute restrictions on the use of encryption, that such a strong and concerted response is necessary. What is the source of pessimism you expressed previously? If you let them, the gov’t will do anything? If this is infeasible, why worry?

Dave Oldcorn July 9, 2015 8:24 AM

Ross Anderson’s comment on whichever idiot who sold this dog of an idea to the PM needing a robust rollocking seemed particularly prescient to me.

Bob S. July 9, 2015 9:04 AM

Comments by high level military and police officials seem aimed at gaining legal protections for disabling encryption to allow MASS SURVEILLANCE of electronic data.

That leads me to believe currently military and police agencies are conducting indiscriminate warrant-less, groundless collection, review and storage of personal data of American citizens as well as, of course, everyone else.

The harder they push to corrupt encryption, the harder we should push to encrypt everything.

I agree with the view personal electronic data is tangible property and therefore subject to constitutional protections. Unfortunately, only a handful of critical elected officials and judges hold that same view.

Jayson July 9, 2015 9:15 AM

The probability of such a proposal getting passed is extremely low. Doubly so since there doesn’t appear to be much corporate money backing the proposal and it would effectively kill American technology companies as consumers left in droves.

This appears to be happening for political reasons. It continues sending the message that encryption is bad and to create a chilling effect on its use. Further, by taking the offense, it diverts resources and attention away from their ongoing dubious activities.

Anon July 9, 2015 9:40 AM

Part of the problem is that back-door is such a vague term that it’s not clear what people are actually debating. Let’s suppose for a moment that the UK government wanted the root certificate from a CA. I don’t think that would make the internet much less secure. While in theory, a foreign intelligence agency could hack GCHQ, recruit a GCHQ employee, try to get an operative hired at GCHQ, or physically break into GCHQ, I think it would be 100x easier for the foreign government to go the direct route and hack the commercial CA, recruit someone that works at the commercial CA, get one of their people hired at the commercial CA, or physically break into the commercial CA.

gordo July 9, 2015 9:42 AM

Links to the two Senate hearings held on July 8:

Senate Committee on the Judiciary
“Going Dark: Encryption, Technology, and the Balance Between Public Safety and Privacy”
http://www.judiciary.senate.gov/meetings/going-dark-encryption-technology-and-the-balance-between-public-safety-and-privacy
[Hearing length is 03:16:39. Due to a late start, the hearing begins at the 19:51 mark of the video. Page includes PDFs of the chairman’s, ranking member’s, and witnesses’ statements]

Senate Select Committee on Intelligence
Counterterrorism, Counterintelligence, and the Challenges of “Going Dark”
http://www.intelligence.senate.gov/hearings/counterterrorism-counterintelligence-and-challenges-going-dark
[Hearing length is 01:49:15. This hearing started on time, and begins around the 00:50 mark of the video.]

herman July 9, 2015 9:53 AM

There is no risk of the network going ‘dark’. They just want everyone to think that the common encrypted systems are better than they really are, so that is why they have to keep moaning and bitching and asking for backdoors.

gordo July 9, 2015 10:23 AM

The approach presented at the Senate hearings by DOJ and FBI seems to be that business entities, however they might go about it, would be responsible for maintaining the means of access to their customer’s, and user’s data, and data/devices.

RSaunders July 9, 2015 10:30 AM

Comey’s best quote:

“Silicon Valley is full of folks who, when they stood in their garage years ago, were told, ‘Your dreams are too hard to achieve.’ … Thank goodness they didn’t listen. Maybe this is too hard, but given the stakes … we gotta give it a shot. And I don’t think it has been given an honest hard look.”

That sounds to me like someone who’s intentionally ignoring all of computer science from 1997 to today. Enormous effort has been applied to the issues of trust, software correctness, and cryptography in the last 20 years. Alas, the outcome is a well-founded understanding that what he wants isn’t feasible.

Looking harder for scientific evidence for Creationism isn’t how science works. Science drops ideas that don’t match observations.

Deimos July 9, 2015 10:53 AM

My hope is that this round of the crypto wars helps in the development and adoption of ubiquitous strong cryptography. The first step is awareness. The second step is the development and availability of easily deployable tools (have you ever tried to use pgp email on an iPhone?). I wonder if the third step is changing the attitude of government, to get the enormous resources of the likes of the NSA devoted to strengthening rather than weakening crypto systems. The only thing that might accomplish this third step is a really strong backlash against government surveillance, and this round of the crypto wars, following on the revelations of Edward Snowden and others, seems like yet another opportunity to take the offensive in this regard.

tin foil hat July 9, 2015 10:57 AM

Why does FBI Director Comey advocate creating a backdoor into American citizens’ data for Russia, China, and other hostile foreign intelligence services?

Based on his position, taking into account the FBI’s counterintelligence mission, I assess some significant probability that he is now working for a foreign power.

He can’t be unaware that mandating backdoors in the products used by Americans will necessarily create entry points for foreign governments. So, why would he advocate those entry points? Why?

d33t July 9, 2015 11:35 AM

Maybe it’s about time we elected officials (worldwide) who actually understand today’s technology and the problems / possibilities that now exist? Attempting to ban encryption or continue to back door everything will ensure that criminals (wow such a broad term now) run free with everyone’s personal information. They already do, but it will be much worse.

ac July 9, 2015 11:40 AM

Is there any chance that this is a ruse? If the FBI proclaims loudly and frequently that they are completely powerless against off-the-shelf encryption products, might that just be to draw attention away from their other capabilities (keyloggers, custom malware, etc) which could get at data without defeating encryption?

I don’t know. Sometimes I feel whenever someone says something in Washington, my first reaction is: They can’t possibly really believe this. Well, if so, what is the advantage for them of saying something that they know isn’t true? Kinda the lying version of “Who benefits?”

Steve July 9, 2015 12:37 PM

What if all this huffing and puffing by our “security” agencies over “unbreakable” encryption is nothing but a smokescreen to misdirect the public into using encryption which has already been broken by those selfsame “security” agencies?

What better way to gull targets for increased surveillance than to snooker them into using broken encryption by self-identifying themselves through the use of encryption.

Maybe.

Maybe not.

Who knows what’s real and what’s merely a reflection of your own paranoia once you enter the house of mirrors.

Simple July 9, 2015 12:39 PM

From the WSJ article:

“A Justice Department official … couldn’t specify how many search warrants were thwarted by encryption technology, but said it had become a growing issue for investigators in the past two years. Recently released Justice Department statistics show that of the 25 state and federal wiretaps that ran into encryption last year, officials could decipher just four.”

So get a warrant for the encryption keys.

Sergiy July 9, 2015 12:44 PM

@”Is there any chance that this is a ruse?”

Many have wondered that, but I would not give them that much credit. Any really smart deep thinkers are kept in the back. It’s about power politics and money. So maybe authorities use the fight against terrorism as a pretense and their fear is really about domestic unrest. Or maybe someone turned the machine on but no one knows how to turn it off. Or maybe lawyers shoot for the stars hoping some judge will give them the moon.

tin foil hat July 9, 2015 12:47 PM

@Steve

Who knows what’s real and what’s merely a reflection of your own paranoia once you enter the house of mirrors.

The adversary is real. Even paranoids have enemies.

grayslady July 9, 2015 12:47 PM

@ gordo: Thanks for the links.

Having just watched Citizenfour recently, I was struck by part of the title of the subcommittee hearing, “going dark”, and the comment Snowden made at one point in the movie. To paraphrase, Snowden said that the NSA is constantly fearmongering about “going dark” when in fact their capabilities were magnifying astronomically.

Rolf Weber July 9, 2015 12:55 PM

I agree with almost anything in the paper, but cannot comprehend why a manufacturer KEK is such a bad idea. I mean, this was basically what Apple did before.
It has its advantages, like when a user forgets his password, or when he dies. Law enforcement with a valid warrant had than access too, of course.

I agree that it is likely a bad idea when manufacturers are forced to do it. But I don’t see serious security risks.

Daniel July 9, 2015 1:00 PM

I’m beginning to sense that behind the curtains there is a growing disconnect–if not acrimony–between the military-industrial complex (MIC) in the USA and the criminal justice system (CSJ) in the USA. This because the MIC is increasingly concerned about the impact weak security (including weak encryption) has on cyber warfare and international relations in general. However, while strong encryption makes sense internationally it creates real problems domestically in terms of crime fighting. I suspect that this is ultimately due to a dichotomy in funding. If there is a credible terrorist threat the NSA has the resources to work around strong encryption. But if the problem is more mundane such as child pornography or money laundering then the FBI doesn’t have the resources to deal with strong encryption. It’s a case where what is good for the NSA/CIA and what is good for the DEA/FBI are not the same thing.

Steve July 9, 2015 1:16 PM

@tin foil hat

The adversary is real. Even paranoids have enemies.

And sometimes paranoia is just paranoia.

Peter Franusic July 9, 2015 1:26 PM

Susan Landau provides an amusing characterization of the brain damage that the FBI calls “exceptional access”: “The idea of exceptional access looks more like magical thinking than a realistic solution to a complex technical problem.”

Comey needs to be fired and replaced by someone who can lead the FBI to fulfill its mission without their jackboots trampling on our Constitution. The days of analog telephony and simple wiretaps are gone. The FBI needs to innovate new methods of investigation in this new Digital Era. We need a leader who gets it.

gordo July 9, 2015 1:38 PM

The below quote is FBI Director Comey responding to a question from Senator Heinrich at the Intelligence Committee hearing of July 8th.[1]

Re: “exceptional access” [2]

Thank you, Senator. I agree very much, which is why I’m so excited about this opportunity, because I think things like this hearing will drive this conversation because we need to do it together. They are the source of the innovation and the expertise. We need their help in solving this.

I’d never heard, until I read, I read, the executive summary, and I went through that paper, pretty quickly, the rest of it. I’d never heard the term “exceptional access.” My reaction, already, is, I don’t want exceptional access. I want ordinary access. Where a judge issues an order, and folks are able to comply with the order that a judge issues.

There are providers who, because of their business model, encrypt, as I understand it, strongly encrypt, the communications in motion, but they’re visible to them and on their servers that they control. It’s part of their business model because they want to be able to sell you ads, and so they need to be able to see the content. For those providers, some of whom are huge providers, we are able to serve a judge’s order, and get the content in a counter-terrorism case, or an espionage case, or a serious criminal case, of communications that the judge has authorized us to do. I don’t think that those folks think that their system is materially vulnerable.

So I wonder (again, folks should not be looking to me for technical advice), I wonder whether that isn’t an example that we should use in our conversations with the companies, but every company’s going to be different, which is why I don’t think one-size-fits-all, because some of the companies at issue, that the terrorists use, are three guys in a garage who started this end-to-end encrypted app. Our ability to work with them may be very different from some bigger company.

I don’t think that we want to be seen as “We’re going to impose this fix on all of you.” We want to talk about how we can solve this. I don’t want to demonize the company’s either. They love their country. They care about public safety. I know that from private conversations. It’s about we care about these two things. How do we maximize both of these? Maybe it’s impossible, or maybe the scientists are right. I’m not ready to give up on that yet.

[1] Video markers: Sen. Heinrich [32:36 – 36:37]; FBI Dir. Comey [36:38 – 38:32]
http://www.intelligence.senate.gov/hearings/counterterrorism-counterintelligence-and-challenges-going-dark

[2] Term used in the Keys Under Doormats paper. The phrase “exceptional access” occurs a total of 72 times in the document. It’s first used in the phrase “exceptional access mechanisms” in the abstract. That particular phrase also occurs in the conclusion, and once in the body of the document.

David C July 9, 2015 2:02 PM

I doubt the “bad guys” that the FBI wants to go after are going to use the broken encryption. They will go find the good encryption and use it. Only law abiding folks are going to be using the broken stuff. It’s like if a criminal wants a gun to rob a bank, he will steal one or buy one on the black market outside of gov regulation. This idea that if google, apple, or anyone makes a encryption that is openable by the government, is going to be used by the crooks is nonsense.

Clive Robinson July 9, 2015 2:10 PM

@ RSaunders,

That sounds to me like someone who’s intentionally ignoring all of computer science from 1997 to today.

Yes and all the science tells us what he wants is not possible.

So what’s his agenda?

Firstly I don’t think it’s FUD for all the d1p 5h1ts up on the hill, even though most would fail to pass their high school exams these days.

Secondly I don’t think it’s aimed at turning US companies into the worlds “don’t step in it” mess.

I actually suspect it’s an attack on “Open Source” code.

I recon he’s shooting well over target on the bet he will meet with stiff resistance to beat him down. That what we will end up with is the old “Govt licence for crypto products” yet again, but worse than Wassner Agreement due to TTP etc) With the entry price set very high and only “closed source” making the grade with any TTP or any US Trade Treaty country getting hammered six ways to Xmas via the ISDR rules if they refuse to alow US backdoored code into their market place.

I guess we will have to wait and see but I recone they are going to “stuff the genie back in the bottle” one way or another or give Pandora “Liberty” skull fractures by smashing the box lid on her head till the box is closed.

Clive Robinson July 9, 2015 2:32 PM

@ David C,

I doubt the “bad guys” that the FBI wants to go after are going to use the broken encryption. They will go find the good encryption and use it.

Wrong and right.

They will use the bad stuff to avoid standing out. However they will use the good stuff and hide it in the junk they send via the bad stuff.

Thus they will not stand out, and will know when the FBI et al has looked without a warrent.

Nick P July 9, 2015 2:38 PM

I proposed a while back developing a high assurance lawful intercept system just in case we were ever forced to use one. I was (and still am) convinced we can beat most of the technical risks and many of the operational risks for key-holders. It’s the others, esp government foul play, that prevent me from endorsing something like that while decentralized options are available.

Anyway, I have another idea for one that’s more interesting. The FBI and their pals keep telling everyone we can do this without tremendous cost and risk. I suggest we have DARPA or someone fund a demonstrator that the FBI is asked to use and would benefit them. Develop it with domain experts on each aspect with EAL6+ development process for hardware, firmware, and software. Diversified hardware fabbed at different places with distributed voting algorithms. TEMPEST cage and power supplies from yet another party. Easy to use client library (i.e. NaCl modified) that both does solid encryption and properly does the escrow part. Everything we can. Also, what it costs to do all this at every level in ways that might be applied to COTS estimates.

Then, we let government use it for police and intelligence work. We can build such a site in each country that’s pro-backdoors. Let it operate for several years to see what happens. Mandatory disclosure of how any breaches happened and high rewards for those who achieved them with immunity granted. The resulting simulation will show us what we’re actually capable of. More likely, though, it will have more value when FBI and company refuse to use it despite it being certified secure by NSA. We then say: “they’re telling us regular companies can do this safely and affordably but they won’t trust the best, premium escrow option government funding ever built? Do they think escrow is secure and feasible? Or have they just been lying to everyone about the risks and burdens?”

And then we drop the QED on them. And repeatedly reference the contradiction in all future debates.

Gerard van Vooren July 9, 2015 2:46 PM

@ Daniel

“If there is a credible terrorist threat the NSA has the resources to work around strong encryption. But if the problem is more mundane such as child pornography or money laundering then the FBI doesn’t have the resources to deal with strong encryption. It’s a case where what is good for the NSA/CIA and what is good for the DEA/FBI are not the same thing.”

On tweakers.net, a Dutch tech site, the paper is also being discussed. A guy named WhatsappHack is arguing “your” comment in such a way that I consider it being worth translating it to English.

“Yes the famous “but think of the children!” argument, where you can not go against it without being seen as heartless person who does not think of innocent little children, hehe. That fits in the list “child” and “terrorism” at home to justify mass surveillance and monitoring at all.

Well, I’ll be heartless:
These are very marginal “what if” situations. In self run away that is anyway not weighted against further problems caused. That a child runs away must therefore ensure that the overall security and protection should be undermined. Yeah, I do not think so …
Anyway, although it suffered in those situations also will be great IF there was something on that “phone” that could help, it does not outweigh the big picture where many * many * more problems arise … And therefore it’s a very bad idea to abandon encryption.

It’s not a black or white story, but it should now be made quite clear that the disadvantages of encryption could absolutely not outweigh the advantages.”

The thing is that the four horsemen argument is a thought terminating cliché. We should dare to take the next step and really discuss the problem at hand, and the problem goes way beyond the four hoursemen.

winter July 9, 2015 3:07 PM

This aligns with Cameron’s crussade against good crypto in the UK, ie, protection for the common man.

There are echo’s of that in other countries (we cannot convict pedophile X because he uses Truecrypt).

What never is said is that these capabilities are never used, never, to protect a member of the public. They can be, and are, only used to damage the lives of those surveilled. It is like the “whatever you say can and will be used against you”.

I would like to draw your attention, again, to the marvelous talk: Don’t talk to the police.
youtube.com/watch?v=6wXkI4t7nuc

This surveillance is there to make it unnecessary to talk to the police. They already can pick and chose from whatever you have said or written in any context.

winter July 9, 2015 3:15 PM

@Gerard
“The thing is that the four horsemen argument is a thought terminating cliché. We should dare to take the next step and really discuss the problem at hand, and the problem goes way beyond the four hoursemen.”

So what is the scale of the problem at hand? How many children are harmed and hom many people killed by terrorists because of strong crypto?

How many criminals are running free because if strong crypto?

They won’t tell us.

This reminds me of the “need” for cctv and ID cards: Neither has prevented serious crimes.

rgaff July 9, 2015 3:33 PM

Law enforcement always had and always will have access to decryption keys… it’s called GETTING A WARRANT… and then coming and bashing through my doors and windows and SERVING IT TO ME military style at gunpoint AND CONFISCATING MY DEVICES!!!

This has always been there. This always will be there. (though it has not always been with so many huge guns and with so much destruction of property!)

So it’s not going away. They’re just whining because they’re dissatisfied with the way things have always been, they want MORE… they don’t want to bother with slightly inconvenient warrants under the constitution, they want access to whatever they want, whenever they want. They are crybabies, throwing temper tantrums. They don’t want any constitution, they want to be little hitlers, with unlimited power, and they want it now.

Anon July 9, 2015 3:40 PM

We’re only at 34 posts and we’ve already reached the point of Godwin’s Law.

Gerard van Vooren July 9, 2015 3:47 PM

@ winter

“So what is the scale of the problem at hand? How many children are harmed and hom many people killed by terrorists because of strong crypto?”

The problem at hand is a global insecure IT architecture because of the lack of strong crypto.

“How many criminals are running free because if strong crypto? They won’t tell us.”

Even if they did, can we trust the numbers they are saying? We need proof so that we can verify what they say. I don’t trust spokesmen.

And when it comes to numbers, when the NSA budget would be spend in health care research many more people would be saved. It would be boring and politically not “hot”, but it really would. The number of people being killed by terrorism is roughly the same as being killed by lightning, but the people who die because of cancer and heart diseases is significant.

James July 9, 2015 3:55 PM

@rgaff

No, a warrant can’t constitutionally compel you to incriminate yourself.

It’s not only privacy which is at stake but equally the privilege against self incrimination.

Most debates about encryption touch on privacy, but the presumption of innocence and the right not to incriminate oneself are no less important.

One positive bit in the hearing at the Judiciary Committee was a consensus that the Fifth Amendment generally forbid compelled key disclosure.

Now the privilege does not exist in the UK but even there, you can only be punished for failing to disclose your encryption key, if the police can prove beyond a reasonable doubt that you already have it.

I hope that the next evolution is development of encryption systems which allow people always to decrypt a decoy filesystem in case they are compelled by the police, criminals or terrorists to decrypt their data.

gordo July 9, 2015 4:02 PM

The End of Encryption? NSA & FBI Seek New Backdoors Against Advice from Leading Security Experts

Juan González and Amy Goodman | Democracy Now! | July 8, 2015 | 14 min.

http://democracynow.org – FBI Director James Comey is set to testify against encryption before the Senate Intelligence Committee today, as the United States and Britain push for “exceptional access” to encrypted communications. Encryption refers to the scrambling of communications so they cannot be read without the correct key or password. The FBI and GCHQ have said they need access to encrypted communications to track criminals and terrorists. Fourteen of the world’s pre-eminent cryptographers, computer scientists and security specialists have issued a paper arguing there is no way to allow the government such access without endangering all confidential data, as well as the broader communications infrastructure. We speak with one of the authors of the paper, leading security technologist Bruce Schneier.

http://www.democracynow.org/2015/7/8/the_end_of_encryption_nsa_fbi

Nick P July 9, 2015 4:21 PM

@ Anon

Lol excellent observation. It might be useful if we carefully think out a new, appropriate analogy for these things. I’ve been using Hoover given he blackmailed his way into great power and control of government using domestic surveillance. Matches specific risk of current situation while putting NSA’s powers into perspective. Far as the other stuff, might need new comparisons.

@ Gerard

“And when it comes to numbers, when the NSA budget would be spend in health care research many more people would be saved. It would be boring and politically not “hot”, but it really would. ”

Letting Congress know they’d rather kill voters or dumb down children than cut back on useless surveillance might have a bit more dramatic impact. Especially if the public saw that.

“‘Think about the children,’ the FBI director says as he request money spent saving their lives to be spent on his organization instead. That’s despite our infant mortality rate being way too high. The FBI director seems more interested in killing children for a higher budget than protecting them by more wisely spending what he already has,” said Nick P in front of Congress and FBI Director.

tyr July 9, 2015 4:35 PM

What I find hilarious is the assumption the garage based
genius gangs are going to repeal the inherent mathematics
to make it easier for the donut stuffers to do less work.

If you view technology as magic then this seems neatly
feasible. The truth is law enforcement already has so
many toys and so much better methods than they had in
the old dayes it is cruel of them to cry about how it
is harder to do the job. How many more people do they
need in jail ? Shouldn’t that mean there are less of
the bad guys loose on the streets ? I realize that
getting off your big fat ass and going out to do some
real police work is hard on the clowns who sit around
impersonating teen aged girls and swapping girly pics.

I remember when the only people who had crypto were
nation state militaries but police were able to do
their job without whining about doing it. The sudden
idea that a total surveillance wiretap on everyone is
needed because it became possible is horseshit of the
worst kind. I was also around when the garage geniuses
were cobbling together the junk they foisted off on a
gullible public by a bunch of swindlers who were less
than trustworthy to say the least. Not eveyone was that
bad but the ones who got rich were not safe to have around
your chickens by themselves.

What is extremely telling is that the IC and LEO bands
have made these outrageous claims without a scrap of
evidence for them. One look at the New York skyline is
enough to refute most of them.

I also see the OPM breach has grown another level of
involvement. I’d like to see the government secure the
data they are trusted with first, then we can talk
about wrecking the security of the rest.

Nick P July 9, 2015 4:37 PM

@ Bruce Schneier

re Snowden’s return

“I don’t know how many people are going to have to retire before [he can come back]”

Lmao that was awesome you just dropping that line in.

gordo July 9, 2015 4:44 PM

@ grayslady,

Prof. Swire speaks to the “going dark” claim in his prepared testimony:

Second, it is more accurate to say that we are in a “Golden Age of Surveillance” than for law enforcement to assert that it is “Going Dark.” In previous writings, I have agreed that there are indeed specific ways that law enforcement and national security agencies lose specific previous capabilities due to changing encryption technology. These specific losses, however, are more than offset by massive gains, including: (1) location information; (2) information about contacts and confederates; and (3) an array of new databases that create digital dossiers about individuals’ lives. … (p. 2)

http://www.judiciary.senate.gov/imo/media/doc/07-08-15%20Swire%20Testimony.pdf

Clive Robinson July 9, 2015 6:03 PM

@ Nick P,

I proposed a while back developing a high assurance lawful ntercept system just in case we were ever forced to use one. I was (and still am) convinced we can beat most of the technical risks and many of the operational risks for key-holders.

The problem is that even if your system works the one operational risk you must defend against the FBI et al would not stand for in any way shape or form.

In times past, getting access to letters in the mail was difficult not just for technical reasons but because the law required a fairly high standard of suspicion to convince a senior judge or minister of state, as well as the issue being one of national security in the older more traditional sense. The same applied to telegrams / cables and later telex network. Likewise the standard required to intercept even dialed numbers was very high, and recording of voice exceptional. The invention of “comercial codes” and “code cards” in the Victorian era was not to do with secrecy or privacy, because that was more or less guaranteed, it was in effect for “compression” to save per letter costs.

Similarly to get at past written communications required access to private papers, and this again required that a high standard of suspicion be put before a senior judge. And access would be forbiden if the papers were of legal, religious or medical corespondance. Further corespondence to and from journalists was likewise heavily restricted.

Thus if the police and other domestic legal authorities wanted access you would be notified of this by the presentation of a warrant to gain access. Further having received the warrant you could then challenge it’s issue and obtain copies of the suspicions put befor the judge and if false or groundless chalenge them and have the officer presenting such falsehoods punished up to and including a life tarrif, fines and civil compensation. Thus the law enforcment officers were quite carefull about what they submitted.

These days law enforcment wants to access your communications without your knowledge, the number of warrents requested far exceeds the capacity of the limited number of senior judges so now the warrents can be issued by magistrates, who may not have a sound apreciation of what reasonable let alone higher standards of suspicion are, and thus can be easily “gulled” into issuing warrants that not only should not be, but also having the application sealed, so you cannot access it to challenge it. But worse the definition of “communication” has little or no meaning any longer. For instance a voice mail on the land line phone in your home, once would have been treated the same way as a deliverd letter, ie beyond access without a presented warrant as it “had” reached it’s destination and was nolonger being communicated. Now it’s fair game on the pretence that, as you are assumed not to have heard it –even if you have– it’s thus still “in communication” and accessible without a warrent being presented to you.

Thus your private “papers” are laid bare for considerable periods of time without you becoming aware. This was in the fairly recent past not legaly possible, now it is. But with computers the pretense of argument about “might not have heard” is now a fiction of not just “might not have read” but also of “in plain sight”, thus no warrent need be presented, and in some cases even obtained.

It is this “unfettered access” at “any time or place”, “without the persons knowledge” that the likes of the FBI will not give up. Not willingly, infact not at all, and it’s loosing this that the “going dark” arguments are realy about. Which is the problem with one of the systems you outlined in the past, because it required a warrant “be presented” and thus you would be “warned off” by it and thus communicate this knowledge directly or indirectly to another party.

The problem is that any system that does not require the presentation of a warrant is neither secure or free of abuse. And even if a warrant is presented the door once opened, stays open into the past long long prior to the warrants issue and potentialy long long into the future, which is another thing the law enforcment agencies want. Then there is the issues of misrepresentation, we know that law enforcment take phones etc from people in custody and take data from them for various reasons without the owners consent, as well as using the phone to send fake messages. Designing a system to allow some “lawfull access and use” but not “unlawfull access and use” is an imposibility.

I could go on, but you get the general argument, which is “They will not allow systems that prevent their unfettered, unlawful access and use at any time” because no matter what it is they do, legal or otherwise they will come up with some “for the greater good” or “think of the children” argument to be able to do it. Then just as we have seen with the NSA they will find a smart lawyer to make any legal box you put them in not apply, thus they can not alow any technical constraints preventing that.

To think otherwise is going against the reality of past behaviours.

SteveMB July 9, 2015 6:13 PM

@Rolf Weber — The reasons for tech corps abandoning the old system that allowed them to keep a key copy are twofold. First, in the wake of the Snowden revelations, they need to maintain a “Caesar’s wife must be above suspicion” posture vis-a-vis getting caught in bed with the Feds. Second, the stakes keep rising with the addition of new applications (e.g. payment apps like Apple Pay), and now it’s at a tipping point where the only way to keep the cyberattacks down to a manageable level is to limit the value of the prizes waiting for a successful intruder.

MrC July 9, 2015 6:20 PM

@ rgaff:

You are incorrect about warrants, at least as applies to the US. The Fifth Amendment privilege against self-incrimination generally protects against being compelled to disclose a password. The only exception sometimes applicable is the “foregone conclusion” doctrine, which requires disclosure if the prosecutor can show the judge independent proof of what’s sitting behind the encryption (e.g., a wiretap of you saying the records of your real estate fraud scheme are encrypted on your laptop; a cop’s testimony about the kiddie porn he saw on the computer when he arrested you, before it was powered off)**, so that it’s no longer a question of “what’s there?” but rather “we know what’s there, hand it over.”

Importantly, note that the above only applies to “contents of the mind” such as passwords. Other authentication methods, such as physical tokens or biometrics, do not have Fifth Amendment protection. (Well, the smart-alec who used his male member for the iphone’s biometric lock might have some kind of constitutional privacy argument against compelled disclosure, but I wouldn’t count on it…)

** both examples from real cases

MrC July 9, 2015 6:32 PM

(apologies for the double post)

Also, one always has the option of refusing to comply with a warrant or court order. Sure, you’ll be held in contempt and jailed, but it may be worth it if (1) the criminal case would likely collapse without the encrypted data, (2) you’d likely lose the criminal case otherwise, and (3) the likely jail time for conviction significantly exceeds the likely jail time for contempt.

Anon July 9, 2015 6:43 PM

@MrC

That’s not quite how contempt works. Contempt charges can have indefinite sentences under the theory that the defendant has the ability to end the sentence at any time simply by choosing to comply with a court order.

Sergiy July 9, 2015 7:16 PM

This will not be solved on the Hill, or even in the Supreme Court. It will be done behind closed doors and will be camouflaged to look like something else. Int’l courts and tribunals will be in charge, to which the US is party, and if you want to do business then you’ll have to do along with it. And the “it” will be a mechanism in which you essentially deposit keys thereby receiving a kind of token to communicate with, like a certificate, and without it you won’t communicate. On top of it, there will soon be another technical advancement that totally screws up everything, for everyone. No, not another algorithm, not another cipher or key exchange thingy. Bigger. All these arguments will be rendered mute.

Anon July 9, 2015 7:17 PM

@gordo

I think Bruce undercuts a lot of his own case in that democracy now link. First, what are LEOs likely to be most worried about: they get a warrant to search a known drug house and in the process they seize computers and smart phones whose files they want to read. Hacking might not work that great because you don’t what burners to hack until after they’ve been seized.

According to Bruce,

It’s an interesting question, because while encryption is a very powerful too and very strong, computer security is very weak. We, as scientists, don’t know how to build secure computers. So I can protect the encryption of your phone, but I can’t stop someone from hacking into it..

Assuming that’s true, a highly sophisticated foreign government might be able to accomplish all the industrial espionage they want against the US via hacking. CEOs of fortune 500 companies, unlike drug lords, probably don’t switch to a new burner every few days. The new damage something could do in terms of industrial espionage should be compared to what we have now, not some hypothetical universe where IP theft isn’t already taking place on a massive scale.

MrC July 9, 2015 8:05 PM

@Anon:

It’s not that straightforward. Contempt comes in civil and criminal varieties. Civil contempt penalties may be remedial — e.g., make you pay the legal fees of a party hurt by your conduct — or coercive — which is what you’re thinking about. While the traditional rule was that indefinite imprisonment as a coercive penalty was possible in theory, there’s two major caveats to that: First, the federal government and some states have put a limit on such imprisonment. For instance, in cases of refusing to testify or provide information or evidence, 28 USC $ 1826 puts an 18-month limit on jail time for civil contempt of federal courts. Second, the judge has to let you out once it’s apparent that jail time has lost its coercive effect and more of it won’t work. As a practical matter, they’ll usually let you out after a year or two and move on the charging you with criminal contempt, which has fixed predefined sentences like any other crime.

If one were really concerned about it, you could probably devise some sort of electronic deadman’s switch that would render the password invalid at some point, thus making compliance impossible, thus ending the coercive effect, thus ending your imprisonment.

Or you could just avoid generating independent evidence of what’s encrypted behind your password in the first place, so that a valid warrant/order never issues.

Or you could, you know, not commit crimes…

Dirk Praet July 9, 2015 8:26 PM

@ Nick P

I was (and still am) convinced we can beat most of the technical risks and many of the operational risks for key-holders.

Most and many will not do. Everything has to be covered, and although I get your (professional) interest in building such an architecture, I’m far from sure to which extent it is really possible, and – if so – at what price and complexity.

But in the end, that’s not the biggest problem. The main issue here is whether or not we want our governments to have such dangerous capabilities that in the wrong hands spell way more doom than they can possibly solve. When Madison introduced the 4th Amendment in Congress, it was not meant as a grant of privilege to the people but as a deliberate constraint on government power. What types like Comey and Cameron are proposing has nothing to do with catching terrorists or pedophiles, it’s about taking and preserving control through legalised mass and targeted surveillance strong encryption is one of the few defenses the people have against.

As you know, Thomas Jefferson himself invented the wheel cipher. As an American minister in France, he understood only too well the importance of strong message encryption and would probably have kicked Comey’s stupid butt over this preposterous proposition.

As a society, the US has embraced cars and guns because of the good they can do. With that, they have also accepted the tens of thousands of deaths caused by them each year (in the US alone). These figures are nowhere near those for people falling victim to terrorists or pedophiles, let alone those that could have been prevented by exceptional access to the encrypted communications of the perpetrators. How many lives are saved from oppressive governments by strong encryption each year seems to be an element often conveniently forgotten in the debate.

Everybody can do the math for himself, and even from only this angle the entire debate is completely irrational and nothing more than a smokescreen for something totally different. To put it simply: it’s not about the FBI “going dark”, but about Comey “going dork”.

@ James

I hope that the next evolution is development of encryption systems which allow people always to decrypt a decoy filesystem in case they are compelled by the police, criminals or terrorists to decrypt their data.

You’ve never worked with Truecrypt ?

One positive bit in the hearing at the Judiciary Committee was a consensus that the Fifth Amendment generally forbid compelled key disclosure.

Which is exactly why they want the exceptional access at vendor/ISP level. Then again, there’s always the extrajudicial 5$ wrench.

@ Dave Oldcorn

Ross Anderson’s comment on whichever idiot who sold this dog of an idea to the PM needing a robust rollocking seemed particularly prescient to me.

s/rollocking/bollocking . It’s reasonable to assume Home Secretary Theresa May put it him up to it. She’s not a very smart person.

@ RSaunders

Maybe this is too hard, but given the stakes … we gotta give it a shot. And I don’t think it has been given an honest hard look.

Well, since the man’s is obviously not listening to what an elite group of US/UK experts are saying, perhaps he needs to broaden his horizon and ask one David Vincenzetti in Milan, Italy. From what I’ve heard, his company has an excellent reputation.

@ Rolf Weber

I agree that it is likely a bad idea when manufacturers are forced to do it. But I don’t see serious security risks.

Neither does Theresa May.

gordo July 9, 2015 8:56 PM

@ anon,

Re: drug house

Parallel construction via Stingray?

Re: industrial espionage

Endpoint security is the weakest link?

Thoth July 9, 2015 9:07 PM

@Nick P, Clive Robinson
We already have a live example from the Clipper Chip era of the US Government mandating employees to use Clipper Chip installed PCMCIA cards while the open community rejected it outright and Matt Blaze broke the Clipper Chip’s LEAF field and exposed it’s weaknesses.

What I am very curious about is whether the Government used Clipper Chip actually contains a Govt trapdoor to disable the LEAF function thus showing that the US Warhawk Govt were wary and uncomfortable about the Clipper Chip and their publicly advertised usage of Clipper Chips in a vote of confidence for escrow were actually a fake show.

Maybe someone with a good supply of what’s remaining of the Clipper Chip could supply Ross Anderson’s team to decap a couple of those and find out the truth for us.

Up till now after the days of the Clipper Chip, neither DARPA nor any cryptographer or security engineer could show that they have a working secure sample of escrow that meets some form of baseline privacy standards (because no such standards nor research existed yet).

On one hand if we could disprove from a field experiment the use of escrow encryption being effective, we could have settled the rest of the debate. But on the other hand, allowing the Warhawk Govts of the world giving them even a single centimetre of allowance would be disastrous as the saying goes giving an inch and asking for a yard stick.

I think in the safest consideration, DARPA and the war mongering contractors could come up with something while in parallel we continue to resist and have liberal use of privacy and security enhancing tools.

rgaff July 9, 2015 9:38 PM

@James

when I said “at gunpoint” I didn’t mean that the government would force me to say stuff at gunpoint… I meant, they’d force me to hand over physical objects at gunpoint… such as all my computer devices where all my ENCRYPTION KEYS are stored! If I’ve password protected them with any kind of remotely memorable password, they can crack that no problem, they don’t need to ask me that ever. It’s the much longer non-memorizable keys stored on my devices that they confiscate this way.

So… as I was saying, the government always has had and always will have access to my encryption keys when they get a warrant, come and invade my house, and get them! They just don’t want to do this. They are too lazy. They’d prefer to destroy the constitution and democracy instead. They like being little hitler dictators instead.

rgaff July 9, 2015 9:44 PM

P.S. I love Godwin’s Law, we should whip it out first post every time, in the middle of congress, right to their faces… 😛

James July 9, 2015 10:12 PM

@Dirk Praet • July 9, 2015 8:26 PM

“You’ve never worked with Truecrypt ?”

Yes I have, but Truecrypt/Veracrypt/Ciphershed will not help someone to plead the Fifth when his phone is seized by the government.

It would be easy for Apple, Microsoft and Google and other operating system vendors to implement a steganographic filesystem that would effectively raise the Fifth Amendment as a shieled in all government investigations.

A deniable filesystem baked into all cellphones and operating systems would effectively put all data at rest beyond the power of government subpoenas, except in cases where targeted Surveillance was worth the cost.

I frankly see no downsight to granting absolute deniability to all data at rest.

If the government can’t use compelled decryption without an expensive targeted investigation, we are almost there.

Thoth July 9, 2015 10:49 PM

@James
Truecrypt and it’s variant’s plausible deniability have already been shown to have cracks by our host, @Bruce Schneier.

Link: https://www.schneier.com/paper-truecrypt-dfs.pdf

Julian Assange have his Rubberhose FS called Marutukku which is not in development or maintenance anymore.

Link: https://web.archive.org/web/20110726185300/http://iq.org/~proff/rubberhose.org/

Plausible deniability works on a concept of having a basket of plausible eggs. As long as one of the eggs stands out, it will attract attention and the name of the game is to ensure neither of the eggs looks anymore interesting than the other. Metadata, file sizes, encodings and headers … frequency of access, data contents .. there are many factors that must be considered when looking into plausible deniability. If any of the factors are off, it defeats the system itself. Many of us are simply bad at making up innocent looking dummies let alone automating it.

Physical possession of the secrets can compromise it’s security as well (e.g. forensic imaging and all that sort).

In short, Plausible Deniability is not very ideal. Bruteforce cracking would have rendered it less secure and lying about or not revealing your passwords would have brought you torture anyway.

I would propose using tamper resistant hardware modules and to split the key over quorums where a sort of oblivious circuitry to form the keys (whereby the hardware holds one of the quorum keys) with self destruct mechanism.

Andrew July 10, 2015 1:14 AM

Encryption backdoor CANNOT COME ALONE. It also means real encryption ban. Also, legal hardware backdoors.

Rolf Weber July 10, 2015 3:04 AM

@Andrew

What you say is simply wrong.

For example the government could force smartphone manufacturers to always add a manufacturer-KEK to all smartphone. But this wouldn’t mean that custom ROMs without the manufacturer-KEK are outlawed.

Or they could force popular messenger providers like WhatsApp to add a backdoor to their infrastructure. But this wouldn’t mean the use of other or foreign messengers are outlawed.

Again: I’m not saying it is a good idea if governments do any of this (it is most likely a bad idea). But key escrow / backdoors are by far not the same as encryption bans.

Spaceman Spiff July 10, 2015 5:29 AM

If idiots like Comey have their way, it will be back to paper, invisible ink, and one-time pads for secure communications, or steganography with embedded messages in Facebook pictures that use a one-time pad for decryption. Hopefully Obama will get a clue and fire Comey’s ass!

Dirk Praet July 10, 2015 6:55 AM

@ James, @ Toth

I would propose using tamper resistant hardware modules and to split the key over quorums where a sort of oblivious circuitry to form the keys (whereby the hardware holds one of the quorum keys) with self destruct mechanism.

You’re right about the plausable deniability issue. Most LEA’s that are technologically up to speed by now know about the features of Truecrypt & Co. and given the right tools have a good chance of discovering hidden volumes. At which time it’s back to the 5$ wrench or a legal disclosure procedure. I guess the same applies to other current or future concealed file systems.

In this context it is interesting to note that in US case law pleading the 5th is not a magic bullet as trials have gone both ways. In addition, lying about a hidden volume can carry perjury charges.

The method you propose sounds interesting, beit that hardware quorum keys would have to be unique and the device sufficiently tamper-proof to withstand disk extraction for cloning purposes. Or initiate self-destruct at an attempt to do so.

Rene Daigneault July 10, 2015 9:06 AM

Absolutely agree with d33t. The government can’t just blindly ban this and that. The world has already been unsafe enough. Thanks for the great post and comments also!

albert July 10, 2015 10:34 AM

@tin foil hat, @Steve,
Paul Krassner said it best: “Reality Paranoia” – when you think someone’s out to get you, and they really are.”
.

fsd July 10, 2015 11:24 AM

I don’t understand what’s the problem with all this fbi ranting? You just add additional level of encryption. And that’s it. Matroska in matroska in matroska… Authentication problem is another thing.

Gerard van Vooren July 10, 2015 2:17 PM

After reading the Peter Swire’s Senate testimony one can only wonder why people such as Comey and Cameron stick to their agenda.

“Based on the top-secret briefings and the knowledge of the members, the Review Group unequivocally recommended the following: strong encryption, without backdoors, is essential to cyber security, national security, and the prevention of cyber-crime. The Review Group was aware of law enforcement and intelligence agency concerns about “going dark.” We simply found no basis for weakening cyber security due to the going dark arguments.”

(highlights are mine)

Carl July 10, 2015 4:55 PM

@ Clive Robinson, “The problem is that any system that does not require the presentation of a warrant is neither secure or free of abuse.”

You definitely hit the nails right on its head with that statement. We need oversight before, not after, the facts.

James July 11, 2015 6:27 AM

@Dirk Praet

“You’re right about the plausable deniability issue. Most LEA’s that are technologically up to speed by now know about the features of Truecrypt & Co. and
given the right tools have a good chance of discovering hidden volumes. At which time it’s back to the 5$ wrench or a legal disclosure procedure. I guess
the same applies to other current or future concealed file systems.”

No, you are wrong, it’s impossible to prove that a hidden volume exists.

A hidden volume properly used inside a file container will to a forensic investigator look like random data.

This was the issue in the 11th circuit grand jury subpoena case.

Under the foregone conclusion doctrine, guessing that a hidden volume might exist is not adequate for the government.

The existence of encrypted data must be proven by the government.

And it’s clear under Fifth Amendment caselaw that the government can’t make up for its lack of knowledge by compelling the suspect to provide the missing puzzle.

Regarding using physical or other illegal coersion to compel a suspect to cooperate, I’ll just say that such methods will get the criminal case thrown out regardless of the probative value of the discovered evidence.

Even physical and probative evidence derived from a coerced confession is always categorically excluded at trial.

“In this context it is interesting to note that in US case law pleading the 5th is not a magic bullet as trials have gone both ways.”

Only if the government can prove the existence of encrypted data, and that the individual is able to decrypt.

In all the encryption cases wherein the Fifth Amendment was not successfully pled did the suspect act in a stupid way.

So yes, the Fifth Amendment if invoked properly is a magic bullet.

” In addition, lying
about a hidden volume can carry perjury charges.”

There is civil and criminal contempt, but criminal contempt requires proof beyond a reasonable doubt.

In order for a criminal conviction in an encryption case the government must prove beyond a reasonable doubt that (1) that the hidden volume exists, and (2) that the defendant is falsely denying its existence.

@MrC

“Or you could, you know, not commit crimes…”

Sorry, but whether you are a criminal is not germane to pleading the Fifth Amendment.

Even the innocent may plead the Fifth when compelled to produce incriminating evidence.

Please don’t imply that there is any connection between the need to plead the Fifth and being a criminal.

Dirk Praet July 11, 2015 8:49 AM

@ James

No, you are wrong, it’s impossible to prove that a hidden volume exists.

I wouldn’t bet on it. TCHunt is a freeware Windows utility to check for hidden volumes. ADS Examiner does an excellent job analyzing hidden NTFS Alternate Data Streams. I have come across similar tools on other platforms. For a judge, “guessing” is one thing, but “reasonable suspicion” as articulated by an expert witness is an entirely different qualification.

Even physical and probative evidence derived from a coerced confession is always categorically excluded at trial.

In general, yes, but the moment someone pulls the “terrorism” or “national security” card, all bets are off. As in Gitmo or secret detention facilities.

Only if the government can prove the existence of encrypted data, and that the individual is able to decrypt.

Not necessarily. In the case of Jeffrey Feldman, U.S. Magistrate William Callahan Jr. initially denied the FBI’s request to have him decrypt his hard drives. He later reversed his decision when the feds succeeded in decrypting one of the drives that was shown to contain kiddie pr0n, at which moment Feldman lost the protection under the 5th of the other encrypted hard drives.

Until such a case makes it to SCOTUS, I suspect we will continue to see conflicting court decisions.

MrC July 11, 2015 11:53 AM

@ Dirk:

You misunderstand the Feldman case. James has it correct. It was known all along that the drives contained encrypted kiddie porn. The issue was whether Feldman had access to it, thus implying that it was his kiddie porn. Surprisingly, the judge initially ruled that the government hadn’t proved access from the facts that he owned the computer, lived alone, and the only user account had his name. When the feds decrypted one drive and found his personal financial records there (along with kiddie porn), that was the final nail in the coffin for the access issue — the government had proved that he had access independently. And, to repeat, they had previously proved independently that drives contained kiddie porn. At that point, the “foregone conclusion” doctrine kicks in.

By contrast, “there may be a hidden container here,” “there’s probably a hidden container here,” “there’s definitely a hidden container here, but we don’t know what’s in it,” and “there’s definitely a hidden container here, and we do know what’s in it, but we don’t know if the defendant could access/control it” all fail to overcome the Fifth Amendment. A positive result from TCHunt or whatever would be useless without independent proof of what’s inside and that the defendant had access/control.

As for perjury: Perjury requires a lie. A naked invocation of the Fifth Amendment is merely a statement of “I’m not talking to you.” In order to get to perjury, the defendant would have to say, “no, there isn’t any kiddie porn on those drives.” Then you’ve got perjury because the defendant lied.

@James:
I’m well aware that innocent people have privacy rights, including the right not to cooperate with a criminal investigation. I never meant to imply they didn’t. I’m also aware that prosecutors sometimes engage in witch hunts (see, e.g., Aleynikov, Sergey) and that one might wish to refuse decryption of non-incriminating encrypted material in order to avoid a wrongful conviction. I was discussing a particular hypothetical in which you had committed a crime, were faced with an order to decrypt (having lost on the “foregone conclusion” issue and exhausted appeals), and then had to weigh the consequences of decrypting (certain conviction) versus the consequences of refusing (possible conviction plus certain contempt). I apologize that I did not make it clear that my comment was confined to the hypothetical.

With respect to contempt: You can have civil contempt in a criminal case, or vice versa, and the same act can be both civil and criminal contempt. The most likely outcome in a case like my hypothetical (and assuming federal court) would be that first you’d be held in civil contempt and coercively jailed until you gave up the password. If you held out 18 months, you’d be released, then tried and convicted for criminal contempt, and sentenced to a fixed term under the analogous sentencing guidelines for obstruction.

Skeptical July 11, 2015 5:15 PM

As one would expect given the authors, the paper is excellent. And the questions it raises are certainly fair, and should certainly be asked before any proposal is implemented.

One reservation I have is that the paper seems to assume (and, as the paper itself notes, no specific proposals have been put forth publicly, rendering evaluation difficult) that any proposal would be targeted at resolving a very broad array of challenges of posed by strong encryption. That is, in the paper’s framing of the issues and the critique it raises, there is an underlying assumption that any proposal would seek to solve e.g. every possible use of end-to-end encryption in real-time messaging services, rather than simply seeking to ameliorate a portion of them and acknowledging that some cases will be beyond the reach of a realistic proposal to resolve. As I recall, the proposals discussed in the United States during the 1990s sought to be, at best, partial solutions, not complete solutions.

Another reservation, connected with the first, is what I view as the potential exaggeration of the risk posed by enabling government to, with appropriate legal authority, decipher a large number of otherwise resistant enciphered communications. The challenge of limiting the damage that might be caused by the compromise or failure of any single component of the n components that might comprise the government’s technical capability can be addressed in a number of ways. These ways might require a tradeoff with desired features – a safer system might require greater delay, for example, but the paper seems to assume that the desired features are themselves brittle and precise as requirements and do not admit of any flexibility.

Perhaps the perception that any proposal will be all-or-nothing in its attempt to address the real challenges posed by widely available strong encryption has been unintentionally driven by the rhetoric from officials that receives the most attention in the press. But this strikes me as a misperception aided and abetted by the effect that public attention seems to have on the brains and mouths of almost every official in government.

Apart from those reservations, I found it vaguely disappointing that the authors themselves did not suggest a way forward. I’m unsure if this is because the authors disbelieve that there are any appreciable tradeoffs posed by ubiquitous strong encryption and that therefore there is no problem to be addressed, or if this is because such an endeavor would be a far more extensive undertaking, requiring the involvement of more people and a dedication of time and energy beyond that which any of them could feasibly commit to it.

My personal view is that a good legislative proposal would allow for flexibility in actual implementation and use, along with mandated, continuous testing, strong auditing and oversight, and the involvement of independent agencies and experts in all phases. Indeed, a legislative proposal might even be phased – providing at first the funding for the design and testing of various frameworks and perhaps a small-scale rollout, and then requiring another legislative proposal to be introduced, based on the experiences and results derived from that first phase, to proceed further.

Such a course is politically demanding, but offers the best chance for achieving either an appropriate system or for concluding that, at present, none is feasible.

One last point, which perhaps I should have led with.

Addressing the challenges of strong encryption in a politically accountable and publicly discussed manner is tightly connected with the achievement of better information security public policy and practices.

That is, without addressing those challenges in such a way, law enforcement and intelligence must continue to resort to finding accidental backdoors, surreptitiously inserting hidden backdoors, or other means of gaining knowledge of sought-for information. Without addressing those challenges, there must continue to be a balancing of the equities in which – relative to the state of affairs with those challenges were addressed – there is greater likelihood of a finding against disclosure of a vulnerability.

Put differently, without the achievement of a policy that better addresses the tradeoffs of strong encryption, there is greater likelihood of a case arising in which government’s sole access to important information rests upon the existence of a hidden, difficult to exploit, but widely distributed, vulnerability; and in such a case, the government will need to balance the benefits and costs of disclosing the vulnerability, and will likely decide in favor of non-disclosure (for some period of time). With the achievement of a public solution, however, the probability of such a case is relatively diminished, which is an advantage that must be included in any assessment of a proposal.

I’d urge those who find revolting the prospect of even thinking about ways to address these tradeoffs to consider carefully the likely alternatives to not addressing them (and not simply their most desired alternatives, as those may not be feasible). Those whose perspective on information security holds the US Government to be a prime adversary – some of which group have gone so far as to wish for the stolen OPM data to be publicly posted, which I’d hope was simply the author speaking impulsively rather than a statement revealing of that author’s actual worldview and values – are unlikely to be able to assess with any realism the actual alternatives before us.

A system in which law enforcement and intelligence agencies actually find themselves in the dark – which, to be clear, is an actual goal of some of the more radical, and quite vocal, persons engaged in discussions about surveillance (Wikileaks and some of its most visible associates, for example) – is not going to be a politically acceptable system. And a system in which law enforcement and intelligence agencies must gain physical access to one of the endpoint devices involved in a communication is one in which they will be quite substantially in the dark.

To continue along our present path will simply result in a world of greater interconnectivity and a world of progressively more uncertain security. At a certain point, this too will become politically unacceptable, though unfortunately I think the toll exacted for us to reach that point will be quite high.

So proposals for change must take into account the legitimate value which most of us place on the activities of law enforcement and intelligence agencies. This doesn’t mean that the most radical proposals from any side need be adopted. In fact the best proposal is likely to be one in which the most radical from each side will find infuriating, and one which even more moderate law enforcement and intelligence officials, along with civil liberties groups, will find troublesome or suboptimal.

There’s going to be a requirement, imposed by reality, that some significant number of the entities involved – private industry, law enforcement and intelligence, civil libertarians and others – be able to understand the perspective of the others involved, discuss the issues in good faith, and arrive at a compromise.

If that requirement is not met, then we will earn a system that is the haphazard result of oligarchical economic forces and reactionary politics. And though I am an optimist, I view that path with great misgiving. We are not God’s special snowflakes, good intentions do not guarantee good results, and the most frequent result of evolution (99% of the time) is extinction. I have neither trust in the complete centralization of power nor faith in the magic of markets to inevitably arrive at optimal states. I believe in the uninspiring wisdom of pragmatic compromise, and I am disturbed by what seems to be our increasing inability to do so. Even revolutions, to be successful, must also be compromises.

Dirk Praet July 11, 2015 5:16 PM

@ MrC, @ James

You misunderstand the Feldman case. James has it correct. It was known all along that the drives contained encrypted kiddie porn.

Thanks for the clarification. Additional reading up on the Feldman case and legal reviews of the Eleventh Circuit Court’s decision corroborates your comments on the issue. I should have done so before posting. Feldman and Fricosu forfeited protection under the 5th because the prosecution could show with “reasonable particularity” that they already knew what was on the drives. And which makes all the difference.

So the lesson to successfully take the 5th seems to be: encrypt, but make sure not to talk or give away any clues to anyone of what is there.

Buck July 11, 2015 7:48 PM

@Skeptical

In fact the best proposal is likely to be one in which the most radical from each side will find infuriating, and one which even more moderate law enforcement and intelligence officials, along with civil liberties groups, will find troublesome or suboptimal.

Have you read David Brin’s work yet? It certainly seems to fit the criteria you have laid out here! A timely comment was actually posted to this blog yesterday.

mb July 11, 2015 7:50 PM

The same stupid pipe dreams once again…

What exactly has changed since last time?

Is backdoor’ed encryption no longer massively favoring attackers? Of course it is. Or is it that magically everyone is going to abide by the law? If I need proper encryption I use a tool that provides that. If US products don’t I don’t use them. Oh it’s illegal? Yeah. So is pot smoking, tax fraud, driving under the influence and most importantly probably that what you are trying to decrypt assuming that you had a good reason to try it in the first place.

If I’m with organized crime or a terrorist organization…of course I wouldn’t use illegal cryptography. I mean. I probably stab people in the face with a knife just for fun on a regular basis. But using banned encryption? Oh that’s a whole different level of evil. I can’t square that with my conscience…

The bottom line is. Everyone becomes a whole lot less secure while those that should simply don’t give a rat’s ass and use a tool for which there is no backdoor access.

Oh wait. That’s not true. Only those who happen to be unfortunately located in the US because no one else is affected by US regulations. Now what would that mean for the affected IT industry in the US?

A. They are going to happily stick to that stupid and useless plan
B. They are going to ‘Switzerland’

Is that even a choice?

Clive Robinson July 12, 2015 5:16 AM

@ James,

… it’s impossible to prove that a hidden volume exists A hidden volume properly used inside a file container will to a forensic investigator look like random data.

Oh if only that were true, it would make life oh so simple, but it’s not.

Firstly you have to understand that what you are calling random can be charecterised in all sorts of ways, to decide on what sort of oracle has produced it. It’s one of the reasons stego can be spotted even if you can not decode it.

The lifetime of a hard disk or any other memory device starts after it has passed testing at the manufacture, and is mostly in a known state (that is it’s had a low level format to a known condition then a known test pattern is laid down on it). This is generaly not secret information, and there are various ways it can be found out.

When the storage device is put to use all writes are determanistic within the low level block addressing etc. So much so that if time stamps are acurate you can walk a storage device backwards quite a way. The problem with this is it is limited by freeded blocks getting reused, however with modern very high capacity drives and ordanary usage block rewrite does not happen. In this respect the storage device is like a compost heap in your garden or the geological strata that build up. Some years ago I did some work in this area and you realy would be surprised at what can be found in terms of structural information such as file/container size where it is on the device giving it’s size and age, and that’s before you start looking in applications for “last files used” etc.

Now by far the majority of applications write “meaningful information” to a storage device not random garbage, and meaningfull information is usually far from random, even if you do not know how to turn it back into human usable information.

The interesting thing is that that meaningfull information usually has charecteristics that show “the language” that was used to write it thus it can be linked to particular applications by “the language” they use (by language I mean “what the application speaks” not “what the user speaks” although that is usually fairly obvious).

So if you scan a storage device, you will find unused sections that are as expected from the last low level format/test which are “known” and used sections, that contain “meaningfull information” of some kind. If you find blocks of information that appear realy “random” then these are suspicious, further analysis will then start to give “structure” to these blocks that can depending on how the “crypto containers work” show not just that they exist but even how they have been used (especially true with devices that have “usage levelling” to improve their reliability”).

Whilst LEO’s might not bother to dig down to this sort of analysis, at the moment, people are starting to make tools to do so, and as with any competative market the price will drop to the point LEO’s will buy and use them as standard.

So the game changes for the defendant, in that they now have to prove a difficult thing, which is that either they did not know the containers were there, or that they don’t have the keys, or access to the keys that unlock them, to the point it satisfies the person adjudicating. Whilst the prosecution, are probably going to “bend the truth” in some way. Oh and you can be certain even if the adjudicator believes you, the LEO’s and prosecutors won’t, and you will become a “person of interest” to be dealt with at a later stage, even if they have to make you guilty in some way (such is the proven case over and over again, some LEOs don’t like the embarrassment of “having one get away”).

You will if you search this blog for my name Nick P and several others, find our discussions on the issue of proving you don’t have the keys, and more importantly nor could you. This has come about on several occasions to do with the likes of crossing from country to country and the customs authorities not having to work with a presumption of innocence.

mb July 12, 2015 6:17 AM

@Skeptical

What kind of framework can you possibly create that is not a legal mine field and still effectively prevents easy circumvention? I can’t think of anything that does not heavily rely on ignorance. I also can’t think of anything that isn’t going to do extensive damage.

mfp July 12, 2015 10:48 AM

@ mb

‘legal framework not legal mine field’: special allowances for IT people, to have clean technology instead of the defective-by-design one. But must be bound to something else, can’t be a license by itself; it must be some wide spectrum binding; something implicit into diplomas and ’10 years of equivalent experience’ (on work or associative activities; it works as a starter for curious kids).
With that in your hands you are able to knock to Motorola/Cisco/Intel’s doors, as well as any mobile phone shop, ask for complete documentation about their devices (I mean the cpu vhdl, the source of bios/uefi, etc), and they must give it to you; and don’t bother about warranty terms. As well as produce in line with these provisions: can’t integrate the crypto seed right into the cpu, or the (Intel) Management Engine into the northbridge… it must be a standalone chip so that a pro with his iron can bypass or rewrite its functions.
Add a small detail to complete the framework: all distributed pro docs must be ‘bugged softly’; so no wannabe is able to use it without a little time spent on debugging.
Empower the specialised professional association (ex: Australia) to disambiguate accidents when they occur: anti-rogues, in the pro league.
Then, cross your fingers.

Those provisions already exist in other contexts: a farmer can carry a machete, doctors can carry heavy drugs and pedoporno (sent by the parents of their little patients, to have an opinion of a bad rush in the panties), a baseball player can carry a bat, cops can carry everything. Nobody ask them any question, until they use that stuff badly.

In Italy we have something like that as an example for amradio: to use those frequencies you must pass a public exam OR be an Electronic/Telco/IT engineer. And one engineer can legally authorise others to use a radio by simply associating into the same club, as far as the club have public record of the associates (list of subscribers), and the engineer sign the overview of the radio activities of his associates. It should be extended to mathematicians, physicians, cops, army guys, and whoever else might have similar capabilities to the one you get when you study to pass the specific exam, example: in that exam you must know the morse code and international phonetic alphabet… that no engineer knows any more, it’s out of university programs since the 70s … but engineers have plenty capabilities, for this reason they are granted of that allowance without any exam.
Overall, there are professional associations for engineers, doctors, lawyers, etc; and some of those professionals the day of the diploma instead of listening a fancy speech from Steve Jobs and the CocaCola founder, must oath on their professional ethic code, one by one, in front of hemeritus professors, other new doctors, and their own families and friends (if any); see ‘Giuramento di Ippocrate’ for the doctors (the actual one: http://www.omceo.me.it/index.php?area=ordine&page=cod_deo-giuramento ). Those professors are the same that will judge those new doctors the day there’s the doubt that they killed their patient on purpose, or didn’t pay enough attention while on duty, lack of due diligence, etc…

James July 12, 2015 2:31 PM

@Clive Robinson
“So the game changes for the defendant, in that they now have to prove a difficult thing, which is that either they did not know the containers were there,
or that they don’t have the keys, or access to the keys that unlock them, to the point it satisfies the person adjudicating. Whilst the prosecution, are
probably going to “bend the truth” in some way. Oh and you can be certain even if the adjudicator believes you, the LEO’s and prosecutors won’t, and you
will become a “person of interest” to be dealt with at a later stage, even if they have to make you guilty in some way (such is the proven case over and
over again, some LEOs don’t like the embarrassment of “having one get away”).”

But that’s the point at least under the Fifth Amendment’s foregone conclusion exception; the suspect does not have to disprove the possibility or likelihood that there are hidden containers, or that he knows the decryption key.

Even without a constitutional privilage against self incrimination, the government must still establish all the facts in order to prove obstruction beyond a reasonable doubt.

If there is a gap in between what the government knows and what it can prove about the suspect’s knowledge of the encryption it can’t compensate for its lack of knowledge by compelling the defendant to disprove the government’s conjecture.

If the government can’t prove that a storage device contains a finite number of hidden layers, it can’t compel you to produce something you don’t have.

Also I think that a forensic investigation of a storage device in your scenario would be very difficult if the file container is created on one device and then copied/moved to another device which has been overwritten to full capacity several times.

You might perhaps guess that several hidden layers of encryption exist within random/encrypted data but the suspect is not obligated to disprove the suspicion that the high entropy data on the storage device is encrypted data.

Even under UK’s RIPA § 49 which departs from the self incrimination privilege the government must still prove the existence of encrypted data, and that the person of interest is able to decrypt.

It’s notable that there have been no reported convictions under RIPA involving Truecrypt hidden volumes or steganographic encryption only obviously proven encryption schemes.

The best analogy is to the government’s power to compel by warrant the production of a physical key to a strongbox.

If the government can prove (1) that the particular strongbox exists and (2) that you have the means to open it, you can be compelled to unlock it.

Most password schemes are analogous to a safe or a strongbox, and proving existence, custody and ownership is often easy.

However, a hidden volume or steganographic layer is more like hiding things in plain sight.

While I think the government might punish me for failing to open a strongbox, if it can already be proven that it exists and that I have the key, punishing me for failing to decrypt evidence from nothing should be more difficult especially when the government’s only case is that the storage device contains high entropy data.

@MrC
“With respect to contempt: You can have civil contempt in a criminal case, or vice versa, and the same act can be both civil and criminal contempt. The most
likely outcome in a case like my hypothetical (and assuming federal court) would be that first you’d be held in civil contempt and coercively jailed until
you gave up the password. If you held out 18 months, you’d be released, then tried and convicted for criminal contempt, and sentenced to a fixed term under
the analogous sentencing guidelines for obstruction.

Yes, but criminal contempt requires proof beyond a reasonable doubt.

In such a bifurcated proceeding which is criminal in nature the defendant must still be afforded the panoply of constitutional rights of criminal defendants.

The evidentiary standard for criminal contempt is not the same as for civil contempt.

And if someone is held in civil contempt for months for failing to disclose his encryption key, here assuming that there was no viable Fifth Amendment objection, imposition of criminal contempt would not necessarily follow from a finding of civil contempt.

The evidentiary standard for civil contempt is a preponderence of evidence not proof beyond a reasonable doubt, so defenses like ‘I have forgotten my password’ that would not get someone of the hook in a civil contempt hearing might still raise sufficient doubt in the criminal contempt prosecution.

James July 12, 2015 2:42 PM

@Clive Robinson
“You will if you search this blog for my name Nick P and several others, find our discussions on the issue of proving you don’t have the keys, and more importantly
nor could you. This has come about on several occasions to do with the likes of crossing from country to country and the customs authorities not having
to work with a presumption of innocence.”

Sorry, if what you meant was that the presumption of innocence does not exist at the border, I agree.

However, the worst that may happen in such a scenario is that you are rejected at the border or become a person of interest.

But for a criminal conviction, the presumption of innocence still has some force, especially when the government’s only evidence is high entropy data.

And that’s likely the case even under RIPA for a criminal conviction.

Note that all the RIPA cases wherein people have been convicted for failing to disclose their keys have concerned easily proven encryption schemes.

I am not aware of any successfully prosecuted case under RIPA wherein the government successfully argued that a defendant was guilty on account of failing to a ccount for random data or the failure to decrypt possible hidden volumes.

Aaron Schoenberger July 12, 2015 6:40 PM

Having backdoor access is a horrible idea in my opinion because any access of that type can be exploited and it will only be a matter of time before someone other than the government gains access. We’re living in difficult times were encryption and other measures help protect individuals and companies, yet government agencies want even less security and backdoor solutions. It’s a difficult dilemma.

Clive Robinson July 13, 2015 12:08 AM

@ James,

I think you and I are looking in different directions with regards to this issue.

You are looking at what “has happened” with the courts whilst I’m looking to what “can happen” in the future.

Prosecuters do not get sanctioned for chancing their arm, where as defendents get brutalized if they or there representatives do. It’s part of the ethos of “plee bargaining”, that is if you fight you will be more heavily prosecuted than if you “roll over”. This is not an “equity of arms” that you get taught in civics classes or historic legal theory, it’s “injustice on the cheap”. Likewise the notion of parole where you have to show “acceptance and repentance” of the crime you’ve been convicted of but may not have actually committed. As for retrial on “new-evidence” that’s only going to happen if the state can appear blaimless, that is in effect you have to show an individual is responsible via malpractice etc, or it’s some new “science” (see history of “hair evidence”, “bullet fragment metallurgy”, “blood/body fluid matching” and even finger print analysis).

Thus prosecutors argue such things as because a computer can be seen “in plain sight” all evidence on it must therefore be “in plain sight” which it clearly is not but courts have listened to.

Each time such an argument gets put forward the courts edge ever closer to accepting it. Thus the contents of a computer are nolonger considered the same as documents in a locked draw or vault. Thus a sensible person should put their laptop in a vault when not in use… which is if you think about it as ludicrous as it sounds, but that’s the way it is.

Thus as you get taken to court some often considerable time after your supposed crime, you get judged by the standards on that day not the day you supposadly committed the alleged crime.

Normally case law on evidence moves so slowly it’s not realy a factor in a court case, but with ICT it’s being made up “on the hoof” virtualy everyday and thus looking into the future is a very necessary thing to do. And with the current Dave & Obama show, being what would otherwise be considered “overly pessimistic” is sufficiently insufficient to be considered recklessness. Therefore you realy do have to plan for the worst imaginable and hope you don’t get noticed and dragged into being the next show trial. Thus you will have to consider where things will be not just tomorrow but in the next fifty years or so you hope to live…

Which brings us back to what is naturaly random and what is not. As I’ve indicated meaningful data that has not been encrypted has charecteristics that identify it. Data that has been encrypted has very different charecteristics and this stands out, which means even on heavily re-used drives it’s still going to stand out.

Further if you unlock a crypto container, any further encryption inside it can be identified in a similar way.

The only way around this issue is to “encrypt everything independently”, that is not just Full Disk Encryption but every file created each using it’s own key which is not stored even encrypted on the computer. Which means having an unguessable pass phrase/word for each file created, you can remember or having an alternative way to manage the file keys.

With few human minds capable of remembering one unguessable password, the alternative route needs to be considered.

The only catch is if you could get the file keys in the past what is stopping you getting them now or in the future? And how do you convince a judge of this…

There are ways but at some point they all rest on “encrypted communications”. Thus if the Dave and Obama show get their way and all encrypted traffic is both backdoored and recorded for ever, proving you don’t have access to the file keys becomes harder, depending on the meaning “communications” is given by the prosecutors and courts…

History shows that electronic communications were once truly ephemeral, then somebody invented recording devices… Laws were passed to make the interception of “communications” illegal. However “answer phones” and “call recording for training” became the norm, thus rather than go after the communications the authorities went after the recordings. The problem then arose of “when does a communication end?” That is does a voice mail held by the service provider count as a communication still in progress or an illegal third party recording? Can it be one, the other, both or neither? The courts are still making their minds up on this one, which means the sands are shifting for the “C” in ICT which does not bode well for the “I”. Because it’s very difficult to mitigate risks around an endevor when the sands shift alowing no bed rock of certainty on which to build.

When it comes to maintaining your privacy, I think there are ways/options other than the “don’t have any”. However the question at the end of the day is what will the prosecutors argue to profitably get their way? Which you will note is not a privacy issue, so why is it relevant? Because they are the real opponent and some prosecutors on the make, make psychopaths look soft and cuddly in comparison. To them the possibility of the defendants innocence does not feature in their outlook, they will make mud stick, and any mud will do, the more of it the better, they have a future and the defendant is just somebody to bulldoze into their career path.

Sancho_P July 13, 2015 4:43 AM

@Skeptical:
Thank you for trying to make Comey & Co not looking like single idiots. Noone but you could pack their unsound ideas so nicely into words and paragraphs, attach a flag and add some tears, just to recruit the american nationalists in fighting the world’s evil. NOBUS! Ein Folk, ein …

However, each of your paragraphs ignores the technical and ethical arguments that had been brought up before and/or is far from being realistic.
We can’t even have secure communication using today’s equipment, how could we have secure and (only for the good guys) backdoored communication / devices?
Fix that broken system, secure data and communication, respect privacy – this would be a step forward. To have the connection “metadata” is more than they need.

@James, Clive Robinson
I think it depends on your “opponent”. If it’s national security we are doomed in each and every country of the world, they’d chain you naked to the floor, deprive you from sleep and waterboard you for months (America) or behead your childred in front of you (ISIS), your choice.
Anyway, in the criminal case America has the advantage of plea bargaining.
But we aren’t criminals, so to protect intelectual property or business proposals, contacts and so on I’m with @James, a hidden volume (beware of closed sourece OS) should be the way to go.

CallMeLateForSupper July 13, 2015 10:17 AM

@Clive Re: Comey’s simulated(?) belief in the do-ability of a frontdoor backdoor

“I actually suspect it’s an attack on “Open Source” code.”

That very thought occured to me last week when I encountered – for the upteenth time in almost as many anti-key-escrow papers and articles – the qualifier “commercial” in definitions of crypto things (standards; hardware; software) that government (specifically its TLAs) should be prohibited from undermining/weakening (read: “effing with”). Commercial stuff shouldn’t be effed with, but neither should non-commercial stuff. So why that pesky little qualifier “commercial”?

Words matter. The devil is in the details.
Remember “Read my lips: No new taxes”?
Remember the shock of learning that interception/storage != collection?

rgaff July 13, 2015 10:32 AM

@Clive Robinson

A system-wide default “secure erase” feature that technically does it by encrypting and then throwing away the key might help put lots of similar looking “random like” data laying around…. of course the devil is still in the details, as to how easily data created with that utility is distinguishable from a hidden volume…

Gerard van Vooren July 13, 2015 4:18 PM

Question. What was the consensus of this hearing and what are the next steps (if any)?

gordo July 13, 2015 5:35 PM

@ Gerard van Vooren,

Question. What was the consensus of this hearing and what are the next steps (if any)?

My take away is that it’s way too early for consensus, and that more hearings are ahead.

Both committees want to hear from the technical community. That can be seen implicitly from the committee chairs’ opening statements and remarks (see below), and explicitly from some members of each of the committees during their question time.

The “Keys under Doormats” paper was put into the record by Senator Leahy at the Judiciary Committee hearing, and by Senator Burr at the Intelligence Committee hearing.

As Chariman Grassley said at the Judiciary Committee hearing, in his prepared statement:

This hearing is intended to start a conversation in the Senate about whether recent technological changes have upset the balance between public safety and privacy. … I hope the Senate takes a first step at seeing if any consensus is possible on this important and complicated issue.

http://www.judiciary.senate.gov/imo/media/doc/07-08-15%20Grassley%20Statement1.pdf

And, as Chairman Burr said, later in the day, at the Intelligence Committee hearing, in his opening remarks:

Director Comey, you have said that the encryption now readily available is “equivalent to a closet that can’t be opened” or “a safe that can’t be cracked.” You have an opportunity today to speak to the Committee – and the American people – and convince us that in order to keep the American people safe, you need to be able to “open that closet,” and to “crack that safe.” There are no easy answers and we are embarking on what will be a robust debate. Director, you wrote Monday that part of your job is to “make sure the debate is informed by a reasonable understanding of the costs.” I look forward to your testimony, this discussion, and I appreciate your being here.

http://www.burr.senate.gov/press/releases/chairman-burrs-opening-remarks

khigh July 15, 2015 7:38 AM

So, various governments are pushing for ways to limit or bypass data encryption technologies.

Shall we think of it this way: It’s like taking away a vaccine in order to make a biological-warfare weapon more effective.

rgaff July 15, 2015 9:50 AM

@ khigh

That’s an excellent analogy. I need to remember that one!

An outright ban on encryption is effectively a ban on all electronic commerce. Any intentional weakening of encryption makes all electronic commerce much more dangerous. We need to put it in terms of money, because that’s what people understand. Just imagine all online stores pulling out of a major country (either suddenly in the case of a ban, or slowly in the case of weakening).

tyr July 16, 2015 2:38 PM

@rgaff et al

A system-wide default “secure erase” feature that technically does it by encrypting and then throwing away the key might help put lots of similar looking “random like” data laying around…. of course the devil is still in the details, as to how easily data created with that utility is distinguishable from a hidden volume..

You wrote the above to Clive. This idea seems to have enough merit
to discuss at length. You could add this to Bleachbit as an option.
First do the normal (odd word for it) scrub of erased data, Then
generate an equivalent fill of random garbage or use something
non random. For example pick a random page from a random ebook
from gutenberg chop out enough text for a filler block encrypt
that and throw away the key. If you keep the old this block is
discarded header it looks like empty space. That’s not a spec
document just an idea which may have merit. I’d imagine that a
three letter agency who broke your trove of random garbage
would be less than pleased to discover you weren’t Fu Manchu
after all that work.

*unless your system picked Sax Rohmer ebooks at random

rgaff July 16, 2015 9:19 PM

@ tyr

I like your improvement to my idea, that way if it ever were decrypted per chance, you’d not be leaking the data you tried to securely delete like in my original suggestion… The trick is in efficiently finding that “random garbage” that looks enough like everything else after encryption…

rgaff July 17, 2015 12:00 AM

Of course, if the stuff you’re deleting really isn’t important, then the most efficient “random garbage” is that very stuff itself! This applies in cases where you don’t really need a “secure delete” per se… in cases where you do, of course you have to do more to it, as you suggest.

rgaff July 17, 2015 3:25 PM

OMG I never read those rebuttals posted in Bruce’s post at the end… I can’t even get past the first couple paragraphs of the first one!

It’s like the government saying:

“Since ISIS benefits from the use of electricity, we must ban all use of electricity! Come on technical guys, you’re smart, come up with an alternate way to power your electronic devices than electricity! You got us to the moon which is much harder, I’m sure you can do this measly thing which is much easier”

This is similar to their arguments because a ban on encryption and privacy is a ban on all commerce. It’s integral. It’s the electricity of commerce in a connected world. Any mandated significant weakening of encryption and privacy is the same thing only the effect is slower over more time. Everyone who refuses to see this has another agenda, it’s that simple.

sideline July 19, 2015 10:32 AM

Why are the American citizens, also referred to by the phrase “We, The People”, not demanding that the government use encryption that the people can decrypt? That was one of the promises of the Clipper Chip: that only the government was required to use it, and it would ensure that government communication could always be read by the people. All of the current debate centers around whether or no the government should have access to the communication of the governed. It deflects debate away from whether the governed should have access to the communication of the government. Americans no longer seem to trust the government, and the U.S. government definitely does not trust the citizen. Rather than debating whether to allow secret actions, we should be trying to regain that trust and cooperation.

Gerard van Vooren July 19, 2015 12:35 PM

@ sideline

“Americans no longer seem to trust the government, and the U.S. government definitely does not trust the citizen. Rather than debating whether to allow secret actions, we should be trying to regain that trust and cooperation.”

Let’s start with accountability.

hukill July 24, 2015 4:07 AM

@Dirk Praet

TCHunt doesn’t check for Truecrypt hidden volumes. It just seeks for files with random content, and then guesses them to be Truecrypt volumes.

There is no way to passively prove that a random file is an encrypted volume, or that the encrypted volume contains hidden volume.

A simple example why outlawing encryption is impossible:
1. Take an encrypted volume (A) and container with normal files (e.g. your family pictures or movies) of an equal size (B), then perform a XOR operation: A XOR B = C
2. Delete B, now you will have A and C left, both are files with random content
3. If somebody asks you to “decrypt” A or C, you can just say that they are one-time pads to each other: A XOR C = B, which is your container of innocent files

Dirk Praet July 24, 2015 6:01 AM

@ hukill

TCHunt doesn’t check for Truecrypt hidden volumes. It just seeks for files with random content, and then guesses them to be Truecrypt volumes.

It is indeed impossible to verify TC containers. The app tries to identify files that might be, thus creating “reasonable and articulate suspicion” for LE. Only when the TC boot loader replaces the normal boot loader is it possible to positively determine that TC is present and so lead to the logical inference that a TC partition is also present.

Plausible deniability may hold up in court (or not), but it will not protect you from an adversary who’s not playing by the book (10$ wrench), or in a jurisdiction where you can be detained indefinitely without charges (e.g. Sweden).

Clive Robinson July 25, 2015 3:05 AM

@ Dirk Praet, hukill,

TCHunt doesn’t check for Truecrypt hidden volumes. It just seeks for files with random content, and then guesses them to be Truecrypt volumes.

As I’ve said before there are various types of random, encrypted files using a recognised chaining mode tend to have a very non natural set of statistics. Encryption using other types of mode have different statistics which the “(in)famous Tux picture” showes quite well. But files that contain random data from a non encryption process have statistics that have known characteristics that can be used to identify the contents of the file.

From atleast as early as WWI non machine ciphers could be identified from each other by the statistics they produced. Thus the enemy cryptoanalysts could identify an attack stratagem prior to puting pencil to paper.

During WWII SOE were initialy lied to about the security of “poem codes” by the advice given from the Secret Service supposedly on the specific instructions given by the then ‘C’. Even though the Germans had relativly few crypto analysts the poem code made their lives much more fruitful than they would otherwise have been. One of the issues that arose from SOE switching to OTPs was that such traffic stood out clearly from the poem code traffic.

A solution to this problem was thought desirable thus various methods were considered. But it was a case of reinventing the wheel as other secret services had in the past solved the problem.

The Nhilist hand cipher system had a simple pre coding step that provided statistic flatening (we use compresion systems these days). It was realised that if you applied the deflatening process to the ciphertext the resulting statistics would hide the traffic in with other cipher traffic and cause the crypto analysts to waste much time and effort thus reducing their value as a resource to the enemy.

Whilst one of the first things you get told is “you have to compress before you encrypt” you rarely if ever get told to “inflate” the ciphertext to give it “natural randomness” so it does not stand out from other traffic / files.

If this “third stage” process was added then TCHunt would either not work or take significantly longer to find the container files.

I’ve played with doing this, and I’ve found that a process that takes the ciphertext and makes modifications to make it look like various forms of compressed files can have interesting results, one of which is most standard compression programes don’t take kindly to being fed “correct looking” files, that are in effect anything but. A simple example is “run length encoding” if you make your relativly small file look like it has a lot of runs in it the inflation process can take up a great deal of resources, if not causing the program to crash out…

I would urge others to consider such “third stage” games as it makes “cheap off the shelf tools” like TCHunt a waste of time for unskilled investigators, and those “authoritarian followers” that do not have sufficient ability to do much more than get others to press their uniforms and polish their helmets.

Leave a comment

Login

Allowed HTML <a href="URL"> • <em> <cite> <i> • <strong> <b> • <sub> <sup> • <ul> <ol> <li> • <blockquote> <pre> Markdown Extra syntax via https://michelf.ca/projects/php-markdown/extra/

Sidebar photo of Bruce Schneier by Joe MacInnis.