The Importance of Strong Encryption to Security

Encryption keeps you safe. Encryption protects your financial details and passwords when you bank online. It protects your cell phone conversations from eavesdroppers. If you encrypt your laptop—and I hope you do—it protects your data if your computer is stolen. It protects our money and our privacy.

Encryption protects the identity of dissidents all over the world. It’s a vital tool to allow journalists to communicate securely with their sources, NGOs to protect their work in repressive countries, and lawyers to communicate privately with their clients. It protects our vital infrastructure: our communications network, the power grid and everything else. And as we move to the Internet of Things with its cars and thermostats and medical devices, all of which can destroy life and property if hacked and misused, encryption will become even more critical to our security.

Security is more than encryption, of course. But encryption is a critical component of security. You use strong encryption every day, and our Internet-laced world would be a far riskier place if you didn’t.

Strong encryption means unbreakable encryption. Any weakness in encryption will be exploited—by hackers, by criminals and by foreign governments. Many of the hacks that make the news can be attributed to weak or—even worse—nonexistent encryption.

The FBI wants the ability to bypass encryption in the course of criminal investigations. This is known as a “backdoor,” because it’s a way to get at the encrypted information that bypasses the normal encryption mechanisms. I am sympathetic to such claims, but as a technologist I can tell you that there is no way to give the FBI that capability without weakening the encryption against all adversaries. This is crucial to understand. I can’t build an access technology that only works with proper legal authorization, or only for people with a particular citizenship or the proper morality. The technology just doesn’t work that way.

If a backdoor exists, then anyone can exploit it. All it takes is knowledge of the backdoor and the capability to exploit it. And while it might temporarily be a secret, it’s a fragile secret. Backdoors are how everyone attacks computer systems.

This means that if the FBI can eavesdrop on your conversations or get into your computers without your consent, so can cybercriminals. So can the Chinese. So can terrorists. You might not care if the Chinese government is inside your computer, but lots of dissidents do. As do the many Americans who use computers to administer our critical infrastructure. Backdoors weaken us against all sorts of threats.

Either we build encryption systems to keep everyone secure, or we build them to leave everybody vulnerable.

Even a highly sophisticated backdoor that could only be exploited by nations like the United States and China today will leave us vulnerable to cybercriminals tomorrow. That’s just the way technology works: things become easier, cheaper, more widely accessible. Give the FBI the ability to hack into a cell phone today, and tomorrow you’ll hear reports that a criminal group used that same ability to hack into our power grid.

The FBI paints this as a trade-off between security and privacy. It’s not. It’s a trade-off between more security and less security. Our national security needs strong encryption. I wish I could give the good guys the access they want without also giving the bad guys access, but I can’t. If the FBI gets its way and forces companies to weaken encryption, all of us—our data, our networks, our infrastructure, our society—will be at risk.

This essay previously appeared in the New York Times “Room for Debate” blog. It’s something I seem to need to say again and again.

Posted on February 25, 2016 at 6:40 AM

Comments

rodmar February 25, 2016 7:01 AM

Regarding the current debate between Apple and the FBI, the way I see it is that the FBI is not asking for a backdoor. It’s asking Apple to let them use the backdoor that is already there, because if Apple is technically able to do what the FBI is asking, it means they already have a backdoor in all of their devices.

Peter February 25, 2016 7:38 AM

“Strong encryption means unbreakable encryption” –
OTP isn’t exactly practical for everyday use.

Besides, I’m not sure that “computationally infeasible” and “unbreakable” are the same thing. While many of the algorithms in use may be “unbreakable” atm, who knows what tomorrow brings?
Another problem is that many of the techniques in use atm offer next to no forward secrecy, or none at all.

LIJO February 25, 2016 8:00 AM

Nicely put, Bruce. When I hear the ‘arguments’ from what I call the ‘power is might’ brigade, they usually show such a poor understanding of the technology side of the story and its far-reaching implications. I’m sure we can think of at least one case where an aspiring politician is leaning on this to promote their career.

Wm February 25, 2016 8:01 AM

I guess people are too lazy to personally encrypt their own messages, and the feds know this. With the past revelations about Microsoft and Apple opening doors to data access for the authorities, no one should ever trust that their data is secure on these devices. Personally encrypting your data with PGP, GPG, or an OTP before sending is the only sure way to be safe, and it should be done on an air-gapped computer. The feds can attack companies, as they are currently doing with Apple, but they cannot attack individual Americans who encrypt their data before sending. Today, Ben Franklin would say, “Those who give up strong, secure encryption for security will end up with no security at all.”

Diogenes February 25, 2016 8:56 AM

1) If I’m understanding the Apple situation correctly: if the FBI has a warrant, can’t Apple close the door, decrypt the phone themselves, and then hand the info to the FBI?

2) Linux has a package that handles OTP pretty easily, though it is a command line utility. Is there any reason they couldn’t develop a GUI to make it easier? Also I found this list of OTP packages.

3) The MAIL has been a vehicle for bad things for years but the government has never suggested that we have to have transparent envelopes to send letters to one another. In fact, I think it might take more effort for the government to open a letter than it takes for them to ask Google to give them access to user info. Load up on stamps and envelopes, folks 🙂

I’d appreciate any feedback you might have. Thanks.

Ricochet February 25, 2016 9:17 AM

@Wm & Diogenes

There are too many problems with email, and PGP stands out like dog’s balls statistically (it’s an X-KEYSCORE marker, as you would expect); plus it doesn’t hide your metadata or buddy lists, and there is no perfect forward secrecy.

You’d be better off using One Time Pads with purely physical means (dice, pen/paper, conversion tables) and posting hidden messages using steganography to Schneier’s forum for a laugh. Your recipient just needs to know which forum and what your call sign might be. Totally deniable.

Air-gapped doesn’t mean shit these days, plus you risk transfers of vengeful code with each USB or other medium to/from the ‘safe’ computer.

See here for a nice list of problems – “15 reasons not to start using PGP” http://secushare.org/PGP

Ricochet looks like a nice alternative which solves the trust and meta-data problems:

https://ricochet.im/
http://invisible.im/#faqs

Ricochet is a different approach to instant messaging that doesn’t trust anyone in protecting your privacy.

Eliminate metadata. Nobody knows who you are, who you talk to, or what you say.
Stay anonymous. Share what you want, without sharing your identity and location.
Nobody in the middle. There are no servers to monitor, censor, or hack.
Safe by default. Security isn’t secure until it’s automatic and easy to use.

Ricochet uses the Tor network to reach your contacts without relying on messaging servers. It creates a hidden service, which is used to rendezvous with your contacts without revealing your location or IP address.

Instead of a username, you get a unique address that looks like ricochet:rs7ce36jsj24ogfw. Other Ricochet users can use this address to send a contact request – asking to be added to your contacts list.

You can see when your contacts are online, and send them messages (and soon, files!). Your list of contacts is only known to your computer – never exposed to servers or network traffic monitoring.

Everything is encrypted end-to-end, so only the intended recipient can decrypt it, and anonymized, so nobody knows where it’s going and where it came from.

Q: So specifically, what problems does Ricochet solve?

A. Ricochet makes it possible for its users to communicate without leaving a retrospectively recoverable forensic trail behind on third-party servers. It also protects users from passive snooping by oppressive governments.

In the case of a traditional instant messenger conversation, the service provider (Yahoo, Microsoft, Google, AOL etc) will have records of which user accounts have communicated with each other and when.

Ricochet leaves behind no such trail. It also doesn’t log messages on either end, and when used in anonymous/unauthenticated mode the software will leave behind very little (and eventually, we hope, no) forensic evidence linking a user to a conversation.

Q: What problems does Ricochet not solve?

A: If a user is already the subject of targeted surveillance, Ricochet cannot facilitate secure, anonymous chats. This is not the problem it is seeking to solve. If the user is the subject of a targeted investigation by state security services, the investigating agency might do something as simple as take a time-stamped video recording of both ends of a conversation to prove that it happened.

Other problems with Ricochet include private key jacking, spam/phishing/malware, and the further audits (and cash) needed for essential development.

Tiago February 25, 2016 9:18 AM

@Diogenes
1 – Apple doesn’t have that ability. They could develop it and do as you suggest, but then Apple would have the ability to decrypt ALL iPhones in the world. If that tool falls into the wrong hands, that’s the biggest issue here. Of course, you can claim that Apple merely having the ability to decrypt all iPhones is already an issue.

2 – OTPs are unbreakable. That’s because any given ciphertext can decrypt to any message of the same length, depending on the key. The point is that a decryptor (legitimate person or attacker) would need the key.

3 – It’s hard to eavesdrop on all physical mail, as you say. It’s easy for electronic communication. That’s just the way technology works.
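Tiago’s second point can be made concrete with a short Python sketch (purely illustrative; all messages and the key are made up). A one-time pad is just XOR with a truly random, never-reused key, and for any ciphertext and any candidate plaintext of the same length there exists a key mapping one to the other, which is why the ciphertext alone reveals nothing:

```python
import os

# One-time pad: XOR each plaintext byte with a key byte used exactly once.
def otp(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data), "pad must be at least as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = os.urandom(len(message))      # fresh random pad, never reused
ciphertext = otp(message, key)

# Decryption is the same XOR operation.
assert otp(ciphertext, key) == message

# Perfect secrecy: for ANY alternative plaintext of the same length,
# some key "decrypts" the ciphertext to it, so an attacker without the
# key cannot rule out any message.
decoy = b"RETREAT AT TEN"
fake_key = otp(ciphertext, decoy)   # the key that maps ciphertext -> decoy
assert otp(ciphertext, fake_key) == decoy
```

This is also why OTPs are impractical at scale: the key must be as long as all traffic ever sent, pre-shared, and never reused.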

JdL February 25, 2016 9:34 AM

Any weakness in encryption will be exploited — by hackers, by criminals and by foreign governments.

Later:

This means that if the FBI can eavesdrop on your conversations or get into your computers without your consent, so can cybercriminals.

And:

I wish I could give the good guys the access they want without also giving the bad guys access

I hope I live to see the day Mr. Schneier includes the U.S. government among his list of criminals who want to hack your private data. To view the government any other way is to display extreme naïveté, IMHO.

David Leppik February 25, 2016 10:16 AM

Encryption is identity.

In person, you can prove who you are because real, live people are hard to impersonate. But online, a cryptographic key is the only difference between two nodes that can’t easily be faked.

In the flesh, your voice, face, and mannerisms are your identity.

Online, a private key is your identity.

I’m surprised we don’t hear more about securing private keys. How do Apple, Red Hat, Google, and Microsoft protect the keys they use to validate their OS updates? You’d think every state-level spying organization would try to get their hands on those keys.

David Leppik February 25, 2016 10:42 AM

What the FBI is really asking for is the use of Apple’s private key in order to sign a system update. They also want Apple to do the development work for free.

It is possible for Apple to write a custom version of iOS that only runs on that phone, and refuses to boot if it has the wrong serial number. In fact, it’s possible (though much more difficult) for others to write a custom OS, but without Apple’s private key, the phone would not accept the “upgrade.”

So if you trust Apple’s public key encryption, and you trust Apple’s ability to keep their private key safe, you can trust this to be a one-time-per-court-order exploit.

So far nobody in this debate has questioned either the encryption technology or the security of Apple’s corporate private key. Those are prerequisites of strong encryption.

Which means that (for once) I have to disagree with Bruce. If strong encryption (and key signing) works, then Apple can create an exploit that only they can use. If not, then Apple isn’t creating a backdoor, they are exploiting a pre-existing backdoor.

Either way, the real issue isn’t how good the security is; it’s that Apple is being ordered to break it. And if the government wins, it will be all too easy to convince a judge that there’s no meaningful distinction between ordering a manufacturer to break in after the fact and ordering them to build bad security in the first place.
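The signing check David Leppik describes can be sketched in a few lines. Python’s standard library has no public-key signing, so this illustrative model uses an HMAC with a secret signing key as a stand-in for Apple’s real asymmetric signature scheme; the key names, serial numbers, and image contents are all invented. The structure of the check is what matters: the device refuses any image whose signature doesn’t verify, or whose embedded serial number doesn’t match its own.

```python
import hashlib
import hmac

# Stand-in for Apple's signing key: in reality an asymmetric private key
# that never leaves Apple, with only the public half on the device.
APPLE_SIGNING_KEY = b"held-only-by-apple"

def sign_image(image: bytes, device_serial: str) -> bytes:
    # Binding the serial into the signed blob locks the image to one phone.
    blob = device_serial.encode() + b"|" + image
    return hmac.new(APPLE_SIGNING_KEY, blob, hashlib.sha256).digest()

def device_accepts(image: bytes, sig: bytes, my_serial: str) -> bool:
    expected = hmac.new(APPLE_SIGNING_KEY,
                        my_serial.encode() + b"|" + image,
                        hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)   # constant-time comparison

fbios = b"custom OS with passcode limits removed"
sig = sign_image(fbios, device_serial="F17NW0ABCDEF")

assert device_accepts(fbios, sig, my_serial="F17NW0ABCDEF")      # the target phone
assert not device_accepts(fbios, sig, my_serial="OTHER-SERIAL")  # any other phone
```

Under these assumptions the exploit really is one-time-per-device, which is exactly why the whole debate reduces to whether Apple can be compelled to use its key.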

Z February 25, 2016 10:55 AM

@David Leppik : I agree with your analysis.

Regarding your last paragraph, this court case could also help draw precisely that distinction between the two. It could be seen as a fair compromise between the FBI wishing to perform its investigation and Apple wishing to be able to engineer systems that are as secure as possible. I believe this is why this court case is so interesting: it actually drives a wedge between the two issues.

Matt February 25, 2016 10:59 AM

There is a misconception that Apple introduced disk encryption in iOS 8. This is wrong. It was already introduced for the iPhone 3GS with iOS 3.

Previously, iPhone data was decrypted upon boot-up. What Apple probably did for law enforcement was use custom boot software to bypass the passcode lock screen and extract the decrypted data. Apple has NEVER cracked the encryption.

In iOS 8, Apple changed the security design so the data stays encrypted until the user keys in his passcode.
Bypassing the passcode lock screen is useless now, as the data is unreadable.

References:
“Why Can’t Apple Decrypt Your iPhone?” by Matthew Green
David Schuetz’s explanation of iOS encryption

Another good read is iOS hacker Will Strafach’s opinion piece on BGR
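The iOS 8 change Matt describes can be modelled in a short sketch (all names and values here are hypothetical, and the KDF parameters are illustrative, not Apple’s). The idea: the data key is derived from BOTH the passcode and a per-device hardware UID, so data copied straight off the flash chips stays unreadable until the passcode is entered on that specific device.

```python
import hashlib

# Hypothetical per-device secret fused into this device's crypto hardware.
DEVICE_UID = b"\x13" * 32

def derive_key(passcode: str, uid: bytes = DEVICE_UID) -> bytes:
    # A deliberately slow KDF makes each passcode guess cost real time;
    # on real hardware the UID also never leaves the crypto engine, so
    # guessing has to happen on the device itself.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)

key = derive_key("4971")
assert len(key) == 32

# A different device (different UID) derives a different key even from
# the same passcode, so copying the flash to other hardware doesn't help.
assert derive_key("4971", uid=b"\x42" * 32) != key
```

This is why bypassing the lock screen no longer yields anything: without running the KDF with the right passcode on the right device, there is no key with which to read the data.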

ianf February 25, 2016 11:02 AM

@ rodmar – see it however you like; Apple doesn’t have that backdoor, and even if they had, they wouldn’t give it to the FBI. In fact, this entire petition-to-compel warrant is so preposterous and bizarre that one has to wonder whether Robert X. Cringely’s hypothesis – that the whole skirmish is intended to create a legal precedent unfavourable to the US government – isn’t about right.

@ Diogenes – you are not understanding the Apple situation, never mind how (in)correctly. The point of contention here is not that iPhone 5c; that is but a pretext, a honeypot for public consumption, a prop in the battle for something larger: the government arrogating to itself the mandate to dictate how a business goes about its business. In short, it is a conflict of legitimacy; or in colloquial, E-ZY to unnerstan’ terms: whose “what needs to be done” scenario [pace V. I. Lenin] – the US government’s or, in this pilot case, Apple’s – is to be the legit one.

Spellucci February 25, 2016 11:20 AM

I originally read this on http://www.nytimes.com/roomfordebate/2016/02/23/has-encryption-gone-too-far. Denise Zheng’s counterargument was, “But when technology companies design systems that don’t allow them to comply with court orders, they are effectively telling law enforcement to up their game in hacking mobile phones.” She did not provide any evidence for this; her whole specious argument hinges on this unfounded supposition. It is concerning that the deputy director for the Strategic Technologies Program at the Center for Strategic and International Studies would build speculation on speculation as if that were an argument for U.S. companies cooperating with an already overreaching government.

old*man*c February 25, 2016 11:23 AM

@David Leppik –
If “Apple” can do it for Apple-approved purposes, then some person or persons at Apple can do it for their own purpose, and then eventually one of them will do it for some other purpose given sufficient incentive (e.g. blackmail, appeal to patriotism, cash).

Given that “anyone can build a security system that they can’t break” – I presume that Apple could actually build a system that THEY can’t break. From some discussions on this blog of what’s required for Apple to do the requested breaking in this instance, the new system should at a minimum (necessary but not sufficient) require a long, secure password before installing any upgrades. And if you forget your password, tough luck: no recovery path for your data. They could sell that as an “iPhone 7 (Secure)”, and for those who don’t want to bother with long passwords, an “iPhone 7 (Insecure)” (presumably with some improved name that wouldn’t reduce sales to zero).

Spellucci February 25, 2016 11:25 AM

@David Leppik, to further your point, if the U.S. government has the right to coerce Apple into building custom software to help decrypt a terrorist’s phone, what’s to prevent China from coercing Apple into building custom software to help decrypt a U.S. Embassy staff member’s phone on suspicion they are a spy?

thomas February 25, 2016 11:30 AM

“If the FBI gets its way and forces companies to weaken encryption, all of us — our data, our networks, our infrastructure, our society — will be at risk.”

This is the FBI trying to create the backdoor! Using Apple to break in. Creating a precedent. They don’t really care about the data on that particular phone. They want to weaken security.

Z February 25, 2016 11:33 AM

@Spellucci: absolutely nothing prevents China from requesting something from a corporation operating on its territory. Nations are sovereign. This is true whether Apple complies with the FBI request or not.

And there are plenty of precedents. China forces Google not to show censored webpages to Chinese nationals. In the EU, companies are forced to implement the “right to be forgotten.” Nations coerce corporations to do things all the time.

Will February 25, 2016 11:46 AM

The “key” is the key control. It doesn’t matter whether your crypto is “unbreakable” if your key controls are lame.

bluonek February 25, 2016 11:47 AM

While I appreciate the use of the age-old news tactic of taking something fairly moot and/or irrelevant and using its momentum to prove a point – especially a valid and extremely important point – I do have to note that all this hoopla about asking Apple to put a “backdoor” in the iPhone is a bit misleading.

The “backdoor” (security flaw) already exists. There is currently no (OK, OK, a very ineffective) security mechanism to prevent an attacker from building a basic OS that removes the passcode-attempt protections and allows an “attacker” to brute-force the passcode. This is all the government is asking of Apple – because AFAIK this is all they can do – and any technically savvy person or group could do this.

As ridiculous as Mr. McAfee can be in his interviews, that’s one point he makes quite validly.
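The point about passcode-attempt protections is easy to demonstrate. In this illustrative sketch (the salt, iteration count, and passcode are made up, and the KDF parameters are far weaker than any real device’s), a 4-digit passcode falls to exhaustive search almost instantly once the escalating delays and the 10-try wipe are stripped away:

```python
import hashlib

SALT = b"per-device-salt"   # hypothetical; real devices use hardware-bound secrets

def derive(passcode: str) -> bytes:
    # Deliberately low iteration count so the demo runs quickly; real
    # devices use far heavier KDFs plus hardware-enforced delays.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, 1000)

target = derive("7294")     # what the attacker's stripped-down OS must match

# With no attempt limits, just walk the entire 4-digit keyspace.
found = next(f"{n:04d}" for n in range(10_000) if derive(f"{n:04d}") == target)
assert found == "7294"
```

This is why the rate limiting and auto-wipe, not the cipher itself, are what stand between a short passcode and the data; the FBI’s request amounts to removing exactly those protections.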

Terrence February 25, 2016 11:49 AM

“Anyone can design a system that they can’t break” is a dumb statement that people keep repeating. Every system is designed by someone.

This statement is invariably interpreted to mean:

1)
“only systems designed by someone else are secure”

2)
“if you design something, anything, it can’t possibly be secure.”

No wonder we’re in this stinking mess.

Will February 25, 2016 11:50 AM

Access control boils down to a contract. This is an area where blockchain technology (i.e., smart contracts) and digital “oracles” that validate contract compliance are a potential solution.

Mark February 25, 2016 12:25 PM

3)
“one cannot be confident in the security of a system until it is independently verified by technical experts from the cryptographic community”

Which of these interpretations do you fall under?

Jason R February 25, 2016 12:36 PM

@Z – While nations are sovereign, that doesn’t give them omnipotence to do and demand anything. Yes, China told Google to block “bad” webpages. Google gave China the finger and redirects all Google.cn traffic to Google.com.hk, outside China’s jurisdiction.

China could demand Apple give them this custom software that lets them break into any phone. Apple could just shut down all its Apple stores there. There would be an instant “black market” for Apple devices out of TW, HK and other neighboring locations that China doesn’t control.

Z February 25, 2016 12:56 PM

While Google “gave the finger” to China, what most probably happened is that both organisations ended up settling their dispute in private, since both see benefits in Google operating in China. If China wanted to seize Google’s assets on its territory, it could certainly have done so, as plenty of countries have done plenty of times in the past. No, countries are not omnipotent, but they certainly have control over what is legal and illegal on their territory, and over the enforcement of those laws.

Nobody is saying Apple is powerless here, and they can certainly use some of their weight to restrict nations from interfering with their operations. But as COUNTLESS examples have shown, there’s a limit to this power – and Apple knows this very well. In the end, Apple (or Google) will do whatever it can against China, just as they do against the US, just as they do with the EU. Those are separate relationships, and nothing a US judge decides will have any bearing on the decision of a Chinese judge.

Clive Robinson February 25, 2016 1:33 PM

@ rodmar,

… the way I see it is that the FBI is not asking for a backdoor. Its asking Apple to let them use the backdoor that they already have there…

The FBI is not asking to use a backdoor.

It is asking Apple to design the equivalent of a backdoor, sign the code of that backdoor with Apple’s “code signing key,” then use the software update process to install it on the phone.

However, contrary to what the FBI / DOJ have said in court filings, this will not be a one-off event; it’s now known that there are getting on for 200 other phones from non-terrorist cases that the DOJ / FBI want to get into.

As has been pointed out the “third terrorist” argument is no more than the FBI pi55ing in the wind.

But consider a bit further. There is the “undue burden” aspect. The FBI / DOJ say it will not be an “undue burden” for Apple. Whilst some might believe that, there is another aspect the FBI / DOJ are not mentioning.

What of the people Apple task with developing this “backdoor” for the FBI?

The very few people who could do this will then have very, very valuable knowledge. There are many people who would stop at very little to get this knowledge. Thus the FBI, via the DOJ and the court, are telling Apple to select a small number of people (/victims) and paint bl**dy great targets on their backs, and those of their loved ones…

You need to ask who is responsible should one of these people, or a member of their family, be put under duress by an unknown party wishing to obtain the secrets.

Before people say this is unlikely to happen, they need to get out and about a bit more in the world. Such duress kidnappings are rife in many places in the world and have happened in all western nations in living memory, and many have turned out badly for the victims and their relatives (“dead men tell no tales”, thus don’t make good witnesses).

So whilst the DOJ might argue it’s not an “undue burden” for Apple I don’t think you can say the same thing for those that have to design the backdoor…

Terrence February 25, 2016 1:41 PM

@Mark

Which system “independently verified by technical experts” (whatever that means) has never failed?

You mean the open-source stuff that was reviewed by a bazillion “experts” but determined only years later after half the planet was using it, to have dangerous vulnerabilities?

Or do you mean the crypto app that was recommended by all the “experts” and downloaded in faith by millions, only to be mysteriously abandoned by – we don’t even know who – developers in E. Europe?
Oh wait, never mind. Some guys mowed lawns after school for a few months to raise enough money to finally have it evaluated by “experts.”

No, you must mean crypto stuff that absolutely no one can measure the strength of but instead, just wait a long long long time to see if anyone finds something wrong with.

Because that’s what corporations are staking their entire future on right now.

Clive Robinson February 25, 2016 1:52 PM

@ Diogenes,

… if the FBI has a warrant, can’t Apple close the door and decrypt the phone themselves and then hand the info to the FBI?

Not really, no. You need to look up the “rules of evidence,” specifically to do with the “chain of custody,” and also “fruit of the poisonous tree.”

Another big fat untruth in the FBI / DOJ paperwork and statements is that they ignore these problems.

Specifically, it’s not possible for Apple or the FBI to make a “forensic copy” of the phone, which means there is a question as to what happens on the phone when the backdoor software runs.

If the recovered data were ever to be used as evidence in court, it could be challenged, and this would force Apple to reveal the backdoor software into the “court records” – which would mean, in effect, the public domain – or the FBI / DOJ would have to drop the case.

With something like 200 other phones queued up waiting to go through the same process, there is absolutely no way the FBI / DOJ is going to protect Apple’s “trade secrets” on this by backing out of court cases.

So the whole thing is a stitch-up by Comey and Co. to prove that theirs are not brass but steel, and bigger than anyone else’s, including the US President’s.

Scam Dunk February 25, 2016 1:57 PM

Yes, it is a matter of more or less security.

People don’t get that. Worse yet many others simply don’t care.

FBI is playing the National Security trump card. Except, the bad guys are dead and based on many months of investigation, can’t or won’t say if there were others involved, or not.

In short they are playing games. All that can be gained by using the “All Writs Scam” is about 18 minutes worth of connect time on the company phone. Logically, there is nothing there they don’t know already.

But, maybe Apple is gaming too. There are rumors of a back room deal afoot. Apple may want a very large fee to crack their own phone and receive a guarantee of immunity from any lawsuits or criminal charges that might result from their efforts.

One of my biggest objections to the FBI scam is they are using an arcane and vague law to get the judge, who was a former federal prosecutor, to write the BACKDOOR LAW they cannot get right now from Congress.

I hope the government loses. That would be a small victory for US.

Clive Robinson February 25, 2016 2:00 PM

@ Ricochet,

Ricochet uses Tor, and therefore it cannot protect metadata: Tor does not actually protect metadata, it only appears to under certain assumptions that have been shown to be false (as recent court cases have shown).

ianf February 25, 2016 3:07 PM

@ Clive […] “the whole thing is a stitch up by Comey and Co. to prove that theirs are not brass but steel and bigger than anyone elses including the US President.

You can’t be serious about that… were that the case (let’s call this spade a conspiracy of the surveillance bureaucracy), they’d have instigated it at the worst possible moment, opposing a lame-duck POTUS who therefore has little to lose by opposing them in force. Presumably they are cynically sober enough to understand that they could lose, and be out on their [euphemism alert!] hind quarters (“the higher they climb, the harder they fall”).

Besides, were that the case – a sort of by-the-book intragovernmental insubordination against the better judgement of the DoJ / White House – I dare say it would have been stopped by now. So there must be more to it than any simplistic motivation that we, the outsiders, can discern. Remember this promise of Tim Cook’s as of 18 hours ago:

    […] At one point Cook told ABC that he has not yet spoken to Barack Obama about his legal standoff with the justice department. He then said unequivocally that he “will.”

    “We need to stand tall, and stand tall on principle,” Cook said. “This should not be happening in America.”

[1m20s] https://m.youtube.com/watch?v=VsRBgvlVXeM [abc news via The Guardian]

CallMeLateForSupper February 25, 2016 3:17 PM

@bluonek
“There is currently no (ok, ok, very ineffective) security mechanism to prevent an attacker to build a basic OS…’

So you build your FBiOS. Stand by for bitter disappointment as soon as you try to force-feed it to an iPhone 5: “Bugger off, mate” (AU version only; void where prohibited; YMMV) You will not have signed your FBiOS with Apple’s signing key. Don’t even think about borrowing that. (“From my cold, dead hands” comes to mind)

“This is all the government is asking of Apple”

“That is all”?! That is plenty, but there is also the small matter of setting a dangerous and abhorrent legal precedent. Precedent is inseparable from our system of justice; one could argue that it keeps the wheels of justice running true. There are any number of local LEAs, TLAs and governments who are licking their chops, pre-filling requests for search warrants, and rooting for the FBI to prevail here. Believe it. Manhattan, NY District Attorney Cyrus Vance, Jr. publicly stated that he has [between 100 and 200] phones that he wants to break into. And a number of PDs across the country have made similar statements.

It would get very interesting if a foreign govt. came knocking with a warrant to break the confiscated phone of a U.S. person. Without the precedent that Comey wants to set, Apple can tell all comers to go pound salt; with that precedent in U.S. law books, Apple could do no such thing. Do you, bluonek, really want that latter coin to drop? I don’t.

Mark February 25, 2016 3:37 PM

@Terrence
You want perfect code. That’s fine. I empathize. I think we would all feel a lot more confident in our cryptographic applications if we could just use them and know they are perfectly secure. But that simply isn’t reality.

And I’d have a lot more confidence in an application that has been audited and given the thumbs up from a security researcher (e.g. Schneier) than something that got an A+ from your TA.

JG4 February 25, 2016 5:14 PM

I don’t know if this is weak encryption or poor design, but that is splitting hairs. It’s like the trick question, “Are they incompetent or malicious?” They are both. That’s as good as it gets on the planet of unintended consequences. We hope that your stay is enjoyable.

That ‘Connected Car’? It’s Insecure Too
http://market-ticker.org/akcs-www?post=231159

http://www.usatoday.com/story/tech/news/2016/02/24/nissan-disables-app-hacked-electric-leaf-smart-phone-troy-hunt/80882756/

Lawrence D’Oliveiro February 25, 2016 5:58 PM

Encryption is a tool, not a weapon. It is right to tightly regulate, even ban, weapons, but not tools. This is because the uses of tools are primarily constructive, whereas this is not true of weapons.

Niko February 25, 2016 6:47 PM

@CallMeLateForSupper

As discussed above, foreign governments are sovereign and a legal ruling in the US probably has very little impact on what foreign governments do or don’t do. If China asks Apple or any other company to build a backdoor, sure Apple can tell China to pound sand. China in turn could ban the selling of Apple products in China. As I think Clive pointed out in another thread, China is quickly becoming the dominant economic market and it would be very hard for Apple or any other company to say no, assuming China wasn’t asking for anything that would create legal or criminal liability for Apple in other jurisdictions.

Tõnis February 25, 2016 8:54 PM

“I wish I could give the good guys the access they want without also giving the bad guys access …”

I don’t. There are no “good guys” who could potentially want access to my data; they’re all bad guys. Loved Apple’s brief^ … all except for the Conclusion where Apple states that it has “great respect” for the “professionals” at the DOJ and FBI. There’s no reason Apple should glorify tyrants even if it believes (erroneously) that their intentions are good.

Things have gotten so bad (i.e. government is so far out of control) that Americans need to brush up on their civics, in earnest, if liberty is to survive. Even courts have stated that rights won’t be passively defended, they must be vigorously asserted. This whole notion that anything is okay so long as it’s pursuant to a warrant or some law — even when it’s a bad law! — is hopelessly flawed. Compared to every other population, the American people have extreme and ultimate lawful (i.e. constitutional) control over their government, and I’m not talking about the delusion that one is actually making a difference by voting for either bad candidate A or bad candidate B. The answer lies in the jury system. Any judge who states that a juror may not vote his conscience is a liar. I don’t condone indiscriminately setting an accused free without considering any evidence when there’s an actual victim who has been hurt, but when lawmakers enact bad laws, Americans don’t have to suffer under them. They can set their peers free, and there’s nothing the tyrant on the bench can do about it. He cannot change a juror’s not guilty vote. I stand ready to serve as a juror, to vote my conscience, and to nullify^^ all bad law.

^https://assets.documentcloud.org/documents/2722196/Motion-to-Vacate-Brief-and-Supporting-Declarations.pdf

^^http://fija.org (disclaimer: I’m not a member yet.)

Tim van Beek February 26, 2016 2:59 AM

“It’s something I seem to need to say again and again.”

Yes, obviously, as Mrs. Zheng’s article at the NYT, which is supposed to present “the other side” of the debate, unbelievably proves.

Please keep it up!

Clive Robinson February 26, 2016 4:26 AM

@ Niko,

To protect the power grid, the obvious solution is what Bruce proposed in 2012, disconnect the SCADA systems from the public internet.

You would be surprised just how difficult it is to get people to do that, from the industrial control engineers up through accountants and CEOs. Even Bruce needed some persuading originally.

Some of us have been pushing that particular rock uphill for over two decades and thus have a great deal of sympathy for Sisyphus…

Peter Hillier February 26, 2016 5:19 AM

Regardless of all the discourse I’ve read on this subject (which, I’ll remind everyone, has global impact, not just on the US), I haven’t heard anyone discuss the hypocrisy of even remotely allowing the FBI to get away with this request, when the NSA sponsors the very set of standards (Common Criteria) that it had begged Apple to adhere to for years to ensure the encryption protocols on the devices were solid.

65535 February 26, 2016 5:49 AM

“The FBI wants the ability to bypass encryption in the course of criminal investigations. This is known as a “backdoor,” because it’s a way at the encrypted information that bypasses the normal encryption mechanisms… I can tell you that there is no way to give the FBI that capability without weakening the encryption against all adversaries. This is crucial to understand. I can’t build an access technology that only works with proper legal authorization, or only for people with a particular citizenship or the proper morality. The technology just doesn’t work that way.

“If a backdoor exists, then anyone can exploit it. All it takes is knowledge of the backdoor and the capability to exploit it. And while it might temporarily be a secret, it’s a fragile secret. Backdoors are how everyone attacks computer systems.”- Bruce S.

I agree with Bruce. Encryption keeps you safe.

In the case of the FBI trying to set an irreversible legal precedent, it is a clear power grab by the FBI to use mass surveillance to prosecute minor vice crimes such as pot sales and pornography.

Further, with the number of experts reviewing the FBI’s iPhone-breaking malware in open court, the details and possibly the actual program will leak out into the wild. Other repressive countries and criminals will want the very same type of code, and will get it one way or another. This must be fought tooth and nail.

Apple’s legal response to the FBI:

https://assets.documentcloud.org/documents/2722434/Motion-to-Vacate-Brief-and-Supporting-Declarations.pdf

Stackpole February 26, 2016 6:22 AM

In re the last line of the posting:

“It’s something I seem to need to say again and again.”

Andre Gide put it this way…

Everything that needs to be said has already been said. But, since no one was listening, everything must be said again.

ianf February 26, 2016 6:42 AM

@ Peter Hillier “hasn’t heard anyone discuss the hypocrisy of even remotely allowing the FBI to get away with this request, when the NSA sponsors the very set of standards (Common Criteria) that it had begged Apple to adhere to for years to ensure the encryption protocols on the devices were solid.”

That’s because no one can (hypo)criticize what another at the same table hypothesizes, and a third then hyperventilates and hyperboles ;-))

In truth, the FBI and the NSA’s interests in this matter are contrary, thus not viable for comparison. The DHS was supposed to iron out the wrinkles in the fabric of US TLAs counter-terrorist/something/etc activities, but instead it simply created another layer of bureaucracy as a source of funding for weaponization of local police forces, while cementing the distance between the investigative security branches. Those that, smelling the consolidating rat, now are even more fiercely independent than before.

Any particular (spectacular?) such “encryption begging” by the NSA that you’d care to refer to?

Jim February 26, 2016 3:17 PM

@ David Leppik

Well said.

I have some small points, and I’m not sure how much they matter in the big picture.

  1. Apple designed the iOS update process such that the updates must be encrypted for each specific phone, with the public key of each individual phone. And the encrypted blob must be signed by the Apple private key. (around page 5 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf)
  2. I think what the FBI is asking for is for Apple to remove the “delete Secure Enclave password after 10 tries” feature, and the wait-X-seconds-between-tries feature. (around page 50)
  3. The FBI will still have to brute force the password to the Secure Enclave, but that will be relatively trivial if the password is just 4 numbers. If the device password was a real high-entropy passphrase, the task would not be trivial.

So, I think the FBI is really asking for the ability to brute force the password.
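To put rough numbers on this (a back-of-the-envelope sketch, assuming ~80 ms of key-derivation work per guess, roughly the per-try cost the iOS Security Guide describes; treat the figure as illustrative, not authoritative):

```python
# Worst-case brute-force times under an assumed ~80 ms per guess.
SECONDS_PER_GUESS = 0.08
SECONDS_PER_YEAR = 3.15e7

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to try every possible passcode of the given shape."""
    return alphabet_size ** length * SECONDS_PER_GUESS

four_digit_pin = worst_case_seconds(10, 4)    # trivial: ~13 minutes
long_passphrase = worst_case_seconds(36, 10)  # 10 chars of [a-z0-9]

print(f"4-digit PIN:      {four_digit_pin / 60:.0f} minutes worst case")
print(f"10-char [a-z0-9]: {long_passphrase / SECONDS_PER_YEAR:.1e} years worst case")
```

The 4-digit case falls in minutes; the 10-character case runs to millions of years, which is why the passcode, not the cipher, is the real target.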

If the surveillance target used one of the non-Apple messaging apps that require the user to manage the public/private keys, like Threema, and that app was developed using best-practice crypto, then I think the FBI/NSA/GCHQ will still not be able to brute force that app’s data before the heat death of the universe.

Personally, I think a legitimate government that is “of the people and for the people” will need a method to search digital data, like they can search a house… if they have a legitimate court order. It’s interesting that there is no door or lock they can’t break, but there is crypto that is truly unbreakable.

Math is free, and once let out of Pandora’s Box, it can’t be taken back. https://twitter.com/tweetnacl.

Niko February 26, 2016 11:34 PM

@Clive

The other option is to accept that hacks are going to happen and focus on ways of detecting hacks quickly (intrusion detection), minimizing the damage that they cause (preventing cascading failures, human in the loop), and recovering quickly (backup systems). Saying that Apple/Microsoft/Google products have vulnerabilities and hacks are inevitable, as implied in the article above, seems like too easy an out for the power companies to “pass the buck” on liability and responsibility for securing their own systems.

I also wonder if Bruce has changed his mind any. In 2009, https://www.schneier.com/blog/archives/2009/09/the_exaggerated.html , he claimed that the risks of cyberterrorism and cyberwarfare are greatly exaggerated. Attacks on the operations of critical infrastructure (e.g., taking down the power grid) would seem to fall into one or both of those categories.

Mark Mayer February 28, 2016 1:52 PM

@all
The following, mostly written as statements, is more of a question for you all. These are some assumptions and what I think follows from those assumptions, and the overall question is “do I have this right, or is there a mistake in my assumptions or my logic?” As many of you know, I claim layman status. I have not studied mathematics in depth nor have I studied logic in a formal way (beyond basic if/then statements and basic high school geometry proofs). This might make me more of a super-layman, as does the fact that I spend time on this blog, wrestling with concepts some of you throw back and forth with ease. The restated, implied question (made explicit in this paragraph) is “Am I getting this right? If not, would you tell me why not, in terms I can understand?”

Unbreakable vs. Strong Encryption
The reason we prefer the term Strong Encryption to Unbreakable Encryption is that there is no such thing as Unbreakable Encryption. Strong Encryption is effectively unbreakable on a human timescale, given certain assumptions we make about calculating power, current or future. The breakability by brute force of any given encryption scheme can be mathematically determined, with timescales determined by key length. Breaking encryption is mostly a matter of guessing the key: longer keys are better. (This might be obvious to most if not all people who comment here, but it isn’t obvious to the general public.)

Encryption Is Only as Strong as Its Implementation
A weak implementation of strong encryption is effectively weak. A weak implementation is one which can be penetrated to find vulnerabilities to be exploited (the activity commonly known as hacking or cracking). These exploits are used either to extract the key or to bypass the implementation to get access to the unencrypted or decrypted data. Strong implementation seeks to mitigate this. To use an example from a current controversy, Apple’s scheme involves imposing a ten-guess limitation (or the data is destroyed) and setting an interval between guesses to prolong the timescale of brute force attacks. (Digression: a strong implementation should be unbreakable by its inventor, or what’s the point?)

Convenience and Usability
Another issue with implementation is usability. Strong encryption is for naught if it isn’t used, and the general public (hereafter GP) has problems using good passwords. (Indeed, there is a branch of the security field devoted to weak, easily guessable, versus strong, hard to guess, passwords.) A rule of thumb is that long, random passwords are better than short, easy-to-guess ones. Another is that a larger range of symbols for each position is stronger; thus alphanumeric (26 + 10) is stronger than numeric (10), while including all the symbols available on a keyboard is stronger still. An ongoing problem is that the GP is not skilled at memorizing long strings of random digits. Four to six digits seem to be within the GP’s capability. (Some have argued that people should be able to memorize 10 random digits because, in the age before autodial, people commonly memorized 10-digit phone numbers. However, area codes and the first three digits of a phone number were not random. In the time and place I grew up, I was in the 818 area code, as were most of my friends, and among us the sequence 247 was very common as the first three digits after the area code. Such numbers are easier to memorize.)
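To put numbers on the alphabet-size rule of thumb above (a rough sketch that assumes each character is chosen uniformly at random, which real users rarely do):

```python
import math

# Entropy of a random password: length * log2(alphabet size).
# Every extra bit doubles an attacker's worst-case work.
def entropy_bits(alphabet_size: int, length: int) -> float:
    return length * math.log2(alphabet_size)

print(f"6 digits:                 {entropy_bits(10, 6):.1f} bits")
print(f"6 alphanumeric:           {entropy_bits(36, 6):.1f} bits")
print(f"6 from a ~95-symbol board: {entropy_bits(95, 6):.1f} bits")
```

A six-digit PIN yields about 20 bits, while six characters drawn from the full keyboard nearly double that, without being any longer to type.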

Back to our example, Apple’s security system is designed for the GP. They hope that their safeguards (the two mentioned above) make convenient short passwords viable. More recently, they introduced biometric-based keys with a password fallback. Their goal is to make strongly implemented strong encryption easy and convenient to use. This should be the goal of security for the GP. Those who administer and secure networks on a day-to-day basis know how hard it is to train their users to use strong passwords (as well as avoid other attacks).

Enter the FBI
The FBI discovered or learned that Apple’s scheme has a potential vulnerability. Some are claiming that a potential vulnerability is the same as a vulnerability. I can see the distinction, but it looks to me to be mostly one of semantics. The difference is that Apple doesn’t already have an exploit but is being compelled to create one. The vulnerability is this: Apple’s scheme is implemented at a low software level (the firmware), and can be replaced/reprogrammed with alternative code that bypasses the safeguards. This is even true for their newer handsets built with Secure Enclave, because Secure Enclave is programmable.

So we can have vociferous arguments about whether Apple’s claim holds: that its implementation is so good even Apple can’t break it (the holy grail of security). This claim is literally true for the time being. (I take Apple at its word that it doesn’t already have the exploit, not because I naively believe “companies never lie”, but because being caught in such a lie can have severe repercussions for a publicly traded company like Apple: repercussions that are financial, legal, and related to its brand.) That said, only those in possession of Apple’s private signing key can load firmware that bypasses the safeguards. I have seen claims that this fact alone means Apple’s statement that it can’t break its own security is false, but that claim seems absurd. How else could an encrypted phone’s security work without the handset maker holding the private signing key?
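A toy model of that signing gate (every name here is invented, and a keyed hash stands in for a real asymmetric signature such as RSA or ECDSA, which Python’s standard library doesn’t provide; the point is only the control flow):

```python
import hmac, hashlib

# Hypothetical stand-in for the vendor's signing secret.
VENDOR_KEY = b"hypothetical-vendor-signing-key"

def sign(firmware: bytes) -> bytes:
    # Stand-in for the vendor signing a firmware image.
    return hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()

def boot_accepts(firmware: bytes, signature: bytes) -> bool:
    # The handset refuses any image whose signature doesn't verify.
    expected = hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"official firmware image"
tampered = b"safeguard-stripping firmware image"

print(boot_accepts(official, sign(official)))  # True: vendor-signed
print(boot_accepts(tampered, sign(official)))  # False: signature mismatch
```

The gate only proves the vendor signed the image; it says nothing about what the image does, which is exactly why compelled signing is the crux of the dispute.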

@Clive Robinson – this is what you’re talking about regarding making targets of certain Apple engineers, no? In my host country, the criminal cartels have the standard bribery offer of “plata o plomo”, silver or lead. That is, bribe money or a bullet to the head (or torture, torture of family, etc.) It seems to me that while this is true for the engineers that create firmware workaround, it is especially true for any with access to or control of the key signature.

Other TLAs, Other Nation’s Security Services
While the NSA would certainly like to have a backdoor to the iPhone (that is, if they don’t already have one), they probably don’t really need one. According to Michael Hayden, ex-NSA chief, they get everything they need from bulk collection and metadata. Current NSA director Michael Rogers has given his support to the FBI, but hasn’t claimed that encryption represents an existential crisis for the NSA, as did Jim Comey, FBI director. (While I love “Kremlin watching” and guessing what this means vis-à-vis the relationship between rival bureaucracies, I’ll leave that for another time, or maybe a footnote if I get to it.)

I think it is safe to say that China has a system of bulk collection and metadata similar to the NSA’s. I find it plausible that their condition to Apple and others is that they will forgo a backdoor if no one else has one. To ensure compliance, they audit Apple’s handsets. I find this plausible because for the past 30+ years, China has shifted from a totalitarian government to an authoritarian one (with swings of the pendulum between more and less control). This shift occurred because for the past 30+ years, China’s goal has been economic development using the engine of global market capitalism. (We still hear of terrible human rights abuses, but by and large the Chinese GP is no longer subject to ideological “re-education” and strict enforcement of “correct thought”. The Communist Party rules with a looser grip than it did in the years prior to Deng Xiaoping taking control of the party.) Anyway, encryption is good for business, and business is the business of the Chinese government currently. Strenuous mass thought control gets in the way of business, and individual dissidents who get too much public attention trigger a tailored response. Metadata and bulk collection help the government identify emerging dissident threats for targeted surveillance.

Other nation states can work with the U.S. (the five eyes), and potentially with China. Russia has its own thing going, but doesn’t seem to be sufficiently powerful to demand backdoors and weak implementations from multi-national companies. I don’t know as much about the political scene in Russia, other than that the government seems to be a partnership between the former USSR security apparatus and the criminal organizations that thrived under the USSR. Basically the same set up, minus socialist ideology.

Is the U.S. Government the Enemy?
The foundation of government is the Social Contract, and that contract at its most basic is “we shall create a sovereign entity that we shall endow with enough power to keep us from murdering, raping, and pillaging one another”. This is the basic purpose of government, to protect us from each other, whether the other is a neighbor/countryman or an outsider. Another word for the conditions created by the contract is “orderly society”. You already know this if you’ve read your Hobbes.

In practice it doesn’t always work that way. When a government concentrates too much power, we see a tendency (most probably a rule) for the government to be the perpetrator of the murder, rape, and pillage. The U.S. government and other democratic constitutional governments can be viewed as an attempt to protect us from our elected sovereign, to limit its power. It’s a balancing act. Too little power, and the government cannot create an orderly society.

Any student of U.S. government knows that the U.S. government is not a monolithic power. Obviously, power is divided between the three branches of the Executive (implementation of rules and laws, etc), the Legislature (creator of rules and laws), and the Judiciary (interpreter of rules and laws). But, some will argue, the Executive is a monolith because it is controlled by a single leader, the President. I would argue that the control is rather weak because the executive is broken up into competing bureaucracies that at least compete for funding and sometimes have deeper rivalries because of overlapping missions.

A student of organizations will know that organizations, such as those departments and agencies in the Executive, constitute artificial living organisms in their own right. They seek to survive, even thrive, according to their own logic. To do this, they accumulate power. To accumulate power, they fight one another, have turf wars, etc. Sometimes they involve the other two branches as allies or adversaries.

I need to cut to the chase. All of this is a long-winded way to say that certain parts of the government can be adversarial, but not all of it and not all the time. And while bureaucracies follow their own logic, they are staffed and led by people who mostly are well intentioned (even as they compete and jockey for position within their agencies and with other agencies). James Comey believes he has our best interests at heart. He thinks he is doing the right thing even as he engages in oppression. This might not be true, and Jimbo might be a thoroughly evil bottom-feeding scum who has put his ambitions above all of us. But for the sake of debate, let’s give him the benefit of the doubt.

Moderate Language
“For the sake of debate” is exactly why Tim Cook makes a point of not excoriating the government and not accusing the FBI of evil intent. Both the public and those elected by the public are persuadable. James Comey might not be persuadable, but that portion of the public that thinks like Comey, that identifies with him, or that has been swayed by his emotional and fear-based arguments might be open to persuasion. Probably this is also why you don’t see Bruce Schneier engage in anti-government diatribes.

You don’t persuade someone by attacking them, by calling them names. Nor do you persuade them by attacking a person, institution, or ethos with whom they identify. That’s attacking their identity, same as attacking them personally, possibly worse.

There is also a large part of the public that might have complaints about government, but that are less likely to be persuaded by vitriolic attacks. “Reasonable” people are sensitive to trigger words that indicate irrational hate or other craziness. They might still be swayed by irrational and emotional arguments, but they will walk away from words that seem too angry. (Setting aside Donald Trump and other politicians on the campaign trail.)

Well, I went beyond what I planned to write, but I thought I should address some of the comments above without calling anyone out. Comments and criticism are appreciated. Thanks!

Mark Mayer February 28, 2016 2:14 PM

@Jim
Re:2
We don’t know how Apple implemented the features, how convoluted the code might be, but, if they used the most straightforward method, one wouldn’t delete the code. One would merely change values in those lines, raising the number of tries before erasure to a suitably high number and setting the time between tries to 0. No?

That seems almost too simple, which makes me think they did something devious to complicate making changes. But sometimes the most obvious answer is the correct one.

@oldmanc
Apple already has two phones like that, one with long passwords and one with short passwords. It just so happens that both of these phones are one and the same. You can choose which phone you have by changing the password setting. Apple doesn’t need to market two different phones with two different security levels, because the choice can be configured in software and it’s better to let consumers choose which they want. It’s also better to give them the chance to change their minds.

I imagine that Apple chose a short default key length for the general public, the lowest common denominator. Perhaps they studied which key length people will actually use. Note that they changed the default from 4 to 6. In general, I think it’s probable that those with greater security needs, or unwilling to trade security for convenience, will set longer passwords.

Personally, I’m glad that iOS lets me change the key length of my passcode. As a result of this latest controversy, I increased my key length by 3 digits.

Buck February 28, 2016 3:31 PM

@Mark Mayer

one wouldn’t delete the code. One would merely change values in those lines, raising the number of tries before erasure to a suitably high number

Sounds more risky to me… What if the number wasn’t high enough? Or the brute-forcer has a subtle bug that causes it to try ‘0000’ over and over again..? An integer overflow somewhere that would never have been detected if this firmware hadn’t been made?

I would think changing the number of tries from 10 to some very large number is not something that has ever been evaluated through existing QA procedures. If it were me, I’d feel much more comfortable deleting the body of the if(n>=10) — specifically the line that makes a call to the delete function. Maybe also delete that delete function definition for good measure. 😉
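In toy form (purely hypothetical; Apple’s firmware is not public, so every name and number here is invented), the destructive path in question looks like this:

```python
# Hypothetical sketch of the retry-limit logic under discussion.
MAX_TRIES = 10

class Handset:
    def __init__(self):
        self.failed_tries = 0
        self.wiped = False

    def handle_failed_attempt(self):
        self.failed_tries += 1
        if self.failed_tries >= MAX_TRIES:
            # Deleting just this one call (rather than inflating
            # MAX_TRIES and hoping no counter misbehaves) disables
            # erasure outright.
            self.wiped = True  # stand-in for erasing key material

phone = Handset()
while not phone.wiped:
    phone.handle_failed_attempt()
print(f"wiped after {phone.failed_tries} failed tries")  # wiped after 10
```

Removing the single destructive call leaves nothing for an overflow or an off-by-one to re-arm, which is the appeal of deletion over reconfiguration.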

Clive Robinson February 28, 2016 11:00 PM

@ Mark Mayer,

It seems to me that while this is true for the engineers that create firmware workaround, it is especially true for any with access to or control of the key signature.

Whilst this is true, the methods used to protect a code-signing key, like the master private key of a CA, are often designed so that nobody actually gets to touch or see the key, and therefore there is nobody to bribe or threaten to get it.

Thus the most likely attack vector is that “a backdoor in the backdoor” would be put in so it’s not locked to a single phone.

There are many ways such “backdoor backdoors” could be added; the one I would be tempted to use would be based on the work of Adam Young and Moti Yung, which hides an asymmetric key system in an authentication system.

Thus those putting pressure on the engineer can give him a backdoor that only they can use, and which, like the suspected NSA NOBUS in Dual_EC_DRBG, is not provable by anybody else.

Yoshi February 29, 2016 6:09 PM

It’s very important not to overlook a distinction here that the blog author has overlooked:

Cracking encryption is not the same as engineering and installing a backdoor.

True cryptanalysis involves the science of understanding cryptological techniques and technologies, and thus gaining the ability to recover none, some, or all of the information in the cryptological materials and/or environments.

Software engineering is almost a separate field in some ways.

As a quick analogy, a locksmith does not install a new backdoor in your house if you accidentally lock yourself out. And if law enforcement needs to consult a locksmith, that locksmith doesn’t install a backdoor either. Instead, in both situations, they attempt to find a way to overcome or overwhelm the security that the lock provides. The success is either none, partial, or full. And in the process, they learn about the environment of the lock.

I think it’s quite normal and acceptable for law enforcement including but not limited to the FBI to be enabled and even expected to lawfully employ locksmiths and locksmith technology to get to evidence. Similarly this applies to cryptological issues that aren’t so different.

Again, the FBI could be granted cryptological access without needing to install a “backdoor” in any capacity. Apple doesn’t need protection from cryptanalytical breakins in the name of counterterrorism.

Interestingly, the FBI should be allowed to solicit the NSA to crack into any repository of evidence in whatever form if it’s in the deliberate process of gaining important evidence in the acts of counterterrorism.

The public is widely misinformed and our newscasters aren’t helping much either with their sensationalism.

Meanwhile, as a side note, is anybody concerned about the USA’s recent test launching of an ICBM? In terms of geopolitical stability as a factor in the war against terrorism this seems tragic to me.

Sincerely,
“Yoshi”

Carlindo Hugueney March 10, 2016 6:15 PM

Any chain is as weak as its weakest link. So what is the point of PGP using 2048 bit key pairs and 100+ character long passphrases for encryption key exchange while the actual message encryption is done with 256 bit keys? Please comment on that.
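For scale, NIST’s published equivalences put these key sizes on a common footing (ballpark figures from SP 800-57; a sketch, not a definitive ranking):

```python
# Approximate symmetric-equivalent strengths per NIST SP 800-57 Part 1.
STRENGTH_BITS = {
    "RSA-2048 key exchange": 112,
    "RSA-3072 key exchange": 128,
    "AES-256 message encryption": 256,
}

weakest = min(STRENGTH_BITS, key=STRENGTH_BITS.get)
for name, bits in STRENGTH_BITS.items():
    print(f"{name}: ~{bits}-bit security")
print("weakest link:", weakest)  # the asymmetric exchange, not AES-256
```

On those figures, the 2048-bit key exchange, not the 256-bit message cipher, is the weaker link in the chain.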
