Entries Tagged "crypto wars"

Australia Considering New Law Weakening Encryption

News from Australia:

Under the law, internet companies would have the same obligations telephone companies do to help law enforcement agencies, Prime Minister Malcolm Turnbull said. Law enforcement agencies would need warrants to access the communications.

“We’ve got a real problem in that the law enforcement agencies are increasingly unable to find out what terrorists and drug traffickers and pedophile rings are up to because of the very high levels of encryption,” Turnbull told reporters.

“Where we can compel it, we will, but we will need the cooperation from the tech companies,” he added.

Never mind that the law 1) would not achieve the desired results because all the smart “terrorists and drug traffickers and pedophile rings” will simply use a third-party encryption app, and 2) would make everyone else in Australia less secure. But that’s all ground I’ve covered before.

I found this bit amusing:

Asked whether the laws of mathematics behind encryption would trump any new legislation, Mr Turnbull said: “The laws of Australia prevail in Australia, I can assure you of that.

“The laws of mathematics are very commendable but the only law that applies in Australia is the law of Australia.”

Next Turnbull is going to try to legislate that pi = 3.2.

Another article. BoingBoing post.

EDITED TO ADD: More commentary.

Posted on July 17, 2017 at 6:29 AM

Encryption Policy and Freedom of the Press

Interesting law journal article: “Encryption and the Press Clause,” by D. Victoria Baranetsky.

Abstract: Almost twenty years ago, a hostile debate over whether government could regulate encryption—later named the Crypto Wars—seized the country. At the center of this debate stirred one simple question: is encryption protected speech? This issue touched all branches of government, percolating from Congress, to the President, and eventually to the federal courts. In a waterfall of cases, several United States Courts of Appeals appeared to reach a consensus that encryption was protected speech under the First Amendment, and with that the Crypto Wars appeared to be over, until now.

Nearly twenty years later, the Crypto Wars have returned. Following recent mass shootings, law enforcement has once again questioned the legal protection for encryption and tried to implement “backdoor” techniques to access messages sent over encrypted channels. In the case, Apple v. FBI, the agency tried to compel Apple to grant access to the iPhone of a San Bernardino shooter. The case was never decided, but the legal arguments briefed before the court were essentially the same as they were two decades prior. Apple and amici supporting the company argued that encryption was protected speech.

While these arguments remain convincing, circumstances have changed in ways that should be reflected in the legal doctrines that lawyers use. Unlike twenty years ago, today surveillance is ubiquitous, and the need for encryption is no longer felt by a select few. Encryption has become necessary for even the most basic exchange of information given that most Americans share “nearly every aspect of their lives—from the mundane to the intimate” over the Internet, as stated in a recent Supreme Court opinion.

Given these developments, lawyers might consider a new justification under the Press Clause. In addition to the many doctrinal concerns that exist with protection under the Speech Clause, the Press Clause is normatively and descriptively more accurate at protecting encryption as a tool for secure communication without fear of government surveillance. This Article outlines that framework by examining the historical and theoretical transformation of the Press Clause since its inception.

Edited to Add (4/12): Follow-up article.

Posted on April 4, 2017 at 2:14 PM

More on the Going Dark Debate

Lawfare is turning out to be the go-to blog for policy wonks about various government debates on cybersecurity. There are two good posts this week on the Going Dark debate.

The first is from those of us who wrote the “Keys Under Doormats” paper last year, criticizing the concept of backdoors and key escrow. We were responding to a half-baked proposal on how to give the government access without causing widespread insecurity, and we pointed out where almost all of these sorts of proposals fall short (a toy sketch of the first point follows the list):

1. Watch for systems that rely on a single powerful key or a small set of them.

2. Watch for systems using high-value keys over and over and still claiming not to increase risk.

3. Watch for the claim that the abstract algorithm alone is the measure of system security.

4. Watch for the assumption that scaling anything on the global Internet is easy.

5. Watch for the assumption that national borders are not a factor.

6. Watch for the assumption that human rights and the rule of law prevail throughout the world.
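Here is a minimal toy sketch of that first warning sign (hypothetical scheme and names, in Python, with deliberately simplified cryptography): once every session key is escrowed under one master key, whoever holds that key—or steals it—can read all traffic from all users.

```python
import hashlib
import secrets

# One master key, held by the escrow agent. Every message from every
# user carries its session key wrapped under this single key, so a
# single compromise exposes all traffic, past and future.
MASTER_KEY = secrets.token_bytes(32)

def xor_stream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy SHA-256 keystream cipher -- for illustration only, not secure."""
    out, counter = b"", 0
    while len(out) < len(data):
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def send(plaintext: bytes):
    session_key = secrets.token_bytes(32)
    nonce = secrets.token_bytes(12)
    ciphertext = xor_stream(session_key, nonce, plaintext)
    wrapped = xor_stream(MASTER_KEY, nonce, session_key)  # the escrow copy
    return nonce, wrapped, ciphertext

def escrow_decrypt(nonce: bytes, wrapped: bytes, ciphertext: bytes) -> bytes:
    # The escrow agent (or anyone who has stolen MASTER_KEY) recovers the
    # session key, and with it the message -- for every message ever sent.
    session_key = xor_stream(MASTER_KEY, nonce, wrapped)
    return xor_stream(session_key, nonce, ciphertext)

nonce, wrapped, ciphertext = send(b"attack at dawn")
assert escrow_decrypt(nonce, wrapped, ciphertext) == b"attack at dawn"
```

Note that the users’ own encryption and decryption never touch MASTER_KEY; the wrapped copy exists solely for third-party access. That is also warning sign #2 in miniature: one high-value key used over and over, while the system claims no added risk.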

The second is by Susan Landau, and is a response to the ODNI’s response to the “Don’t Panic” report. Our original report said basically that the FBI wasn’t going dark and that surveillance information is everywhere. At a Senate hearing, Sen. Wyden requested that the Office of the Director of National Intelligence respond to the report. It did—not very well, honestly—and Landau responded to that response. She pointed out that there really wasn’t much disagreement: the points it claimed to take issue with were actually points we made and agreed with.

In the end, the ODNI’s response to our report leaves me somewhat confused. The reality is that the only strong disagreement seems to be with an exaggerated view of one finding. It almost appears as if ODNI is using the Harvard report as an opportunity to say, “Widespread use of encryption will make our work life more difficult.” Of course it will. Widespread use of encryption will also help prevent some of the cybersecurity exploits and attacks we have been experiencing over the last decade. The ODNI letter ignored that issue.

EDITED TO ADD: Related is this article where James Comey defends spending $1M+ on that iPhone vulnerability. There’s some good discussion of the vulnerabilities equities process, and of the FBI’s lack of technical sophistication.

Posted on May 13, 2016 at 6:55 AM

The Importance of Strong Encryption to Security

Encryption keeps you safe. Encryption protects your financial details and passwords when you bank online. It protects your cell phone conversations from eavesdroppers. If you encrypt your laptop—and I hope you do—it protects your data if your computer is stolen. It protects our money and our privacy.

Encryption protects the identity of dissidents all over the world. It’s a vital tool to allow journalists to communicate securely with their sources, NGOs to protect their work in repressive countries, and lawyers to communicate privately with their clients. It protects our vital infrastructure: our communications network, the power grid and everything else. And as we move to the Internet of Things with its cars and thermostats and medical devices, all of which can destroy life and property if hacked and misused, encryption will become even more critical to our security.

Security is more than encryption, of course. But encryption is a critical component of security. You use strong encryption every day, and our Internet-laced world would be a far riskier place if you didn’t.

Strong encryption means unbreakable encryption. Any weakness in encryption will be exploited—by hackers, by criminals and by foreign governments. Many of the hacks that make the news can be attributed to weak or—even worse—nonexistent encryption.

The FBI wants the ability to bypass encryption in the course of criminal investigations. This is known as a “backdoor,” because it’s a way to get at the encrypted information that bypasses the normal encryption mechanisms. I am sympathetic to such claims, but as a technologist I can tell you that there is no way to give the FBI that capability without weakening the encryption against all adversaries. This is crucial to understand. I can’t build an access technology that only works with proper legal authorization, or only for people with a particular citizenship or the proper morality. The technology just doesn’t work that way.

If a backdoor exists, then anyone can exploit it. All it takes is knowledge of the backdoor and the capability to exploit it. And while it might temporarily be a secret, it’s a fragile secret. Backdoors are how everyone attacks computer systems.

This means that if the FBI can eavesdrop on your conversations or get into your computers without your consent, so can cybercriminals. So can the Chinese. So can terrorists. You might not care if the Chinese government is inside your computer, but lots of dissidents do. As do the many Americans who use computers to administer our critical infrastructure. Backdoors weaken us against all sorts of threats.

Either we build encryption systems to keep everyone secure, or we build them to leave everybody vulnerable.

Even a highly sophisticated backdoor that could only be exploited by nations like the United States and China today will leave us vulnerable to cybercriminals tomorrow. That’s just the way technology works: things become easier, cheaper, more widely accessible. Give the FBI the ability to hack into a cell phone today, and tomorrow you’ll hear reports that a criminal group used that same ability to hack into our power grid.

The FBI paints this as a trade-off between security and privacy. It’s not. It’s a trade-off between more security and less security. Our national security needs strong encryption. I wish I could give the good guys the access they want without also giving the bad guys access, but I can’t. If the FBI gets its way and forces companies to weaken encryption, all of us—our data, our networks, our infrastructure, our society—will be at risk.

This essay previously appeared in the New York Times “Room for Debate” blog. It’s something I seem to need to say again and again.

Posted on February 25, 2016 at 6:40 AM

Judge Demands that Apple Backdoor an iPhone

A judge has ordered that Apple bypass iPhone security in order for the FBI to attempt a brute-force password attack on an iPhone 5c used by one of the San Bernardino killers. Apple is refusing.

The order is pretty specific technically. This implies to me that what the FBI is asking for is technically possible, and even that Apple assisted in the wording so that the case could be about the legal issues and not the technical ones.
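The obstacle to brute force is not the passcode keyspace itself but iOS’s guessing protections, which is presumably what the order asks Apple to disable. A back-of-the-envelope sketch (Python; the roughly 80 ms per attempt comes from Apple’s public iOS security documentation, and the keyspace choices are my own illustration):

```python
# Worst-case time to try every passcode, assuming each attempt costs
# ~80 ms of hardware-entangled key derivation (Apple's published figure)
# and that the escalating delays and the 10-try auto-erase option are
# disabled -- which is what the FBI is asking for.
PER_ATTEMPT_SECONDS = 0.08

def worst_case(label: str, keyspace: int) -> None:
    hours = keyspace * PER_ATTEMPT_SECONDS / 3600
    print(f"{label}: {keyspace:>13,} codes, {hours:>10,.1f} hours")

worst_case("4-digit PIN        ", 10**4)   # ~0.2 hours (about 13 minutes)
worst_case("6-digit PIN        ", 10**6)   # ~22 hours
worst_case("6-char alphanumeric", 36**6)   # ~48,000 hours (over 5 years)
```

With the protections in place, ten wrong guesses can erase the phone entirely; without them, a 4-digit passcode falls in minutes. That’s why the FBI wants the software changed rather than trying to break the cryptography itself.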

From Apple’s statement about its refusal:

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks—from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers—including tens of millions of American citizens—from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

Congressman Ted Lieu comments.

Here’s an interesting essay about why Tim Cook and Apple are such champions for encryption and privacy.

Today I walked by a television showing CNN. The sound was off, but I saw an aerial scene which I presume was from San Bernardino, and the words “Apple privacy vs. national security.” If that’s the framing, we lose. I would have preferred to see “National security vs. FBI access.”

Slashdot thread.

EDITED TO ADD (2/18): Good analysis of Apple’s case. Interesting debate. Nicholas Weaver’s comments. And commentary from some other planet.

EDITED TO ADD (2/19): Ben Adida comments:

What’s probably happening is that the FBI is using this as a test case for the general principle that they should be able to compel tech companies to assist in police investigations. And that’s pretty smart, because it’s a pretty good test case: Apple obviously wants to help prevent terrorist attacks, so they’re left to argue the slippery slope argument in the face of an FBI investigation of a known terrorist. Well done, FBI, well done.

And Julian Sanchez’s comments. His conclusion:

These, then, are the high stakes of Apple’s resistance to the FBI’s order: not whether the federal government can read one dead terrorism suspect’s phone, but whether technology companies can be conscripted to undermine global trust in our computing devices. That’s a staggeringly high price to pay for any investigation.

A New York Times editorial.

Also, two questions: One, what do we know about Apple’s assistance in the past, and why is this case different? Two, has anyone speculated on how much this will cost Apple? The FBI is demanding that Apple give them free engineering work. What’s the value of that work?

EDITED TO ADD (2/20): Jonathan Zdziarski writes on the differences between the FBI compelling someone to provide a service versus build a tool, and why the latter will 1) be difficult and expensive, 2) get out into the wild, and 3) set a dangerous precedent.

This answers my first question, above:

For years, the government could come to Apple with a subpoena and a phone, and have the manufacturer provide a disk image of the device. This largely worked because Apple didn’t have to hack into their phones to do this. Up until iOS 8, the encryption Apple chose to use in their design was easily reversible when you had code execution on the phone (which Apple does). So all through iOS 7, Apple only needed to insert the key into the safe and provide FBI with a copy of the data.

EFF wrote a good technical explainer on the case. My only complaint is with the last section. I have heard directly from Apple that this technique still works on current model phones using the current iOS version.

I am still stunned by how good a case the FBI chose for pushing this. They have all the sympathy in the media that they could hope for.

EDITED TO ADD (2/20): Tim Cook as privacy advocate. How the back door works on modern iPhones. Why the average American should care. The grugq on what this all means.

EDITED TO ADD (2/22): I wrote an op ed for the Washington Post.

Posted on February 17, 2016 at 2:15 PM

Worldwide Encryption Products Survey

Today I released my worldwide survey of encryption products.

The findings of this survey identified 619 entities that sell encryption products. Of those, 412, or two-thirds, are outside the U.S., calling into question the efficacy of any US mandates forcing backdoors for law-enforcement access. It also showed that anyone who wants to avoid US surveillance has over 567 competing products to choose from. These foreign products offer a wide variety of secure applications—voice encryption, text message encryption, file encryption, network-traffic encryption, anonymous currency—providing the same levels of security as US products do today.

Details:

  • There are at least 865 hardware or software products incorporating encryption from 55 different countries. This includes 546 encryption products from outside the US, representing two-thirds of the total.
  • The most common non-US country for encryption products is Germany, with 112 products. This is followed by the United Kingdom, Canada, France, and Sweden, in that order.
  • The five most common countries for encryption products—including the US—account for two-thirds of the total. But smaller countries like Algeria, Argentina, Belize, the British Virgin Islands, Chile, Cyprus, Estonia, Iraq, Malaysia, St. Kitts and Nevis, Tanzania, and Thailand each produce at least one encryption product.
  • Of the 546 foreign encryption products we found, 56% are available for sale and 44% are free. 66% are proprietary, and 34% are open source. Some for-sale products also have a free version.
  • At least 587 entities—primarily companies—either sell or give away encryption products. Of those, 374, or about two-thirds, are outside the US.
  • Of the 546 foreign encryption products, 47 are file encryption products, 68 e-mail encryption products, 104 message encryption products, 35 voice encryption products, and 61 virtual private networking products.

The report is here, here, and here. The data, in Excel form, is here.

Press articles are starting to come in. (Here are the previous blog posts on the effort.)

I know the database is incomplete, and I know there are errors. I welcome both additions and corrections, and will be releasing a 1.1 version of this survey in a few weeks.

EDITED TO ADD (2/13): More news.

Posted on February 11, 2016 at 11:05 AM

UK Government Promoting Backdoor-Enabled Voice Encryption Protocol

The UK government is pushing something called the MIKEY-SAKKE protocol to secure voice. Basically, it’s an identity-based system that necessarily requires a trusted key-distribution center. So key escrow is inherently built in, and there’s no perfect forward secrecy. The only reasonable explanation for designing a protocol with these properties is third-party eavesdropping.
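A toy model of that trust structure (Python; hypothetical names, and it mimics only the escrow relationship, not the actual SAKKE pairing math): the key-management server derives every user’s private key from one master secret, so it can recompute any user’s key—and therefore decrypt any recorded call—at any later time.

```python
import hashlib
import hmac
import secrets

# The key management server's master secret. In an identity-based
# system, every user's key is a deterministic function of this secret
# and the user's identity -- key escrow by construction.
KMS_MASTER_SECRET = secrets.token_bytes(32)

def user_key(identity: str) -> bytes:
    # Handed to the user at provisioning time, but nothing stops the
    # server (or anyone who obtains the master secret) from recomputing it.
    return hmac.new(KMS_MASTER_SECRET, identity.encode(), hashlib.sha256).digest()

alice_key = user_key("alice@example.gov")  # provisioned once, static

# ... Alice encrypts calls under keys derived from alice_key ...

# Because the key is static and re-derivable, anyone who records the
# ciphertext and later obtains the master secret (or compels the KMS)
# can read it retroactively: there is no forward secrecy.
assert user_key("alice@example.gov") == alice_key
```

Contrast this with protocols that run an ephemeral key exchange per call: the session key is discarded afterward, and no stored secret can reproduce it.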

Steven Murdoch has explained the details. The upshot:

The design of MIKEY-SAKKE is motivated by the desire to allow undetectable and unauditable mass surveillance, which may be a requirement in exceptional scenarios such as within government departments processing classified information. However, in the vast majority of cases the properties that MIKEY-SAKKE offers are actively harmful for security. It creates a vulnerable single point of failure, which would require huge effort, skill and cost to secure—requiring resource beyond the capability of most companies. Better options for voice encryption exist today, though they are not perfect either. In particular, more work is needed on providing scalable and usable protection against man-in-the-middle attacks, and protection of metadata for contact discovery and calls. More broadly, designers of protocols and systems need to appreciate the ethical consequences of their actions in terms of the political and power structures which naturally follow from their use. MIKEY-SAKKE is the latest example to raise questions over the policy of many governments, including the UK, to put intelligence agencies in charge of protecting companies and individuals from spying, given the conflict of interest it creates.

And GCHQ previously rejected a more secure standard, MIKEY-IBAKE, because it didn’t allow undetectable spying.

Both the NSA and GCHQ repeatedly choose surveillance over security. We need to reject that decision.

Posted on January 22, 2016 at 2:23 PM