Entries Tagged "backdoors"


The Risks of Mandating Backdoors in Encryption Products

Tuesday, a group of cryptographers and security experts released a major paper outlining the risks of government-mandated back-doors in encryption products: Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications, by Hal Abelson, Ross Anderson, Steve Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter Neumann, Ron Rivest, Jeff Schiller, Bruce Schneier, Michael Specter, and Danny Weitzner.

Abstract: Twenty years ago, law enforcement organizations lobbied to require data and communication services to engineer their products to guarantee law enforcement access to all data. After lengthy debate and vigorous predictions of enforcement channels going dark, these attempts to regulate the emerging Internet were abandoned. In the intervening years, innovation on the Internet flourished, and law enforcement agencies found new and more effective means of accessing vastly larger quantities of data. Today we are again hearing calls for regulation to mandate the provision of exceptional access mechanisms. In this report, a group of computer scientists and security experts, many of whom participated in a 1997 study of these same topics, has convened to explore the likely effects of imposing extraordinary access mandates. We have found that the damage that could be caused by law enforcement exceptional access requirements would be even greater today than it would have been 20 years ago. In the wake of the growing economic and social cost of the fundamental insecurity of today’s Internet environment, any proposals that alter the security dynamics online should be approached with caution. Exceptional access would force Internet system developers to reverse forward secrecy design practices that seek to minimize the impact on user privacy when systems are breached. The complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws. Beyond these and other technical vulnerabilities, the prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law.
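The abstract's point about forward secrecy is worth making concrete. Below is a minimal sketch of the idea, assuming Python's third-party cryptography package (the function name is mine, purely illustrative): session keys are derived from ephemeral key pairs that are discarded after use, so a later compromise of long-term keys reveals nothing about past traffic. Exceptional access would require retaining exactly the material this design throws away.

```python
# A minimal sketch of forward secrecy, assuming the `cryptography` package.
# Each session derives its key from fresh ephemeral X25519 key pairs; once
# the ephemerals are discarded, past session keys cannot be reconstructed.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def new_session_key() -> bytes:
    client_eph = X25519PrivateKey.generate()  # exists only for this session
    server_eph = X25519PrivateKey.generate()  # exists only for this session
    shared = client_eph.exchange(server_eph.public_key())
    # Derive a 256-bit session key; the ephemerals go out of scope here,
    # which is what makes the session key unrecoverable later.
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"session").derive(shared)

# Two sessions yield unrelated keys; no long-term secret links them.
assert new_session_key() != new_session_key()
```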

It’s already had a big impact on the debate. It was mentioned several times during yesterday’s Senate hearing on the issue (see here).

Three blog posts by the authors. Four different news articles, and this analysis of how the New York Times article changed. Also, a New York Times editorial.

EDITED TO ADD (7/9): Peter Swire’s Senate testimony is worth reading.

EDITED TO ADD (7/10): Good article on these new crypto wars.

EDITED TO ADD (7/14): Two rebuttals, neither very convincing.

Posted on July 9, 2015 at 6:31 AM

What is the DoD's Position on Backdoors in Security Systems?

In May, Admiral James A. Winnefeld, Jr., vice-chairman of the Joint Chiefs of Staff, gave an address at the Joint Service Academies Cyber Security Summit at West Point. After he spoke for twenty minutes on the importance of Internet security and a good national defense, I was able to ask him a question (32:42 mark) about security versus surveillance:

Bruce Schneier: I’d like to hear you talk about this need to get beyond signatures and the more robust cyber defense and ask the industry to provide these technologies to make the infrastructure more secure. My question is, the only definition of “us” that makes sense is the world, is everybody. Any technologies that we’ve developed and built will be used by everyone—nation-state and non-nation-state. So anything we do to increase our resilience, infrastructure, and security will naturally make both Admiral Rogers’s intelligence and attack jobs much harder. Are you okay with that?

Admiral James A. Winnefeld: Yes. I think Mike’s okay with that, also. That’s a really, really good question. We call that IGL. Anyone know what IGL stands for? Intel gain-loss. And there’s this constant tension between the operational community and the intelligence community when a military action could cause the loss of a critical intelligence node. We live this every day. In fact, in ancient times, when we were collecting actual signals in the air, we would be on the operational side, “I want to take down that emitter so it’ll make it safer for my airplanes to penetrate the airspace,” and they’re saying, “No, you’ve got to keep that emitter up, because I’m getting all kinds of intelligence from it.” So this is a familiar problem. But I think we all win if our networks are more secure. And I think I would rather live on the side of secure networks and a harder problem for Mike on the intelligence side than very vulnerable networks and an easy problem for Mike. And part of that—it’s not only the right thing to do, but part of that goes to the fact that we are more vulnerable than any other country in the world, on our dependence on cyber. I’m also very confident that Mike has some very clever people working for him. He might actually still be able to get some work done. But it’s an excellent question. It really is.

It’s a good answer, and one firmly on the side of not introducing security vulnerabilities, backdoors, key-escrow systems, or anything that weakens Internet systems. It speaks to what I have seen as a split in the Second Crypto War, between the NSA and the FBI on building secure systems versus building systems with surveillance capabilities.

I have written about this before:

But here’s the problem: technological capabilities cannot distinguish based on morality, nationality, or legality; if the US government is able to use a backdoor in a communications system to spy on its enemies, the Chinese government can use the same backdoor to spy on its dissidents.

Even worse, modern computer technology is inherently democratizing. Today’s NSA secrets become tomorrow’s PhD theses and the next day’s hacker tools. As long as we’re all using the same computers, phones, social networking platforms, and computer networks, a vulnerability that allows us to spy also allows us to be spied upon.

We can’t choose a world where the US gets to spy but China doesn’t, or even a world where governments get to spy and criminals don’t. We need to choose, as a matter of policy, communications systems that are secure for all users, or ones that are vulnerable to all attackers. It’s security or surveillance.

NSA Director Admiral Mike Rogers was in the audience (he spoke earlier), and I saw him nodding at Winnefeld’s answer. Two weeks later, at CyCon in Tallinn, Rogers gave the opening keynote, and he seemed to be saying the opposite.

“Can we create some mechanism where within this legal framework there’s a means to access information that directly relates to the security of our respective nations, even as at the same time we are mindful we have got to protect the rights of our individual citizens?”

[…]

Rogers said a framework to allow law enforcement agencies to gain access to communications is in place within the phone system in the United States and other areas, so “why can’t we create a similar kind of framework within the internet and the digital age?”

He added: “I certainly have great respect for those that would argue that the most important thing is to ensure the privacy of our citizens and we shouldn’t allow any means for the government to access information. I would argue that’s not in the nation’s best long term interest, that we’ve got to create some structure that should enable us to do that mindful that it has to be done in a legal way and mindful that it shouldn’t be something arbitrary.”

Does Winnefeld know that Rogers is contradicting him? Can someone ask JCS about this?

Posted on June 24, 2015 at 7:42 AM

History of the First Crypto War

As we’re all gearing up to fight the Second Crypto War over governments’ demands to be able to back-door any cryptographic system, it pays for us to remember the history of the First Crypto War. The Open Technology Institute has written the story of those years in the mid-1990s.

The act that truly launched the Crypto Wars was the White House’s introduction of the “Clipper Chip” in 1993. The Clipper Chip was a state-of-the-art microchip developed by government engineers which could be inserted into consumer hardware telephones, providing the public with strong cryptographic tools without sacrificing the ability of law enforcement and intelligence agencies to access unencrypted versions of those communications. The technology relied on a system of “key escrow,” in which a copy of each chip’s unique encryption key would be stored by the government. Although White House officials mobilized both political and technical allies in support of the proposal, it faced immediate backlash from technical experts, privacy advocates, and industry leaders, who were concerned about the security and economic impact of the technology in addition to obvious civil liberties concerns. As the battle wore on throughout 1993 and into 1994, leaders from across the political spectrum joined the fray, supported by a broad coalition that opposed the Clipper Chip. When computer scientist Matt Blaze discovered a flaw in the system in May 1994, it proved to be the final death blow: the Clipper Chip was dead.

Nonetheless, the idea that the government could find a palatable way to access the keys to encrypted communications lived on throughout the 1990s. Many policymakers held onto hopes that it was possible to securely implement what they called “software key escrow” to preserve access to phone calls, emails, and other communications and storage applications. Under key escrow schemes, a government-certified third party would keep a “key” to every device. But the government’s shift in tactics ultimately proved unsuccessful; the privacy, security, and economic concerns continued to outweigh any potential benefits. By 1997, there was an overwhelming amount of evidence against moving ahead with any key escrow schemes.
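To see why the escrow idea kept failing on security grounds, it helps to look at the mechanics. Here is a minimal sketch of the general scheme, assuming Python's third-party cryptography package; it is hypothetical and deliberately simplified, not the actual Clipper/LEAF format. Every message carries the session key wrapped a second time under the escrow agent's key, so whoever holds, or steals, that one key can read everything.

```python
# A hypothetical, simplified key-escrow scheme (not the real Clipper/LEAF
# format), assuming the `cryptography` package. Note the structural flaw:
# the escrow key is a single point of failure for all traffic.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

ESCROW_KEY = AESGCM.generate_key(bit_length=256)  # held by the third party

def encrypt_with_escrow(plaintext: bytes, recipient_key: bytes) -> dict:
    session_key = AESGCM.generate_key(bit_length=256)
    msg_nonce, r_nonce, e_nonce = os.urandom(12), os.urandom(12), os.urandom(12)
    return {
        "nonce": msg_nonce,
        "ciphertext": AESGCM(session_key).encrypt(msg_nonce, plaintext, None),
        # Session key wrapped for the recipient, as in any hybrid scheme...
        "for_recipient": (r_nonce,
                          AESGCM(recipient_key).encrypt(r_nonce, session_key, None)),
        # ...and wrapped again for the escrow agent: the "access field."
        "for_escrow": (e_nonce,
                       AESGCM(ESCROW_KEY).encrypt(e_nonce, session_key, None)),
    }
```

Anyone who compromises ESCROW_KEY, or the process that guards it, gets every session key at once; that concentration of risk was at the core of the 1990s objections.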

The Second Crypto War is going to be harder and nastier, and I am less optimistic that strong cryptography will win in the short term.

Posted on June 22, 2015 at 1:35 PM

UN Report on the Value of Encryption to Freedom Worldwide

The United Nations’ Office of the High Commissioner released a report on the value of encryption and anonymity to the world:

Summary: In the present report, submitted in accordance with Human Rights Council resolution 25/2, the Special Rapporteur addresses the use of encryption and anonymity in digital communications. Drawing from research on international and national norms and jurisprudence, and the input of States and civil society, the report concludes that encryption and anonymity enable individuals to exercise their rights to freedom of opinion and expression in the digital age and, as such, deserve strong protection.

Here’s the bottom line:

60. States should not restrict encryption and anonymity, which facilitate and often enable the rights to freedom of opinion and expression. Blanket prohibitions fail to be necessary and proportionate. States should avoid all measures that weaken the security that individuals may enjoy online, such as backdoors, weak encryption standards and key escrows. In addition, States should refrain from making the identification of users a condition for access to digital communications and online services and requiring SIM card registration for mobile users. Corporate actors should likewise consider their own policies that restrict encryption and anonymity (including through the use of pseudonyms). Court-ordered decryption, subject to domestic and international law, may only be permissible when it results from transparent and publicly accessible laws applied solely on a targeted, case-by-case basis to individuals (i.e., not to a mass of people) and subject to judicial warrant and the protection of due process rights of individuals.

One news report called this “wishy-washy when it came to government-mandated backdoors to undermine encryption,” but I don’t see that. Government-mandated backdoors, key escrow, and weak encryption are all bad. Corporations should offer their users strong encryption and anonymity. Any systems that still leave corporations with the keys and/or the data—and there are going to be lots of them—should only give them up to the government in the face of an individual and lawful court order.

I think the principles are reasonable.

Posted on May 29, 2015 at 7:49 AM

How the CIA Might Target Apple's Xcode

The Intercept recently posted a story on the CIA’s attempts to hack the iOS operating system. Most interesting was the speculation that it hacked Xcode, which would mean that any apps developed using that tool would be compromised.

The security researchers also claimed they had created a modified version of Apple’s proprietary software development tool, Xcode, which could sneak surveillance backdoors into any apps or programs created using the tool. Xcode, which is distributed by Apple to hundreds of thousands of developers, is used to create apps that are sold through Apple’s App Store.

The modified version of Xcode, the researchers claimed, could enable spies to steal passwords and grab messages on infected devices. Researchers also claimed the modified Xcode could “force all iOS applications to send embedded data to a listening post.” It remains unclear how intelligence agencies would get developers to use the poisoned version of Xcode.

Researchers also claimed they had successfully modified the OS X updater, a program used to deliver updates to laptop and desktop computers, to install a “keylogger.”

It’s a classic application of Ken Thompson’s 1984 paper, “Reflections on Trusting Trust,” and a very nasty attack. Dan Wallach speculates on how this might work.
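To make the trusting-trust idea concrete, here is a toy sketch in Python (hypothetical names and payload, nothing to do with the actual CIA tooling): a poisoned build step that splices a payload into everything it produces, while the developer's own source stays clean.

```python
# A toy illustration of a poisoned build tool, in the spirit of Thompson's
# "Reflections on Trusting Trust." Hypothetical: the payload and file-copy
# "compiler" stand in for whatever a real toolchain compromise would do.
import sys

PAYLOAD = '\nprint("sending data to listening post")  # injected, not in source\n'

def poisoned_build(source_path: str, output_path: str) -> None:
    """Pretend compiler: emits the developer's program plus a hidden payload."""
    with open(source_path) as src:
        program = src.read()
    # The developer audits source_path and sees only their own code;
    # the payload exists only in the built artifact.
    with open(output_path, "w") as out:
        out.write(program + PAYLOAD)

if __name__ == "__main__":
    poisoned_build(sys.argv[1], sys.argv[2])
```

Thompson's deeper point was that the compromise can live in the compiler that compiles the compiler, so even auditing the build tool's source is not enough.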

Posted on March 16, 2015 at 7:38 AM

FREAK: Security Rollback Attack Against SSL

This week, we learned about an attack called “FREAK”—”Factoring Attack on RSA-EXPORT Keys”—that can break the encryption of many websites. Basically, some sites’ implementations of secure sockets layer technology, or SSL, contain both strong encryption algorithms and weak encryption algorithms. Connections are supposed to use the strong algorithms, but in many cases an attacker can force the website to use the weaker encryption algorithms and then decrypt the traffic. From Ars Technica:

In recent days, a scan of more than 14 million websites that support the secure sockets layer or transport layer security protocols found that more than 36 percent of them were vulnerable to the decryption attacks. The exploit takes about seven hours to carry out and costs as little as $100 per site.

This is a general class of attack I call “security rollback” attacks. Basically, the attacker forces the system users to revert to a less secure version of their protocol. Think about the last time you used your credit card. The verification procedure involved the retailer’s computer connecting with the credit card company. What if you snuck around to the back of the building and severed the retailer’s phone lines? Most likely, the retailer would have still accepted your card, but defaulted to making a manual impression of it and maybe looking at your signature. The result: you’ll have a much easier time using a stolen card.

In this case, the security flaw was designed in deliberately. Matthew Green writes:

Back in the early 1990s when SSL was first invented at Netscape Corporation, the United States maintained a rigorous regime of export controls for encryption systems. In order to distribute crypto outside of the U.S., companies were required to deliberately “weaken” the strength of encryption keys. For RSA encryption, this implied a maximum allowed key length of 512 bits.

The 512-bit export grade encryption was a compromise between dumb and dumber. In theory it was designed to ensure that the NSA would have the ability to “access” communications, while allegedly providing crypto that was still “good enough” for commercial use. Or if you prefer modern terms, think of it as the original “golden master key.”

The need to support export-grade ciphers led to some technical challenges. Since U.S. servers needed to support both strong and weak crypto, the SSL designers used a “cipher suite” negotiation mechanism to identify the best cipher both parties could support. In theory this would allow “strong” clients to negotiate “strong” ciphersuites with servers that supported them, while still providing compatibility to the broken foreign clients.

And that’s the problem. The weak algorithms are still there, and can be exploited by attackers.
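To make the exposure concrete, here is a rough sketch of how one might check whether a server still accepts export-grade RSA suites, the precondition for the downgrade. It assumes a Python linked against an OpenSSL old enough to still ship EXPORT ciphers; modern builds have removed them and will refuse the cipher string. Only point it at servers you control.

```python
# A rough probe for export-grade RSA support (the FREAK precondition), not
# the downgrade attack itself. Assumes an old OpenSSL that still includes
# EXPORT cipher suites; current builds will raise an error instead.
import socket
import ssl

def accepts_export_rsa(host: str, port: int = 443) -> bool:
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    # Offer only 512-bit export RSA suites in our ClientHello.
    ctx.set_ciphers("EXPORT")  # raises ssl.SSLError on modern OpenSSL
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True  # handshake succeeded: server took a weak suite
    except (ssl.SSLError, OSError):
        return False  # refused the weak suites (or unreachable)

if __name__ == "__main__":
    print(accepts_export_rsa("test-server.example"))  # placeholder host
```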

Fixes are coming. Companies like Apple are quickly rolling out patches. But the vulnerability has been around for over a decade, and has almost certainly been used by national intelligence agencies and criminals alike.

This is the generic problem with government-mandated backdoors, key escrow, “golden keys,” or whatever you want to call them. We don’t know how to design a third-party access system that checks for morality; once we build in such access, we then have to ensure that only the good guys can do it. And we can’t. Or, to quote the Economist: “…mathematics applies to just and unjust alike; a flaw that can be exploited by Western governments is vulnerable to anyone who finds it.”

This essay previously appeared on the Lawfare blog.

EDITED TO ADD: Microsoft Windows is vulnerable.

Posted on March 6, 2015 at 10:46 AM

"Surreptitiously Weakening Cryptographic Systems"

New paper: “Surreptitiously Weakening Cryptographic Systems,” by Bruce Schneier, Matthew Fredrikson, Tadayoshi Kohno, and Thomas Ristenpart.

Abstract: Revelations over the past couple of years highlight the importance of understanding malicious and surreptitious weakening of cryptographic systems. We provide an overview of this domain, using a number of historical examples to drive development of a weaknesses taxonomy. This allows comparing different approaches to sabotage. We categorize a broader set of potential avenues for weakening systems using this taxonomy, and discuss what future research is needed to provide sabotage-resilient cryptography.
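For a feel of what surreptitious weakening looks like in practice, here is a hypothetical sketch, loosely in the spirit of the historical examples the paper draws on (such as the 2008 Debian OpenSSL bug): a key generator whose output looks random but is derived from a tiny seed space an attacker can brute-force.

```python
# Hypothetical sabotaged key generation: the output passes casual inspection,
# but an attacker who knows the weakness can enumerate all 65,536 keys.
import hashlib
import os

def honest_key() -> bytes:
    return os.urandom(32)  # 256 bits of entropy

def sabotaged_key() -> bytes:
    seed = os.urandom(2)  # only 16 bits actually feed the key
    return hashlib.sha256(seed).digest()  # looks like a normal 256-bit key

def break_sabotaged(key: bytes) -> bytes:
    # The saboteur's advantage: recover the seed by exhaustive search.
    for s in range(2**16):
        guess = s.to_bytes(2, "big")
        if hashlib.sha256(guess).digest() == key:
            return guess
    raise ValueError("not a sabotaged key")

assert break_sabotaged(sabotaged_key())  # succeeds in under a second
```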

EDITED TO ADD (3/3): News article.

Posted on February 25, 2015 at 6:09 AM

More Crypto Wars II

FBI Director James Comey again called for an end to secure encryption by putting in a backdoor. Here’s his speech:

There is a misconception that building a lawful intercept solution into a system requires a so-called “back door,” one that foreign adversaries and hackers may try to exploit.

But that isn’t true. We aren’t seeking a back-door approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law. We are completely comfortable with court orders and legal process—front doors that provide the evidence and information we need to investigate crime and prevent terrorist attacks.

Cyber adversaries will exploit any vulnerability they find. But it makes more sense to address any security risks by developing intercept solutions during the design phase, rather than resorting to a patchwork solution when law enforcement comes knocking after the fact. And with sophisticated encryption, there might be no solution, leaving the government at a dead end—all in the name of privacy and network security.

I’m not sure why he believes he can have a technological means of access that somehow only works for people of the correct morality with the proper legal documents, but he seems to believe that’s possible. As Jeffrey Vagle and Matt Blaze point out, there’s no technical difference between Comey’s “front door” and a “back door.”

As in all of these sorts of speeches, Comey gave examples of crimes that could have been solved had only the police been able to decrypt the defendant’s phone. Unfortunately, none of the three stories is true. The Intercept tracked down each story, and none of them is actually a case where encryption foiled an investigation, arrest, or conviction:

In the most dramatic case that Comey invoked—the death of a 2-year-old Los Angeles girl—not only was cellphone data a non-issue, but records show the girl’s death could actually have been avoided had government agencies involved in overseeing her and her parents acted on the extensive record they already had before them.

In another case, of a Louisiana sex offender who enticed and then killed a 12-year-old boy, the big break had nothing to do with a phone: The murderer left behind his keys and a trail of muddy footprints, and was stopped nearby after his car ran out of gas.

And in the case of a Sacramento hit-and-run that killed a man and his girlfriend’s four dogs, the driver was arrested in a traffic stop because his car was smashed up, and immediately confessed to involvement in the incident.

[…]

His poor examples, however, were reminiscent of one cited by Ronald T. Hosko, a former assistant director of the FBI’s Criminal Investigative Division, in a widely cited—and thoroughly debunked—Washington Post opinion piece last month.

In that case, the Post was eventually forced to have Hosko rewrite the piece, with the following caveat appended:

Editor’s note: This story incorrectly stated that Apple and Google’s new encryption rules would have hindered law enforcement’s ability to rescue the kidnap victim in Wake Forest, N.C. This is not the case. The piece has been corrected.

Hadn’t Comey found anything better since then? In a question-and-answer session after his speech, Comey both denied trying to use scare stories to make his point—and admitted that he had launched a nationwide search for better ones, to no avail.

This is important. All the FBI talk about “going dark” and losing the ability to solve crimes is absolute bullshit. There is absolutely no evidence, either statistically or even anecdotally, that criminals are going free because of encryption.

So why are we even discussing the possibility of forcing companies to provide insecure encryption to their users and customers?

The EFF points out that companies are protected by law from being required to provide insecure security to make the FBI happy.

Sadly, I don’t think this is going to go away anytime soon.

My first post on these new Crypto Wars is here.

Posted on October 21, 2014 at 6:17 AM

iPhone Encryption and the Return of the Crypto Wars

Last week, Apple announced that it is closing a serious security vulnerability in the iPhone. It used to be that the phone’s encryption only protected a small amount of the data, and Apple had the ability to bypass security on the rest of it.

From now on, all the phone’s data is protected. It can no longer be accessed by criminals, governments, or rogue employees. Access to it can no longer be demanded by totalitarian governments. A user’s iPhone data is now more secure.

To hear US law enforcement respond, you’d think Apple’s move heralded an unstoppable crime wave. See, the FBI had been using that vulnerability to get into people’s iPhones. In the words of cyberlaw professor Orin Kerr, “How is the public interest served by a policy that only thwarts lawful search warrants?”

Ah, but that’s the thing: You can’t build a backdoor that only the good guys can walk through. Encryption protects against cybercriminals, industrial competitors, the Chinese secret police and the FBI. You’re either vulnerable to eavesdropping by any of them, or you’re secure from eavesdropping from all of them.

Backdoor access built for the good guys is routinely used by the bad guys. In 2005, some unknown group surreptitiously used the lawful-intercept capabilities built into the Greek cell phone system. The same thing happened in Italy in 2006.

In 2010, Chinese hackers subverted an intercept system Google had put into Gmail to comply with US government surveillance requests. Back doors in our cell phone system are currently being exploited by the FBI and unknown others.

This doesn’t stop the FBI and Justice Department from pumping up the fear. Attorney General Eric Holder threatened us with kidnappers and sexual predators.

The former head of the FBI’s criminal investigative division went even further, conjuring up kidnappers who are also sexual predators. And, of course, terrorists.

FBI Director James Comey claimed that Apple’s move allows people to “place themselves beyond the law” and also invoked that now overworked “child kidnapper.” John J. Escalante, chief of detectives for the Chicago police department, now holds the title of most hysterical: “Apple will become the phone of choice for the pedophile.”

It’s all bluster. Of the 3,576 major offenses for which warrants were granted for communications interception in 2013, exactly one involved kidnapping. And, more importantly, there’s no evidence that encryption hampers criminal investigations in any serious way. In 2013, encryption foiled the police nine times, up from four in 2012—and the investigations proceeded in some other way.

This is why the FBI’s scare stories tend to wither after public scrutiny. A former FBI assistant director wrote about a kidnapped man who would never have been found without the ability of the FBI to decrypt an iPhone, only to retract the point hours later because it wasn’t true.

We’ve seen this game before. During the crypto wars of the 1990s, FBI Director Louis Freeh and others would repeatedly use the example of mobster John Gotti to illustrate why the ability to tap telephones was so vital. But the Gotti evidence was collected using a room bug, not a telephone tap. And those same scary criminal tropes were trotted out then, too. Back then we called them the Four Horsemen of the Infocalypse: pedophiles, kidnappers, drug dealers, and terrorists. Nothing has changed.

Strong encryption has been around for years. Both Apple’s FileVault and Microsoft’s BitLocker encrypt the data on computer hard drives. PGP encrypts e-mail. Off-the-Record encrypts chat sessions. HTTPS Everywhere encrypts your browsing. Android phones already come with encryption built-in. There are literally thousands of encryption products without back doors for sale, and some have been around for decades. Even if the US bans the stuff, foreign companies will corner the market because many of us have legitimate needs for security.

Law enforcement has been complaining about “going dark” for decades now. In the 1990s, they convinced Congress to pass a law requiring phone companies to ensure that phone calls would remain tappable even as they became digital. They tried and failed to ban strong encryption and mandate back doors for their use. The FBI tried and failed again to ban strong encryption in 2010. Now, in the post-Snowden era, they’re about to try again.

We need to fight this. Strong encryption protects us from a panoply of threats. It protects us from hackers and criminals. It protects our businesses from competitors and foreign spies. It protects people living under totalitarian governments from arrest and detention. This isn’t just me talking: The FBI also recommends you encrypt your data for security.

As for law enforcement? The recent decades have given them an unprecedented ability to put us under surveillance and access our data. Our cell phones provide them with a detailed history of our movements. Our call records, e-mail history, buddy lists, and Facebook pages tell them who we associate with. The hundreds of companies that track us on the Internet tell them what we’re thinking about. Ubiquitous cameras capture our faces everywhere. And most of us back up our iPhone data on iCloud, which the FBI can still get a warrant for. It truly is the golden age of surveillance.

After considering the issue, Orin Kerr rethought his position, looking at this in terms of a technological-legal trade-off. I think he’s right.

Given everything that has made it easier for governments and others to intrude on our private lives, we need both technological security and legal restrictions to restore the traditional balance between government access and our security/privacy. More companies should follow Apple’s lead and make encryption the easy-to-use default. And let’s wait for some actual evidence of harm before we acquiesce to police demands for reduced security.

This essay previously appeared on CNN.com.

EDITED TO ADD (10/6): Three more essays worth reading. As is this on all the other ways Apple and the government have to get at your iPhone data.

And a Washington Post editorial manages to say this:

How to resolve this? A police “back door” for all smartphones is undesirable—a back door can and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.

Because a “secure golden key” is completely different from a “back door.”

EDITED TO ADD (10/7): Another essay.

EDITED TO ADD (10/9): Three more essays that are worth reading.

EDITED TO ADD (10/12): Another essay.

Posted on October 6, 2014 at 6:50 AM

Could Keith Alexander's Advice Possibly Be Worth $600K a Month?

Ex-NSA director Keith Alexander has his own consulting company: IronNet Cybersecurity Inc. His advice does not come cheap:

Alexander offered to provide advice to Sifma for $1 million a month, according to two people briefed on the talks. The asking price later dropped to $600,000, the people said, speaking on condition of anonymity because the negotiation was private.

Alexander declined to comment on the details, except to say that his firm will have contracts “in the near future.”

Kenneth Bentsen, Sifma’s president, said at a Bloomberg Government event yesterday in Washington that “cybersecurity is probably our number one priority” now that most regulatory changes imposed after the 2008 credit crisis have been absorbed.

SIFMA is the Securities Industry and Financial Markets Association. Think of how much actual security they could buy with that $600K a month. Unless he’s giving them classified information.

Digby:

But don’t worry, everything Alexander knows will only benefit the average American like you and me. There’s no reason to suspect that he is trading his high level of inside knowledge to benefit a bunch of rich people all around the globe. Because patriotism.

Or, as Recode.net said: “For another million, I’ll show you the back door we put in your router.”

EDITED TO ADD (7/13): Rep. Alan Grayson is suspicious.

Posted on June 24, 2014 at 2:30 PM

