Entries Tagged "crypto wars"

History of the First Crypto War

As we’re all gearing up to fight the Second Crypto War over governments’ demands to be able to back-door any cryptographic system, it pays for us to remember the history of the First Crypto War. The Open Technology Institute has written the story of those years in the mid-1990s.

The act that truly launched the Crypto Wars was the White House’s introduction of the “Clipper Chip” in 1993. The Clipper Chip was a state-of-the-art microchip, developed by government engineers, that could be inserted into consumer telephones, providing the public with strong cryptographic tools without sacrificing the ability of law enforcement and intelligence agencies to access unencrypted versions of those communications. The technology relied on a system of “key escrow,” in which a copy of each chip’s unique encryption key would be stored by the government. Although White House officials mobilized both political and technical allies in support of the proposal, it faced immediate backlash from technical experts, privacy advocates, and industry leaders, who were concerned about the security and economic impact of the technology, in addition to the obvious civil liberties concerns. As the battle wore on throughout 1993 and into 1994, leaders from across the political spectrum joined the fray, supported by a broad coalition that opposed the Clipper Chip. When computer scientist Matt Blaze discovered a flaw in the system in May 1994, it proved to be the final death blow: the Clipper Chip was dead.
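For the technically curious, Blaze’s flaw is easy to illustrate. The Law Enforcement Access Field (LEAF) that each Clipper device transmitted was protected by only a 16-bit checksum, so a rogue user could brute-force a bogus LEAF that still verified, keeping the strong encryption while cutting the escrow system out of the loop. Here is a toy sketch of why 16 bits is far too short; the checksum function below is a hypothetical stand-in, not the classified original:

```python
import os

# Toy model: a 16-bit integrity check falls to random search.
# (The real LEAF internals were classified; this hypothetical
# checksum just shows the scale of a 2**16 search.)

def checksum16(leaf: bytes) -> int:
    """Hypothetical stand-in for the LEAF's 16-bit checksum."""
    h = 0
    for byte in leaf:
        h = (h * 31 + byte) & 0xFFFF
    return h

target = checksum16(os.urandom(16))  # the value the verifier expects

tries = 0
while True:
    tries += 1
    forged = os.urandom(16)          # bogus LEAF carrying no escrowed key
    if checksum16(forged) == target:
        break

print(f"forged a verifying LEAF after {tries} tries (expected ~65,536)")
```

A 2^16 search was trivial even on 1994 hardware, which is why the discovery was fatal to the design.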

Nonetheless, the idea that the government could find a palatable way to access the keys to encrypted communications lived on throughout the 1990s. Many policymakers held onto hopes that it was possible to securely implement what they called “software key escrow” to preserve access to phone calls, emails, and other communications and storage applications. Under key escrow schemes, a government-certified third party would keep a “key” to every device. But the government’s shift in tactics ultimately proved unsuccessful; the privacy, security, and economic concerns continued to outweigh any potential benefits. By 1997, there was an overwhelming amount of evidence against moving ahead with any key escrow schemes.
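In modern terms, the scheme those proposals described looks something like the following minimal sketch. All names are hypothetical, and Python’s cryptography package stands in for whatever a real scheme would have specified; the structural weakness is immediate, because the escrow agent’s private key can recover every device’s key.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The government-certified third party holds this key pair.
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_public = escrow_private.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Each device generates its own encryption key...
device_key = os.urandom(32)

# ...and must deposit a copy, wrapped under the escrow agent's public key.
escrowed_copy = escrow_public.encrypt(device_key, oaep)

# Later, ideally only under a court order, the agent unwraps the copy.
recovered_key = escrow_private.decrypt(escrowed_copy, oaep)
assert recovered_key == device_key

# Anyone who compromises escrow_private can do the same for every device,
# which is the core security objection that killed these proposals.
```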

The Second Crypto War is going to be harder and nastier, and I am less optimistic that strong cryptography will win in the short term.

Posted on June 22, 2015 at 1:35 PM

Eighth Movie-Plot Threat Contest Semifinalists

On April 1, I announced the Eighth Movie-Plot Threat Contest: demonstrate the evils of encryption.

Not a whole lot of good submissions this year. Possibly this contest has run its course and there isn’t much interest left. On the other hand, it’s heartening to know that there aren’t a lot of encryption movie-plot threats out there.

Anyway, here are the semifinalists.

  1. Child pornographers.
  2. Bombing the NSA.
  3. Torture.
  4. Terrorists and a vaccine.
  5. Election systems.

Cast your vote by number here; voting closes at the end of the month.

Contest.

Previous contests.

Posted on May 14, 2015 at 11:26 PM

The Eighth Movie-Plot Threat Contest

It’s April 1, and time for another Movie-Plot Threat Contest. This year, the theme is Crypto Wars II. Strong encryption is evil, because it prevents the police from solving crimes. (No, really—that’s the argument.) FBI Director James Comey is going to be hard to beat with his heartfelt litany of movie-plot threats:

“We’re drifting toward a place where a whole lot of people are going to be looking at us with tears in their eyes,” Comey argued, “and say ‘What do you mean you can’t? My daughter is missing. You have her phone. What do you mean you can’t tell me who she was texting with before she disappeared?’”

[…]

“I’ve heard tech executives say privacy should be the paramount virtue,” Comey said. “When I hear that, I close my eyes and say, ‘Try to imagine what that world looks like where pedophiles can’t be seen, kidnappers can’t be seen, drug dealers can’t be seen.'”

(More Comey here.)

Come on, Comey. You might be able to scare noobs like Rep. John Carter with that talk, but you’re going to have to do better if you want to win this contest. We heard this same sort of stuff out of then-FBI director Louis Freeh in 1996 and 1997.

This is the contest: I want a movie-plot threat that shows the evils of encryption. (For those who don’t know, a movie-plot threat is a scary-threat story that would make a great movie, but is much too specific to build security policies around. Contest history here.) We’ve long heard about the evils of the Four Horsemen of the Internet Apocalypse—terrorists, drug dealers, kidnappers, and child pornographers. (Or maybe they’re terrorists, pedophiles, drug dealers, and money launderers; I can never remember.) Try to be more original than that. And nothing too science fictional; today’s technology or presumed technology only.

Entries are limited to 500 words—I check—and should be posted in the comments. At the end of the month, I’ll choose five or so semifinalists, and we can all vote and pick the winner.

The prize will be signed copies of the 20th Anniversary Edition of the 2nd Edition of Applied Cryptography, and the 15th Anniversary Edition of Secrets and Lies, both being published by Wiley this year in an attempt to ride the Data and Goliath bandwagon.

Good luck.

Posted on April 1, 2015 at 6:33 AM

David Cameron's Plan to Ban Encryption in the UK

In the wake of the Paris terrorist shootings, David Cameron has said that he wants to ban encryption in the UK. Here’s the quote: “If I am prime minister I will make sure that it is a comprehensive piece of legislation that does not allow terrorists safe space to communicate with each other.”

This is similar to FBI director James Comey’s remarks from last year. And it’s equally stupid.

Cory Doctorow has a good essay on Cameron’s proposal:

For David Cameron’s proposal to work, he will need to stop Britons from installing software that comes from software creators who are out of his jurisdiction. The very best in secure communications are already free/open source projects, maintained by thousands of independent programmers around the world. They are widely available, and thanks to things like cryptographic signing, it is possible to download these packages from any server in the world (not just big ones like Github) and verify, with a very high degree of confidence, that the software you’ve downloaded hasn’t been tampered with.

Cameron is not alone here. The regime he proposes is already in place in countries like Syria, Russia, and Iran (for the record, none of these countries have had much luck with it). There are two means by which authoritarian governments have attempted to restrict the use of secure technology: by network filtering and by technology mandates.
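Doctorow’s point about cryptographic signing is easy to make concrete. Here is a sketch, not any particular project’s actual release process, of verifying a detached Ed25519 signature on a downloaded package; the function name and key-distribution details are assumptions for illustration:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def verify_download(package: bytes, signature: bytes, pubkey: bytes) -> bool:
    """Return True only if `package` was signed by the key we already trust.

    `pubkey` is the developer's 32-byte Ed25519 public key, pinned from a
    previous release or published through some independent channel.
    """
    try:
        Ed25519PublicKey.from_public_bytes(pubkey).verify(signature, package)
        return True
    except InvalidSignature:
        return False

# Usage sketch: fetch the package and its .sig from *any* mirror, then check.
# if not verify_download(pkg_bytes, sig_bytes, pinned_pubkey):
#     raise SystemExit("tampered or mis-signed download; refusing to install")
```

Neither of the two levers Doctorow lists, network filtering or technology mandates, does anything against this check: it depends only on the math and on a public key obtained out-of-band, not on any server a government can pressure.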

Posted on January 13, 2015 at 2:07 PM

More Crypto Wars II

FBI Director James Comey again called for an end to secure encryption by putting in a backdoor. Here’s his speech:

There is a misconception that building a lawful intercept solution into a system requires a so-called “back door,” one that foreign adversaries and hackers may try to exploit.

But that isn’t true. We aren’t seeking a back-door approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law. We are completely comfortable with court orders and legal process—front doors that provide the evidence and information we need to investigate crime and prevent terrorist attacks.

Cyber adversaries will exploit any vulnerability they find. But it makes more sense to address any security risks by developing intercept solutions during the design phase, rather than resorting to a patchwork solution when law enforcement comes knocking after the fact. And with sophisticated encryption, there might be no solution, leaving the government at a dead end—all in the name of privacy and network security.

I’m not sure why he believes he can have a technological means of access that somehow only works for people of the correct morality with the proper legal documents, but he seems to believe that’s possible. As Jeffrey Vagle and Matt Blaze point out, there’s no technical difference between Comey’s “front door” and a “back door.”

As in all of these sorts of speeches, Comey gave examples of crimes that could have been solved had only the police been able to decrypt the defendant’s phone. Unfortunately, none of the three stories is true. The Intercept tracked down each story, and none of them is actually a case where encryption foiled an investigation, arrest, or conviction:

In the most dramatic case that Comey invoked—the death of a 2-year-old Los Angeles girl—not only was cellphone data a non-issue, but records show the girl’s death could actually have been avoided had government agencies involved in overseeing her and her parents acted on the extensive record they already had before them.

In another case, of a Louisiana sex offender who enticed and then killed a 12-year-old boy, the big break had nothing to do with a phone: The murderer left behind his keys and a trail of muddy footprints, and was stopped nearby after his car ran out of gas.

And in the case of a Sacramento hit-and-run that killed a man and his girlfriend’s four dogs, the driver was arrested in a traffic stop because his car was smashed up, and immediately confessed to involvement in the incident.

[…]

His poor examples, however, were reminiscent of one cited by Ronald T. Hosko, a former assistant director of the FBI’s Criminal Investigative Division, in a widely cited—and thoroughly debunked—Washington Post opinion piece last month.

In that case, the Post was eventually forced to have Hosko rewrite the piece, with the following caveat appended:

Editor’s note: This story incorrectly stated that Apple and Google’s new encryption rules would have hindered law enforcement’s ability to rescue the kidnap victim in Wake Forest, N.C. This is not the case. The piece has been corrected.

Hadn’t Comey found anything better since then? In a question-and-answer session after his speech, Comey both denied trying to use scare stories to make his point—and admitted that he had launched a nationwide search for better ones, to no avail.

This is important. All the FBI talk about “going dark” and losing the ability to solve crimes is absolute bullshit. There is absolutely no evidence, either statistically or even anecdotally, that criminals are going free because of encryption.

So why are we even discussing the possibility of forcing companies to provide insecure encryption to their users and customers?

The EFF points out that companies are protected by law from being required to weaken their own security to make the FBI happy.

Sadly, I don’t think this is going to go away anytime soon.

My first post on these new Crypto Wars is here.

Posted on October 21, 2014 at 6:17 AM

iPhone Encryption and the Return of the Crypto Wars

Last week, Apple announced that it is closing a serious security vulnerability in the iPhone. It used to be that the phone’s encryption only protected a small amount of the data, and Apple had the ability to bypass security on the rest of it.

From now on, all the phone’s data is protected. It can no longer be accessed by criminals, governments, or rogue employees. Access to it can no longer be demanded by totalitarian governments. A user’s iPhone data is now more secure.

To hear US law enforcement respond, you’d think Apple’s move heralded an unstoppable crime wave. See, the FBI had been using that vulnerability to get into people’s iPhones. In the words of cyberlaw professor Orin Kerr, “How is the public interest served by a policy that only thwarts lawful search warrants?”

Ah, but that’s the thing: You can’t build a backdoor that only the good guys can walk through. Encryption protects against cybercriminals, industrial competitors, the Chinese secret police, and the FBI. You’re either vulnerable to eavesdropping by any of them, or you’re secure from eavesdropping by all of them.

Backdoor access built for the good guys is routinely used by the bad guys. In 2005, some unknown group surreptitiously used the lawful-intercept capabilities built into the Greek cell phone system. The same thing happened in Italy in 2006.

In 2010, Chinese hackers subverted an intercept system Google had put into Gmail to comply with US government surveillance requests. Back doors in our cell phone system are currently being exploited by the FBI and unknown others.

This doesn’t stop the FBI and Justice Department from pumping up the fear. Attorney General Eric Holder threatened us with kidnappers and sexual predators.

The former head of the FBI’s criminal investigative division went even further, conjuring up kidnappers who are also sexual predators. And, of course, terrorists.

FBI Director James Comey claimed that Apple’s move allows people to “place themselves beyond the law” and also invoked that now overworked “child kidnapper.” John J. Escalante, chief of detectives for the Chicago Police Department, now holds the title of most hysterical: “Apple will become the phone of choice for the pedophile.”

It’s all bluster. Of the 3,576 major offenses for which warrants were granted for communications interception in 2013, exactly one involved kidnapping. And, more importantly, there’s no evidence that encryption hampers criminal investigations in any serious way. In 2013, encryption foiled the police nine times, up from four in 2012—and the investigations proceeded in some other way.

This is why the FBI’s scare stories tend to wither after public scrutiny. A former FBI assistant director wrote about a kidnapped man who would never have been found without the ability of the FBI to decrypt an iPhone, only to retract the point hours later because it wasn’t true.

We’ve seen this game before. During the crypto wars of the 1990s, FBI Director Louis Freeh and others would repeatedly use the example of mobster John Gotti to illustrate why the ability to tap telephones was so vital. But the Gotti evidence was collected using a room bug, not a telephone tap. And those same scary criminal tropes were trotted out then, too. Back then we called them the Four Horsemen of the Infocalypse: pedophiles, kidnappers, drug dealers, and terrorists. Nothing has changed.

Strong encryption has been around for years. Both Apple’s FileVault and Microsoft’s BitLocker encrypt the data on computer hard drives. PGP encrypts e-mail. Off-the-Record encrypts chat sessions. HTTPS Everywhere encrypts your browsing. Android phones already come with encryption built-in. There are literally thousands of encryption products without back doors for sale, and some have been around for decades. Even if the US bans the stuff, foreign companies will corner the market because many of us have legitimate needs for security.
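The products in that list differ in scope, but the “no back door” property they share is just ordinary symmetric cryptography: decryption requires the key, and there is exactly one key. A minimal sketch using Python’s cryptography package, illustrative only and not how any of those specific products is implemented:

```python
from cryptography.fernet import Fernet, InvalidToken

user_key = Fernet.generate_key()   # exists only on the user's device
ciphertext = Fernet(user_key).encrypt(b"my private notes")

# The key holder can always decrypt...
assert Fernet(user_key).decrypt(ciphertext) == b"my private notes"

# ...and everyone else, FBI or criminal alike, hits the same wall,
# because the design includes no second "lawful access" key.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("wrong key; there is no back door to fall back on")
```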

Law enforcement has been complaining about “going dark” for decades now. In the 1990s, they convinced Congress to pass a law requiring phone companies to ensure that phone calls would remain tappable even as they became digital. They tried and failed to ban strong encryption and mandate back doors for their use. The FBI tried and failed again to ban strong encryption in 2010. Now, in the post-Snowden era, they’re about to try again.

We need to fight this. Strong encryption protects us from a panoply of threats. It protects us from hackers and criminals. It protects our businesses from competitors and foreign spies. It protects people living under totalitarian governments from arrest and detention. This isn’t just me talking: The FBI also recommends you encrypt your data for security.

As for law enforcement? The recent decades have given them an unprecedented ability to put us under surveillance and access our data. Our cell phones provide them with a detailed history of our movements. Our call records, e-mail history, buddy lists, and Facebook pages tell them who we associate with. The hundreds of companies that track us on the Internet tell them what we’re thinking about. Ubiquitous cameras capture our faces everywhere. And most of us back up our iPhone data on iCloud, which the FBI can still get a warrant for. It truly is the golden age of surveillance.

After considering the issue, Orin Kerr rethought his position, looking at this in terms of a technological-legal trade-off. I think he’s right.

Given everything that has made it easier for governments and others to intrude on our private lives, we need both technological security and legal restrictions to restore the traditional balance between government access and our security/privacy. More companies should follow Apple’s lead and make encryption the easy-to-use default. And let’s wait for some actual evidence of harm before we acquiesce to police demands for reduced security.

This essay previously appeared on CNN.com.

EDITED TO ADD (10/6): Three more essays worth reading. As is this one, on all the other ways Apple and the government have to get at your iPhone data.

And a Washington Post editorial manages to say this:

How to resolve this? A police “back door” for all smartphones is undesirable—a back door can and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.

Because a “secure golden key” is completely different from a “back door.”

EDITED TO ADD (10/7): Another essay.

EDITED TO ADD (10/9): Three more essays that are worth reading.

EDITED TO ADD (10/12): Another essay.

Posted on October 6, 2014 at 6:50 AM
