Glenn Greenwald's Encryption Guru
Bruce Schneier says the key to good security is accepting that perfect security doesn’t exist.
Last fall, not long after Bruce Schneier quietly revealed himself as the cryptographer who had helped journalist Glenn Greenwald review Edward Snowden’s NSA documents, he found himself on CNN International, talking about allegations that the United States had spied on the chancellor of Germany.
An exasperated host beamed Schneier in from Minneapolis, where he lives, and asked him to “help us,” as she put it, “decipher this enigma.” Schneier is a legendary encryption specialist who has written or edited 13 books on the subject, and worked for the Department of Defense, telecommunications companies, banks and governments. Most recently, he’s been a vocal advocate of the idea that the best security systems accept a reasonable amount of risk; a blind focus on protecting against every threat, he says, usually comes with unexpected costs.
Outside of the cryptography community, however, this view is not widely held, and the simplicity and directness with which Schneier expresses it tends to take people by surprise. “Sir, as we speak,” the CNN host began, her voice rising, “the U.S. may have tapped into as many as thirty-odd world leaders’ devices, as it were. Just explain to me how, or why, Angela Merkel would be carrying a phone, allegedly, which could be tapped!”
Schneier, who is 51 years old, with a thick beard and a wavy ponytail, tends to approach political questions with the practicality of an engineer. “Well,” he responded, “she has to talk to people.”
Since divulging the fact that he had access to the Snowden documents, Schneier has spent a lot of time fielding questions like that one. Seated in a Manhattan café recently, where he took a short break between meeting with a pharmaceutical company in New Jersey and speaking at Columbia—his post-Snowden life has grown into a whirlwind of public lectures, conferences and meetings with curious security professionals—he again sounded very much the engineer as he described his travel schedule over the past year in terms of its velocity: “My average speed is 32 miles an hour,” he said, “including now.”
During the last year—and, less forcefully, over the decade-plus since 9/11 reconfigured U.S. national security—Schneier has been an unrelenting rationalist in what can look like an increasingly irrational world. “Security is all about tradeoffs, but when the stakes are considered infinitely high, the whole equation gets thrown out of kilter,” he wrote shortly after 9/11. Last year, the NSA warned that publicizing its broad surveillance capabilities would push suspected terrorists off electronic communications, now that they knew they would be watched. Instead, reports surfaced of surveillance targets not only staying on their electronic channels but discussing the NSA leaks there. Meanwhile, India instructed its London diplomatic staff to do sensitive business by typewriter.
In January, Schneier was asked to brief six members of Congress on NSA activities, based on his time working with the leaked documents. “Members of Congress, especially members not on the intelligence committee, have a very difficult time getting answers from the intelligence community,” said Zoe Lofgren, the Congresswoman who organized the meeting. “I thought, you know, Bruce has seen all this stuff. If we can’t get them to answer us, maybe we can get the information from Bruce.”
Says Schneier: “It’s really surreal when Congress has to ask some guy to tell them what the NSA is doing.”
Schneier has been interested in how security systems work—or, more specifically, how they can be made to fail—for as long as he can remember. He grew up decoding secret messages that his father, a judge, wrote him, and as an adult he spent four years using an ID he’d convinced the state of Illinois to issue him that was valid without a photo or a signature. (“Anyone could’ve borrowed it,” he says. “Oh, it was absolutely the best.”) He still considers Minneapolis home, but he’s spending the spring at Harvard Law School’s Berkman Center for Internet and Society, where he leads a reading group on the relationship between institutional power and the Internet.
His first book, which became a classic reference for engineers building the computer security systems of the 1990s, described cryptography as what Schneier now calls a “mathematical utopia,” where perfect math made for indestructible security. Encryption algorithms would be able to disguise secrets from anyone who didn’t have the password, no matter how powerful their computers. They could be used to protect email and medical records, ensure private communication, and provide a cornerstone for things like unregulated electronic gambling and anonymous currency exchanges. Protection by mathematics, Schneier wrote, would be stronger than protection by law.
Soon after the book’s publication, Schneier was surprised by a colleague’s comment that the profession was filling with bad security systems designed by people who had read it. His initial disbelief faded when companies began hiring him to review their digital security, and he found “the weak points had nothing to do with the mathematics. They were in the hardware, the software, the networks and the people. Beautiful pieces of mathematics were made irrelevant.”
By the end of the decade Schneier had recanted. He went to work on his next book “partly,” he wrote, “to correct a mistake.” Almost everything he’s done since has been in opposition to that first idea. “Mathematics is logical,” he wrote, but “people are erratic, capricious, and barely comprehensible,” and even computers are buggy and unstable, made of much more than math.
For others, though, it has been a hard lesson to learn.
After 9/11, Schneier saw a familiar utopian thinking creeping into the politics of national security, and he grew into an outspoken critic. He coined the term “security theater” to describe the showy-but-ineffectual performance of security around air travel, a choreography designed to produce a feeling of safety while being poorly implemented, defending against the wrong danger, or both. His blog, dedicated to cryptography and tech security, was for a time a catalog of the many ways that ever-changing TSA regulations had been defeated by people with everyday resources but above-average creativity. And last year, he caused a small controversy when he cited research claiming that in the years following the September 11th attacks, enough people had chosen long-distance driving over air travel that the increase in auto-accident fatalities surpassed the number of deaths in the Twin Towers.
For Schneier, security is always a choice between different sets of risk, and there is no such thing as a perfect defense; you calculate the probabilities, and the potential costs of your decisions, as best you can. His arguments illuminate not only the places where politics and superstition have worked their way into supposedly rational systems, but also, in sometimes unexpected ways, how the shadow of 9/11 continues to define U.S. national security.
When Greenwald began to look for a technical collaborator to review the NSA documents, Schneier’s name kept coming up. Greenwald needed someone who could help untangle the technical language of the NSA’s internal documents, and help determine what the agency’s secret programs actually do. In addition to Schneier’s expertise and standing at the top of his field, he’s regarded as having the rare ability “to translate to a general audience the really complicated technical issues raised by a lot of these programs,” Greenwald says. “We are obviously writing for a general audience and not, you know, a hacker conference. And a lot of really smart experts have the ability to understand what the issues are, but not to translate it into a language that most people can understand.”
Since Greenwald left the Guardian for Pierre Omidyar’s First Look Media, Schneier says he hasn’t had access to the documents, but he’s making plans to work with Greenwald again soon. Until then, the Washington Post—whose coverage is led by Barton Gellman, one of three journalists with whom Snowden shared his collection—“is getting all the really good stories, including a bunch of things that I saw and wanted to write about.”
“Cryptography doesn’t exist in a vacuum,” Schneier says, a reality that has been reinforced during his time reviewing Snowden’s cache. He hasn’t found NSA programs that broke computer security by defeating the math—although some documents hint at one—but “by cheating,” “doing the non-cryptography things,” finding the security holes when data moves onto an unsafe system, or at unprotected points between one system and another.
“I can name three different ways the NSA has access to your Gmail,” Schneier told the audience at Columbia, “under three different legal authorities,” and through collaborations with three different companies. The problem, he says, is that the risks the NSA faces when conducting bulk surveillance—blanket coverage and retention of data and phone records—are too low. Until last summer, the agency didn’t have to weigh the value of expanding its data collection against the possibility of questions, from a concerned public or from re-energized government officials, about its role in a democracy.
Surveillance on individuals can have a genuine security benefit, Schneier says, but “that’s not what I’m trying to defend against. I’m trying to defend against bulk collection.” Part of building that defense is an engineering project: consumer software companies can expand security across their products, and programmers can repair vulnerabilities they had been unaware of until the NSA leak. But it’s also a political one. “Something that comes up again and again in the NSA documents is that they are amazingly risk-averse. They really take very safe paths,” Schneier said. The chance of being noticed by surveillance targets, or anyone else, weighs heavily on operational decisions. (Later, he described them as “very cautious spies.”) This has probably been true throughout the agency’s history, but over the last ten or so years, collecting enormous amounts of data about American citizens began to look, to the NSA, like a low-risk endeavor. As the public becomes aware of what this actually entails, that’s starting to change. In this case, Schneier says, better national security will depend on introducing risk, not eliminating it.