April 15, 2015
by Bruce Schneier
CTO, Resilient Systems, Inc.
A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.
For back issues, or to subscribe, visit <https://www.schneier.com/crypto-gram.html>.
You can read this issue on the web at <https://www.schneier.com/crypto-gram/archives/2015/…>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively and intelligent comment section. An RSS feed is available.
In this issue:
- More “Data and Goliath” News
- The Eighth Movie-Plot Threat Contest
- Metal Detectors at Sports Stadiums
- Cisco Shipping Equipment to Fake Addresses to Foil NSA Interception
- Schneier News
- New Zealand’s XKEYSCORE Use
- Australia Outlaws Warrant Canaries
More “Data and Goliath” News

Last month, the book made it to #6 on the “New York Times” best-seller list in hardcover nonfiction, and #13 in combined print and e-book nonfiction. This was the March 22 list, which covers sales from the first week of March. On the March 29 list—covering sales from the second week of March—I was #11 on the hardcover nonfiction list, and absent from the combined print and e-book nonfiction list. On the April 5 list, I wasn’t there at all.
Marc Rotenberg of EPIC tells me that Vance Packard’s “The Naked Society” made it to #7 on the list during the week of July 12, 1964, and—by that measure—“Data and Goliath” is the most popular privacy book of all time. I’m not sure I can claim that honor yet, but it’s a nice thought. And two weeks on the “New York Times” best-seller list is super fantastic.
For those curious to know what sorts of raw numbers translate into those rankings, this is what I know. Nielsen Bookscan tracks retail sales across the US, and captures about 80% of the book market. It reports that my book sold 4,706 copies during the first week of March, and 2,339 copies in the second week. Scaling up by that 80% figure, that means I sold about 6,000 copies the first week and about 3,000 the second.
My publisher tells me that Amazon sold 650 hardcovers and 600 e-books during the first week, and 400 hardcovers and 500 e-books during the second week. The hardcover sales ranking was 865, 949, 611, 686, 657, 602, 595 during the first week, and 398, 511, 693, 867, 341, 357, 343 during the second. The book’s rankings during those first few days don’t match sales, because Amazon records a sale for the rankings when a person orders a book, but only counts the sale when it actually ships it. So all of my preorders sold on that first day, even though they had already been counted in the rankings during the days and weeks before the publication date.
There are lots of book reviews: from the Economist, Forbes, the Washington Post, Reuters, and many others. Everyone loves the book except the Wall Street Journal.
All of this is on the book’s website, along with a bunch of book-related articles and videos.
Note to readers. The book is 80,000 words long, which is a normal length for a book like this. But the physical book is much longer, because it contains *a lot* of references. They’re not numbered, but if they were, there would be over 1,000 numbers. I counted all the links, and there are 1,622 individual citations. That’s a lot of text. This means that if you’re reading the book on paper, the narrative ends on page 238, even though the book continues to page 364. If you’re reading it on the Kindle, you’ll finish the book when the Kindle says you’re only 44% of the way through. The difference between the page and percentage figures is because the references are set in smaller type than the body. I warn you of this now, so you know what to expect. It always annoys me that the Kindle calculates percent done from the end of the file, not the end of the book.
And if you’ve read the book, please post a review on the book’s Amazon page or on Goodreads. Reviews are important on those sites, and I need more of them.
The Eighth Movie-Plot Threat Contest

It’s April 1, and time for another Movie-Plot Threat Contest. This year, the theme is Crypto Wars II. Strong encryption is evil, because it prevents the police from solving crimes. (No, really—that’s the argument.) FBI Director James Comey is going to be hard to beat with his heartfelt litany of movie-plot threats:
“We’re drifting toward a place where a whole lot of people are going to be looking at us with tears in their eyes,” Comey argued, “and say ‘What do you mean you can’t? My daughter is missing. You have her phone. What do you mean you can’t tell me who she was texting with before she disappeared?'”
“I’ve heard tech executives say privacy should be the paramount virtue,” Comey said. “When I hear that, I close my eyes and say, ‘Try to imagine what that world looks like where pedophiles can’t be seen, kidnappers can’t be seen, drug dealers can’t be seen.'”
Come on, Comey. You might be able to scare noobs like Rep. John Carter with that talk, but you’re going to have to do better if you want to win this contest. We heard this same sort of stuff out of then-FBI director Louis Freeh in 1996 and 1997.
This is the contest: I want a movie-plot threat that shows the evils of encryption. (For those who don’t know, a movie-plot threat is a scary-threat story that would make a great movie, but is much too specific to build security policies around. Contest history here.) We’ve long heard about the evils of the Four Horsemen of the Internet Apocalypse—terrorists, drug dealers, kidnappers, and child pornographers. (Or maybe they’re terrorists, pedophiles, drug dealers, and money launderers; I can never remember.) Try to be more original than that. And nothing too science fictional; today’s technology or presumed technology only.
Entries are limited to 500 words—I check—and should be posted in the comments. At the end of the month, I’ll choose five or so semifinalists, and we can all vote and pick the winner.
The prize will be signed copies of the 20th Anniversary Edition of the 2nd Edition of “Applied Cryptography,” and the 15th Anniversary Edition of “Secrets and Lies,” both being published by Wiley this year in an attempt to ride the “Data and Goliath” bandwagon.
Post your entries here:
Rep. John Carter:
Previous movie-plot threat contests:
New books offered as prizes:
Metal Detectors at Sports Stadiums

Fans attending Major League Baseball games are being greeted in a new way this year: with metal detectors at the ballparks. Touted as a counterterrorism measure, they’re nothing of the sort. They’re pure security theater: They look good without doing anything to make us safer. We’re stuck with them because of a combination of buck passing, CYA thinking and fear.
As a security measure, the new devices are laughable. The ballpark metal detectors are much more lax than the ones at an airport checkpoint. They aren’t very sensitive—people with phones and keys in their pockets are sailing through—and there are no X-ray machines. Bags get the same cursory search they’ve gotten for years. And fans wanting to avoid the detectors can opt for a light pat-down search instead.
There’s no evidence that this new measure makes anyone safer. A halfway competent ticketholder would have no trouble sneaking a gun into the stadium. For that matter, a bomb exploded at a crowded checkpoint would be no less deadly than one exploded in the stands. These measures will, at best, be effective at stopping the random baseball fan who’s carrying a gun or knife into the stadium. That may be a good idea, but unless there’s been a recent spate of fan shootings and stabbings at baseball games—and there hasn’t—this is a whole lot of time and money being spent to combat an imaginary threat.
But imaginary threats are the only ones baseball executives have to stop this season; there’s been no specific terrorist threat or actual intelligence to be concerned about. MLB executives forced this change on ballparks based on unspecified discussions with the Department of Homeland Security after the Boston Marathon bombing in 2013. Because, you know, that was also a sporting event.
This system of vague consultations and equally vague threats ensures that no single organization can be seen as responsible for the change. MLB can claim that the league and teams “work closely” with DHS. DHS can claim that it was MLB’s initiative. And both can safely relax because if something happens, at least they did *something*.
It’s an attitude I’ve seen before: “Something must be done. This is something. Therefore, we must do it.” Never mind if the something makes any sense or not.
In reality, this is CYA security, and it’s pervasive in post-9/11 America. It no longer matters if a security measure makes sense, if it’s cost-effective or if it mitigates any actual threats. All that matters is that you took the threat seriously, so if something happens you won’t be blamed for inaction. It’s security, all right—security for the careers of those in charge.
I’m not saying that these officials care only about their jobs and not at all about preventing terrorism, only that their priorities are skewed. They imagine vague threats, and come up with correspondingly vague security measures intended to address them. They experience none of the costs. They’re not the ones who have to deal with the long lines and confusion at the gates. They’re not the ones who have to arrive early to avoid the messes the new policies have caused around the league. And if fans spend more money at the concession stands because they’ve arrived an hour early and have had the food and drinks they tried to bring along confiscated, so much the better, from the team owners’ point of view.
I can hear the objections to this as I write. You don’t *know* these measures won’t be effective! What if something happens? Don’t we have to do everything possible to protect ourselves against terrorism?
That’s worst-case thinking, and it’s dangerous. It leads to bad decisions, bad design and bad security. A better approach is to realistically assess the threats, judge security measures on their effectiveness and take their costs into account. And the result of that calm, rational look will be the realization that there will always be places where we pack ourselves densely together, and that we should spend less time trying to secure those places and more time finding terrorist plots before they can be carried out.
So far, fans have been exasperated but mostly accepting of these new security measures. And this is precisely the problem—most of us don’t care all that much. Our options are to put up with these measures, or stay home. Going to a baseball game is not a political act, and metal detectors aren’t worth a boycott. But there’s an undercurrent of fear as well. If it’s in the name of security, we’ll accept it. As long as our leaders are scared of the terrorists, they’re going to continue the security theater. And we’re similarly going to accept whatever measures are forced upon us in the name of security. We’re going to accept the National Security Agency’s surveillance of every American, airport security procedures that make no sense and metal detectors at baseball and football stadiums. We’re going to continue to waste money overreacting to irrational fears.
We no longer need the terrorists. We’re now so good at terrorizing ourselves.
This essay previously appeared in the “Washington Post.”
Dreaming up terrorist threats at sporting events:
Overreacting to irrational fears:
News

The Intercept recently posted a story on the CIA’s attempts to hack the iOS operating system. Most interesting was the speculation that it hacked Xcode, which would mean that any apps developed using that tool would be compromised.
It’s an application of Ken Thompson’s classic 1984 paper, “Reflections on Trusting Trust,” and a very nasty attack.
Dan Wallach speculates on how this might work.
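Thompson’s attack is easy to illustrate in miniature. Here is a toy sketch—the “compiler,” the login check, and the backdoor password are all invented for illustration; Thompson’s real version lived in the C compiler binary and targeted login:

```python
# Toy sketch of Thompson's "trusting trust" attack. The "compiler"
# here just passes source through as the "binary," but quietly
# rewrites any password check it compiles.

LOGIN_SRC = '''
def check(password, stored):
    return password == stored
'''

def evil_compile(source):
    binary = source
    # Stage 1: backdoor any login-like program being compiled.
    if "password == stored" in binary:
        binary = binary.replace(
            "password == stored",
            'password == stored or password == "joshua"')
    # Stage 2 (the crucial step, elided in this toy): when compiling a
    # clean copy of the compiler itself, re-insert both stages, so the
    # backdoor persists even though no source code shows it.
    return binary

# The login source is clean, but the compiled artifact is not:
ns = {}
exec(evil_compile(LOGIN_SRC), ns)
assert ns["check"]("secret", "secret")   # normal login still works
assert ns["check"]("joshua", "secret")   # backdoor password accepted
assert "joshua" not in LOGIN_SRC         # source shows nothing wrong
```

The point of stage 2 is what makes the attack so nasty: auditing the source of the login program, or even of the compiler, reveals nothing.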
The Citizen Lab at the University of Toronto published a new report on the use of spyware from the Italian cyberweapons arms manufacturer Hacking Team by the Ethiopian intelligence service. We previously learned that the government used this software to target US-based Ethiopian journalists.
New research: “How Polymorphic Warnings Reduce Habituation in the Brain—Insights from an fMRI Study.”
New research: Max Abrahms and Philip B.K. Potter, “Explaining Terrorism: Leadership Deficits and Militant Group Tactics,” “International Organization.”
I have previously blogged Max Abrahms’s work.
David Omand—GCHQ director from 1996-1997, and the UK’s security and intelligence coordinator from 2000-2005—has just published a new paper: “Understanding Digital Intelligence and the Norms That Might Govern It.” I don’t agree with a lot of it, but it’s worth reading.
My favorite Omand quote is this, defending the close partnership between the NSA and GCHQ in 2013: “We have the brains. They have the money. It’s a collaboration that’s worked very well.”
We’ve learned a lot about the NSA’s abilities to hack a computer’s BIOS so that the hack survives reinstalling the OS. Now we have a research presentation about it.
The NSA has a term for vulnerabilities it thinks are exclusive to it: NOBUS, for “nobody but us.” Turns out that NOBUS is a flawed concept. As I keep saying: “Today’s top-secret programs become tomorrow’s PhD theses and the next day’s hacker tools.” By continuing to exploit these vulnerabilities rather than fixing them, the NSA is keeping us all vulnerable.
Ugly Mail is a Gmail extension to expose e-mail tracking. It’s a nice idea, but I would like it to work for other browsers and other e-mail programs.
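The underlying technique is simple: tracking “pixels” are tiny remote images whose URLs are unique per recipient, so fetching the image tells the sender you opened the mail. A minimal detection sketch—my own heuristic, not Ugly Mail’s actual code:

```python
# Flag <img> tags that look like tracking pixels: 1x1 images
# loaded from a remote server. (Heuristic invented for this sketch;
# real detectors also match known tracker domains.)
from html.parser import HTMLParser

class PixelFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.trackers = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        tiny = a.get("width") == "1" and a.get("height") == "1"
        remote = a.get("src", "").startswith(("http://", "https://"))
        if tiny and remote:
            self.trackers.append(a["src"])

def find_trackers(email_html):
    finder = PixelFinder()
    finder.feed(email_html)
    return finder.trackers

sample = '<p>Hi!</p><img src="https://track.example/abc123.gif" width="1" height="1">'
assert find_trackers(sample) == ["https://track.example/abc123.gif"]
```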
The Brennan Center has a long report on what’s wrong with the FISA Court and how to fix it.
There’s a new story about the hacking capabilities of Canada’s Communications Security Establishment (CSE), based on the Snowden documents.
Researchers have managed to get two computers to communicate using heat and thermal sensors. It’s not really viable communication—the bit rate is eight bits per hour over a distance of fifteen inches—but it’s neat.
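To put that bit rate in perspective (the payload sizes are my examples, not from the research):

```python
# At eight bits per hour, even a tiny secret takes a long time to move.
BITS_PER_HOUR = 8
aes_key_bits = 128                               # example payload
hours_for_key = aes_key_bits / BITS_PER_HOUR     # 16 hours for one AES-128 key
days_for_1kb = (1024 * 8) / BITS_PER_HOUR / 24   # ~43 days for 1 KB
assert hours_for_key == 16.0
```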
Researchers brute-force an iPhone password using a black box that attaches to the iPhone via USB. Because every set of wrong guesses requires a reboot, the process takes about five days. Still, a very clever attack.
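The arithmetic behind that five-day figure, assuming a four-digit passcode (the per-guess time is my derivation, not a reported number):

```python
# Worst case: try every PIN, rebooting after each wrong guess.
pin_space = 10 ** 4                     # four-digit PIN: 10,000 possibilities
total_seconds = 5 * 24 * 3600           # "about five days"
per_guess = total_seconds / pin_space   # ~43 seconds per guess-plus-reboot
assert 40 < per_guess < 45
```

The reboot is the bottleneck: the guessing itself is trivial, but each failed attempt costs the better part of a minute.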
There’s a Chinese CA that’s issuing fraudulent Google certificates. Yet another example of why the CA model is so broken.
Pew Research has a new survey on Americans’ privacy habits in a post-Snowden world.
It’s worth reading these results in detail. Overall, these numbers are consistent with a worldwide survey from December. The press is spinning this as “Most Americans’ behavior unchanged after Snowden revelations, study finds,” but I see something very different. I see a sizable percentage of Americans not only concerned about government surveillance, but actively doing something about it. “Third of Americans shield data from government.” Edward Snowden’s goal was to start a national dialog about government surveillance, and these surveys show that he has succeeded in doing exactly that.
Real-life remailers in the Warsaw Pact nations:
The security audit of the TrueCrypt code has been completed, and the results are good. Some issues were found, but nothing major.
Previous audit results:
This Bluetooth door lock is neat, but I’ll bet it can be hacked.
Here’s an article on making secret phone calls with cell phones.
Note that it actually makes sense to use a one-time pad in this instance. The message is a ten-digit number, and a one-time pad is easier, faster, and cleaner than using any computer encryption program.
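For a ten-digit message, the pad arithmetic works digit by digit, mod 10. A sketch—the digit values here are made up:

```python
# One-time pad over decimal digits: add the pad digit mod 10 to
# encrypt, subtract it mod 10 to decrypt. With a truly random,
# never-reused pad, the ciphertext carries no information about
# the message.
def otp(digits, pad, decrypt=False):
    sign = -1 if decrypt else 1
    return "".join(str((int(d) + sign * int(p)) % 10)
                   for d, p in zip(digits, pad))

message = "8675309124"   # the ten-digit number to send (example)
pad     = "4928163750"   # pre-shared random pad (example)
cipher  = otp(message, pad)
assert cipher != message
assert otp(cipher, pad, decrypt=True) == message
```

This is easy enough to do with pencil and paper, which is exactly why it beats any computer encryption program for a message this short.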
The Southern Poverty Law Center warns of the rise of lone-wolf terrorism.
Jim Harper of the Cato Institute wrote about this in 2009 after the Fort Hood shooting.
Researchers found voting-system flaws in New South Wales, and were attacked by voting officials and the company that made the machines.
John Mueller suggests an alternative to the FBI’s practice of encouraging terrorists and then arresting them for something they would never have planned on their own.
Citizen Lab has issued a report on China’s “Great Cannon” attack tool, used in the recent DDoS attack against GitHub.
Paul Krugman argues that we’ll give up our privacy because we want to emulate the rich, who are surrounded by servants who know everything about them.
Daniel C. Dennett and Deb Roy look at our loss of privacy in evolutionary terms, and see all sorts of adaptations coming.
An amazing interview with Edward Snowden by John Oliver:
Dan Geer proposes a way to figure out how many vulnerabilities there are in software:
The Congressional Research Service has released a report on the no-fly list and current litigation that says it violates due process.
Cisco Shipping Equipment to Fake Addresses to Foil NSA Interception

Last May, we learned that the NSA intercepts equipment being shipped around the world and installs eavesdropping implants. There were photos of NSA employees opening up a Cisco box. Cisco’s CEO John Chambers personally complained to President Obama about this practice, which is not exactly a selling point for Cisco equipment abroad. “Der Spiegel” published the more complete document, along with a broader story, in January of this year:
In one recent case, after several months a beacon implanted through supply-chain interdiction called back to the NSA covert infrastructure. The call back provided us access to further exploit the device and survey the network. Upon initiating the survey, SIGINT analysis from TAO/Requirements & Targeting determined that the implanted device was providing even greater access than we had hoped: We knew the devices were bound for the Syrian Telecommunications Establishment (STE) to be used as part of their internet backbone, but what we did not know was that STE’s GSM (cellular) network was also using this backbone. Since the STE GSM network had never before been exploited, this new access represented a real coup.
Now Cisco is taking matters into its own hands, offering to ship equipment to fake addresses in an effort to avoid NSA interception.
I don’t think we have even begun to understand the long-term damage the NSA has done to the US tech industry.
Schneier News

I’m speaking at the Global Conference on Cyberspace in the Hague on April 17:
I’m speaking several times at the RSA Conference in San Francisco on April 21-23:
I’m speaking at Penguicon in Detroit on April 24:
I’m speaking at GISEC in Dubai on April 28:
All sorts of interviews—text, audio, video—are here:
Resilient Systems has launched its new “Action Module,” which allows our incident response platform to automatically take actions in the face of attack:
New Zealand’s XKEYSCORE Use

The “Intercept” and the “New Zealand Herald” have reported that New Zealand spied on communications about the World Trade Organization director-general candidates. I’m not sure why this is news; it seems like a perfectly reasonable national intelligence target. More interesting to me is that the “Intercept” published the XKEYSCORE rules. It’s interesting to see how primitive the keyword targeting is, and how broadly it collects e-mails.
The second *really* important point is that Edward Snowden’s name is mentioned nowhere in the stories. Given how scrupulous the “Intercept” is about identifying him as the source of his NSA documents, I have to conclude that this is from another leaker. For a while, I have believed that there are at least three leakers inside the Five Eyes intelligence community, plus another CIA leaker. What I have called Leaker #2 has previously revealed XKEYSCORE rules. Whether this new disclosure is from Leaker #2 or a new Leaker #5, I have no idea. I hope someone is keeping a list.
Australia Outlaws Warrant Canaries

In the US, certain types of warrants can come with gag orders preventing the recipient from disclosing the existence of the warrant to anyone else. A warrant canary is basically a legal hack of that prohibition. Instead of saying “I just received a warrant with a gag order,” the potential recipient keeps repeating “I have not received any warrants.” If the recipient stops saying that, the rest of us are supposed to assume that he has been served one.
Lots of organizations maintain them. Personally, I have never believed this trick would work. It relies on the fact that a prohibition against speaking doesn’t prevent someone from not speaking. But courts generally aren’t impressed by this sort of thing, and I can easily imagine a secret warrant that includes a prohibition against triggering the warrant canary. And for all I know, there are right now secret legal proceedings on this very issue.
Australia has sidestepped all of this by outlawing warrant canaries entirely:
Section 182A of the new law says that a person commits an offense if he or she discloses or uses information about “the existence or non-existence of such a [journalist information] warrant.” The penalty upon conviction is two years imprisonment.
Expect that sort of wording in future US surveillance bills, too.
Australia’s new rules:
Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <https://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.
Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Bruce Schneier is an internationally renowned security technologist, called a “security guru” by The Economist. He is the author of 12 books—including “Liars and Outliers: Enabling the Trust Society Needs to Thrive”—as well as hundreds of articles, essays, and academic papers. His influential newsletter “Crypto-Gram” and his blog “Schneier on Security” are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press. Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation’s Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Chief Technology Officer at Resilient Systems, Inc. See <https://www.schneier.com>.
Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of Resilient Systems, Inc.
Copyright (c) 2015 by Bruce Schneier.