Crypto-Gram

December 15, 2015

by Bruce Schneier
CTO, Resilient Systems, Inc.
schneier@schneier.com
https://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <https://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <https://www.schneier.com/crypto-gram/archives/2015/…>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively and intelligent comment section. An RSS feed is available.


In this issue:
      Policy Repercussions of the Paris Terrorist Attacks
      NSA Collected Americans’ E-mails Even After It Stopped Collecting Americans’ E-mails
      News
      Reputation in the Information Age
      Schneier News
      On CISA
      Voter Surveillance
      Resilient Systems News
      Worldwide Cryptographic Products Survey: Edits and Additions Wanted
      Security vs. Business Flexibility

Policy Repercussions of the Paris Terrorist Attacks

In 2013, in the early days of the Snowden leaks, Harvard Law School professor and former Assistant Attorney General Jack Goldsmith reflected on the increase in NSA surveillance post 9/11. He wrote:

Two important lessons of the last dozen years are (1) the government will increase its powers to meet the national security threat fully (because the People demand it), and (2) the enhanced powers will be accompanied by novel systems of review and transparency that seem to those in the Executive branch to be intrusive and antagonistic to the traditional national security mission, but that in the end are key legitimating factors for the expanded authorities.

Goldsmith is right, and I think about this quote as I read news articles about surveillance policies with headlines like “Political winds shifting on surveillance after Paris attacks?”

The politics of surveillance are the politics of fear. As long as the people are afraid of terrorism—regardless of how realistic their fears are—they will demand that the government keep them safe. And if the government can convince them that it needs this or that power in order to keep the people safe, the people will willingly grant them those powers. That’s Goldsmith’s first point.

Today, in the wake of the horrific and devastating Paris terror attacks, we’re at a pivotal moment. People are scared, and already Western governments are lining up to authorize more invasive surveillance powers. The US wants to back-door encryption products in some vain hope that the bad guys are 1) naive enough to use those products for their own communications instead of more secure ones, and 2) too stupid to use the back doors against the rest of us. The UK is trying to rush the passage of legislation that legalizes a whole bunch of surveillance activities that GCHQ has already been doing to its own citizens. France just gave its police a bunch of new powers. It doesn’t matter that mass surveillance isn’t an effective anti-terrorist tool: a scared populace wants to be reassured.

And politicians want to reassure. It’s smart politics to exaggerate the threat. It’s smart politics to *do something*, even if that something isn’t effective at mitigating the threat. The surveillance apparatus has the ear of the politicians, and the primary tool in its box is more surveillance. There’s minimal political will to push back on those ideas, especially when people are scared.

Writing about our country’s reaction to the Paris attacks, Tom Engelhardt wrote:

…the officials of that security state have bet the farm on the preeminence of the terrorist ‘threat,’ which has, not so surprisingly, left them eerily reliant on the Islamic State and other such organizations for the perpetuation of their way of life, their career opportunities, their growing powers, and their relative freedom to infringe on basic rights, as well as for that comfortably all-embracing blanket of secrecy that envelops their activities.

Goldsmith’s second point is more subtle: when these power increases are made in public, they’re legitimized through bureaucracy. Together, the scared populace and their scared elected officials serve to make the expanded national security and law enforcement powers normal.

Terrorism is singularly designed to push our fear buttons in ways completely out of proportion to the actual threat. And as long as people are scared of terrorism, they’ll give their governments all sorts of new powers of surveillance, arrest, detention, and so on, regardless of whether those powers actually combat the threat. This means that those who want those powers need a steady stream of terrorist attacks to enact their agenda. It’s not that these people are actively rooting for the terrorists, but they know a good opportunity when they see it.

We know that the PATRIOT Act was largely written before the 9/11 terrorist attacks, and that the political climate was right for its introduction and passage. More recently:

Although “the legislative environment is very hostile today,” the intelligence community’s top lawyer, Robert S. Litt, said to colleagues in an August e-mail, which was obtained by The Post, “it could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement.”

The Paris attacks could very well be that event.

I am very worried that the Obama administration has already secretly told the NSA to increase its surveillance inside the US. And I am worried that there will be new legislation legitimizing that surveillance and granting other invasive powers to law enforcement. As Goldsmith says, these powers will be accompanied by novel systems of review and transparency. But I have no faith that those systems will be effective in limiting abuse any more than they have been over the last couple of decades.

Blog entry URL:
https://www.schneier.com/blog/archives/2015/11/…

Goldsmith’s remarks:
https://www.lawfareblog.com/…

“Political winds shifting on surveillance after Paris attacks?”:
http://www.cnn.com/2015/11/20/politics/…

Me on the politics of fear:
https://www.schneier.com/essays/archives/2013/05/…

People afraid of terrorism:
https://www.washingtonpost.com/news/wonk/wp/2013/04/…

Me on peoples’ unrealistic terrorism fears:
http://www.cnn.com/2012/07/31/opinion/…

Government officials talking about how terrorism helps push legislation:
https://theintercept.com/2015/09/16/…

FBI Director wanting encryption back doors:
http://www.theguardian.com/technology/2015/jul/08/…

Why back doors are a bad idea:
http://www.wired.com/2015/11/…
https://www.schneier.com/paper-keys-under-doormats.html

The UK’s rush legislation:
https://uk.news.yahoo.com/…
http://www.telegraph.co.uk/news/uknews/…

France’s rush legislation:
https://www.lawfareblog.com/…

Mass surveillance isn’t an effective anti-terrorist tool:
http://www.nytimes.com/2015/11/18/opinion/…

It’s smart politics to exaggerate the threat:
https://www.schneier.com/essays/archives/2013/05/…

Tom Engelhardt’s comment on Paris:
http://www.thenation.com/article/…

How terrorism pushes our fear buttons:
https://www.schneier.com/essays/archives/2007/03/…
https://www.schneier.com/essays/archives/2007/05/…

PATRIOT Act:
http://www.globalissues.org/article/342/…

Bob Litt’s comments:
https://www.washingtonpost.com/world/…

Trevor Timm is all over this issue.
http://www.theguardian.com/commentisfree/2015/nov/…
http://www.theguardian.com/commentisfree/2015/nov/…
http://www.theguardian.com/commentisfree/2015/nov/…
http://www.theguardian.com/commentisfree/2015/dec/…

Dan Gillmor wrote something smart, too.
https://medium.com/backchannel/…


NSA Collected Americans’ E-mails Even After It Stopped Collecting Americans’ E-mails

In 2001, the Bush administration authorized—almost certainly illegally—the NSA to conduct bulk electronic surveillance on Americans: phone calls, e-mails, financial information, and so on. We learned a lot about the bulk phone metadata collection program from the documents provided by Edward Snowden, and it was the focus of debate surrounding the USA FREEDOM Act. E-mail metadata surveillance, however, wasn’t part of that law. We learned the name of the program—STELLAR WIND—when it was leaked in 2004. But supposedly the NSA stopped collecting that data in 2011, because it wasn’t cost-effective.

“The internet metadata collection program authorized by the FISA court was discontinued in 2011 for operational and resource reasons and has not been restarted,” Shawn Turner, the Obama administration’s director of communications for National Intelligence, said in a statement to the Guardian.

When Turner said that in 2013, we knew from the Snowden documents that the NSA was still collecting some Americans’ Internet metadata from communications links between the US and abroad. Now we have more proof. It turns out that the NSA never stopped collecting e-mail metadata on Americans. They just cancelled one particular program and changed the legal authority under which they collected it.

The report explained that there were two other legal ways to get such data. One was the collection of bulk data that had been gathered in other countries, where the N.S.A.’s activities are largely not subject to regulation by the Foreign Intelligence Surveillance Act and oversight by the intelligence court.

[…]

The N.S.A. had long barred analysts from using Americans’ data that had been swept up abroad, but in November 2010 it changed that rule, documents leaked by Edward J. Snowden have shown. The inspector general report cited that change to the N.S.A.’s internal procedures.

The other replacement source for the data was collection under the FISA Amendments Act of 2008, which permits warrantless surveillance on domestic soil that targets specific noncitizens abroad, including their new or stored emails to or from Americans.

In “Data and Goliath,” I wrote:

Some members of Congress are trying to impose limits on the NSA, and some of their proposals have real teeth and might make a difference. Even so, I don’t have any hope of meaningful congressional reform right now, because all of the proposals focus on specific programs and authorities: the telephone metadata collection program under Section 215, bulk records collection under Section 702, and so on. It’s a piecemeal approach that can’t work. We are now beyond the stage where simple legal interventions can make a difference. There’s just too much secrecy, and too much shifting of programs amongst different legal justifications.

The NSA continually plays this shell game with Congressional overseers. Whenever an intelligence-community official testifies that something is not being done under *this* particular program, or *this* particular authority, you can be sure that it’s being done under some other program or some other authority. In particular, the NSA regularly uses rules that allow them to conduct bulk surveillance outside the US—rules that largely evade both Congressional and Judicial oversight—to conduct bulk surveillance on Americans. Effective oversight of the NSA is impossible in the face of this level of misdirection and deception.

Stellar Wind leaked in 2004:
http://www.wired.com/2013/06/…

Stellar Wind cancelled:
http://www.theguardian.com/world/2013/jun/27/…

Bulk e-mail collection not stopped:
http://mobile.nytimes.com/2015/11/20/us/politics/…


News

There’s pretty strong evidence that the team of researchers from Carnegie Mellon University who cancelled their scheduled 2014 Black Hat talk deanonymized Tor users for the FBI.
https://boingboing.net/2015/11/11/…
http://motherboard.vice.com/read/…
http://www.wired.com/2015/11/…
http://motherboard.vice.com/read/…
http://motherboard.vice.com/read/…
https://blog.torproject.org/…
https://web.archive.org/web/20140705114447/http://…
Here’s the reaction from the Tor Project.
https://blog.torproject.org/…
Nicholas Weaver guessed this back in January.
http://arstechnica.com/tech-policy/2015/01/…

Paris attacks blamed on strong cryptography and Edward Snowden.
https://www.schneier.com/blog/archives/2015/11/…

“Refuse to be terrorized.” Paul Krugman has written a really good update of my 2006 essay.
http://www.nytimes.com/2015/11/16/opinion/…
My 2006 essay:
https://www.schneier.com/essays/archives/2006/08/…
This crass and irreverent essay was written after January’s Paris terrorist attack, but is very relevant right now.
http://www.cracked.com/…

Ads surreptitiously using sound to communicate across devices. It’s both creepy and disturbing.
http://arstechnica.com/tech-policy/2015/11/…
Related: a Chrome extension that broadcasts URLs over audio.
http://googleresearch.blogspot.co.uk/2015/05/…
https://www.yahoo.com/tech/…

Paris terrorists used double ROT-13 encryption. That is, no encryption at all. The Intercept has the story:
https://theintercept.com/2015/11/18/…
https://www.techdirt.com/articles/20151118/…
http://arstechnica.co.uk/tech-policy/2015/11/…
And what is it about this “mastermind” label? Why do we have to make them smarter than they are?
http://www.politico.com/magazine/story/2015/11/…
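The joke is easy to check, since ROT-13 is its own inverse: applying it twice returns the plaintext, so “double ROT-13” is the identity function. A quick Python sketch (the sample message is invented):

```python
# ROT-13 rotates each letter 13 places in the alphabet; doing it
# twice rotates 26 places, i.e., back to where you started.
import codecs

msg = "attack at dawn"
once = codecs.encode(msg, "rot_13")    # "nggnpx ng qnja"
twice = codecs.encode(once, "rot_13")  # the original plaintext again

print(once)
print(twice == msg)  # True: double ROT-13 is no encryption at all
```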

RFID-shielded, ultra-strong duffel bags for carrying cash through dangerous territory:
https://boingboing.net/2015/11/05/…

Roger Grimes has written an interesting paper: “Implementing a Data-Driven Computer Security Defense.” His thesis is that most organizations don’t match their defenses to the actual risks. His paper explains how it got to be this way, and how to fix it.
https://gallery.technet.microsoft.com/…

Algebraic Eraser is a public-key key-agreement protocol that’s patented and being pushed by a company for the Internet of Things, primarily because it is efficient on small low-power devices. There’s a new cryptanalytic attack.
http://www.securerf.com/tag/algebraic-eraser/
http://arstechnica.com/security/2015/11/…
http://arxiv.org/pdf/1511.03870v1.pdf
This is yet another demonstration of why you should not choose proprietary encryption over public algorithms and protocols. The good stuff is not patented.

A good “New Yorker” article traces the history of privacy from the mid-1800s to today:
http://www.newyorker.com/magazine/2013/06/24/the-prism

Someone opened a LifeLock account in his ex-wife’s name, and used the service to track her bank accounts, credit cards, and other financial activities. The article is mostly about how appalling LifeLock was about this, but I’m more interested in the surveillance possibilities. Certainly the FBI can use LifeLock to surveil people with a warrant. The FBI/NSA can also collect the financial data of every LifeLock customer with a National Security Letter. But it’s interesting how easy it was for an individual to open an account for another individual.
http://consumerist.com/2015/11/23/…

Phil Rogaway has written an excellent paper titled “The Moral Character of Cryptography Work.” In it, he exhorts cryptographers to consider the morality of their research, and to build systems that enhance privacy rather than diminish it. It is very much worth reading.
http://web.cs.ucdavis.edu/~rogaway/papers/moral.html

BlackBerry has chosen to shut down operations in Pakistan rather than provide the government with backdoor access to encrypted communications. Pakistan is a relatively small market, but still.
http://www.cnet.com/au/news/…
http://www.cbc.ca/news/business/…

A clever forced authorization attack against chip-and-PIN credit card terminals:
https://www.benthamsgaze.org/2015/12/01/…

Interesting essay about how Israel regulates encryption. Basically, it looks like secret agreements made in smoke-filled rooms, very discreet with no oversight or accountability. The fact that pretty much everyone in IT security has served in an offensive cybersecurity capacity for the Israeli army helps. As does the fact that the country is so small, making informal deal-making manageable. It doesn’t scale.
https://www.lawfareblog.com/…
Why is this important? “…companies in Israel, a country comprising less than 0.11% of the world’s population, are estimated to have sold 10% ($6 billion out of $60 billion) of global encryption and cyber technologies for 2014.”

Newly declassified: “A History of U.S. Communications Security (Volumes I and II),” the David G. Boak Lectures, National Security Agency (NSA), 1973. (The document was initially declassified in 2008. We just got a whole bunch of additional material declassified. Both versions are in the document, so you can compare and see what was kept secret seven years ago.)
http://www.governmentattic.org/18docs/…

I’ve written about the difference between risk perception and risk reality. I thought about that when reading this list of Americans’ top technology fears:
http://www.brookings.edu/blogs/techtank/posts/2015/…

Interesting research: “Identifying patterns in informal sources of security information,” by Emilee Rader and Rick Wash, “Journal of Cybersecurity,” 1 Dec 2015.
http://cybersecurity.oxfordjournals.org/content/…

A Florida woman drove away after an accident, but her car automatically reported it anyway. She was arrested.
http://www.zdnet.com/article/…

The “New York Times Magazine” has a good story about swatting, centered on a Canadian teenager who did it over a hundred times.
http://www.nytimes.com/2015/11/29/magazine/…

“Security theater” sighting, in a Schlock Mercenary comic.
http://www.schlockmercenary.com/2015-12-07

Has anyone been following the attack against the DNS root servers two weeks ago? I can’t precisely explain why, but this feels like someone testing an attack capability. For defense: it’s long past time to implement source address validation in the DNS system.
https://www.schneier.com/blog/archives/2015/12/…


Reputation in the Information Age

Reputation is a social mechanism by which we come to trust one another, in all aspects of our society. I see it as a security mechanism. The promise and threat of a change in reputation entices us all to be trustworthy, which in turn enables others to trust us. In a very real sense, reputation enables friendships, commerce, and everything else we do in society. It’s old, older than our species, and we are finely tuned to both perceive and remember reputation information, and broadcast it to others.

The nature of how we manage reputation has changed in the past couple of decades, and Gloria Origgi alludes to the change in her remarks. Reputation now involves technology. Feedback and review systems, whether they be eBay rankings, Amazon reviews, or Uber ratings, are reputational systems. So is Google PageRank. Our reputations are, at least in part, based on what we say on social networking sites like Facebook and Twitter. Basically, what were wholly social systems have become socio-technical systems.

This change is important, for both the good and the bad of what it allows.

An example might make this clearer. In a small town, everyone knows each other, and lenders can make decisions about whom to loan money to, based on reputation (like in the movie “It’s a Wonderful Life”). The system isn’t perfect; it is prone to “old-boy network” preferences and discrimination against outsiders. The real problem, though, is that the system doesn’t scale. To enable lending on a larger scale, we replaced personal reputation with a technological system: credit reports and scores. They work well, and allow us to borrow money from strangers halfway across the country—and lending has exploded in our society, in part because of it. But the new system can be attacked technologically. Someone could hack the credit bureau’s database and enhance her reputation by boosting her credit score. Or she could steal someone else’s reputation. All sorts of attacks that just weren’t possible with a wholly personal reputation system become possible against a system that works as a technological reputation system.

We like socio-technical systems of reputation because they empower us in so many ways. People can achieve a level of fame and notoriety much more easily on the Internet. Totally new ways of making a living—think of Uber and Airbnb, or popular bloggers and YouTubers—become possible. But the downsides are considerable. The hacker tactic of social engineering involves fooling someone by hijacking the reputation of someone else. Most social media companies make their money leeching off our activities on their sites. And because we trust the reputational information from these socio-technical systems, anyone who can figure out how to game those systems can artificially boost their reputation. Amazon, eBay, Yelp, and others have been trying to deal with fake reviews for years. And you can buy Twitter followers and Facebook likes cheap.

Reputation has always been gamed. It’s been an eternal arms race between those trying to artificially enhance their reputation and those trying to detect those enhancements. In that respect, nothing is new here. But technology changes the mechanisms of both enhancement and enhancement detection. There’s power to be had on either side of that arms race, and it’ll be interesting to watch each side jockeying for the upper hand.
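A toy sketch makes the gaming concrete (all names and numbers here are invented for illustration): a naive average-rating reputation score, of the kind review sites compute, shifts dramatically once cheaply purchased fake reviews are injected.

```python
# Illustrative only: how fake reviews move a naive average-rating
# reputation score. Real sites use more robust aggregation, which
# is exactly the detection side of the arms race.

def average_rating(ratings):
    return sum(ratings) / len(ratings)

honest = [2, 3, 2, 4, 3, 2, 3]  # a mediocre product's real reviews
fake = [5] * 20                 # twenty bought five-star reviews

print(round(average_rating(honest), 2))         # 2.71
print(round(average_rating(honest + fake), 2))  # 4.41
```

Defenses like weighting reviewers by account history, or capping the influence of any one burst of reviews, are attempts to make this kind of injection expensive rather than trivial.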

This essay is part of a conversation with Gloria Origgi entitled “What is Reputation?”
http://edge.org/conversation/…


Schneier News

The German edition of “Data and Goliath” has been published.
https://www.m-vg.de/redline/shop/article/…
Here’s a review:
http://www.nzz.ch/wirtschaft/…

I appeared on a C-SPAN panel with Jessica Stern (co-author of “ISIS”) and Gabriella Blum (co-author of “The Future of Violence”).
http://www.c-span.org/video/?400412-2/…

I was profiled in the Engineering Ethics blog.
https://www.schneier.com/news/archives/2015/11/…


On CISA

I have avoided writing about the Cybersecurity Information Sharing Act (CISA), largely because the details kept changing. (For those not following closely, similar bills were passed by both the House and the Senate. They’re now being combined into a single bill which will be voted on again, and then almost certainly signed into law by President Obama.)

Now that it’s pretty solid, I find that I don’t have to write anything, because Danny Weitzner did such a good job, writing about how the bill encourages companies to share personal information with the government, allows them to take some offensive measures against attackers (or innocents, if they get it wrong), waives privacy protections, and gives companies immunity from prosecution.

Information sharing is essential to good cybersecurity, and we need more of it. But CISA is a really bad law.

http://qz.com/543692/…

This is good, too:
http://www.esquire.com/news-politics/politics/news/…


Voter Surveillance

There hasn’t been that much written about surveillance and big data being used to manipulate voters. In “Data and Goliath,” I wrote:

Unique harms can arise from the use of surveillance data in politics. Election politics is very much a type of marketing, and politicians are starting to use personalized marketing’s capability to discriminate as a way to track voting patterns and better “sell” a candidate or policy position. Candidates and advocacy groups can create ads and fund-raising appeals targeted to particular categories: people who earn more than $100,000 a year, gun owners, people who have read news articles on one side of a particular issue, unemployed veterans…anything you can think of. They can target outraged ads to one group of people, and thoughtful policy-based ads to another. They can also fine-tune their get-out-the-vote campaigns on Election Day, and more efficiently gerrymander districts between elections. Such use of data will likely have fundamental effects on democracy and voting.

A new research paper looks at the trends.
http://library.queensu.ca/ojs/index.php/…


Resilient Systems News

I’ll be participating in an end-of-year trends and predictions webinar on Thursday, December 17, at 1:00 PM EST. Join me here.
http://info.resilientsystems.com/…

In other news, Resilient has joined the IBM Security App Exchange community.
https://www.resilientsystems.com/news/news-releases/…

And we’re still hiring for a bunch of positions.
https://www.resilientsystems.com/company/careers


Worldwide Cryptographic Products Survey: Edits and Additions Wanted

Back in September, I announced my intention to survey the world market of cryptographic products. The goal is to compile a list of both free and commercial encryption products that can be used to protect arbitrary data and messages. That is, I’m not interested in products that are specifically designed for a narrow application, like financial transactions, or products that provide authentication or data integrity. I am interested in products that people like FBI director James Comey can possibly claim help criminals communicate securely.

Together with a student here at Harvard University, I’ve compiled a spreadsheet of over 400 products from many different countries.

At this point, we would like your help. Please look at the list. Please correct anything that is wrong, and add anything that is missing. Use this form to submit changes and additions. If it’s more complicated than that, please e-mail me.

As the rhetoric surrounding weakening or banning strong encryption continues, it’s important for policymakers to understand how international the cryptographic market is, and how much of it is not under their control. My hope is that this survey will contribute to the debate by making that point.

Original announcement:
https://www.schneier.com/blog/archives/2015/09/…

Current spreadsheet:
https://docs.google.com/spreadsheets/d/…

Google form for submissions and comments:
https://docs.google.com/forms/d/…


Security vs. Business Flexibility

This article demonstrates that security is less important than functionality.

When asked about their preference if they needed to choose between IT security and business flexibility, 71 percent of respondents said that security should be equally or more important than business flexibility.

But show them the money and things change: when the same people were asked if they would take the risk of a potential security threat in order to achieve the biggest deal of their life, 69 percent of respondents said they would take the risk.

The reactions I’ve read call this a sad commentary on security, but I think it’s a perfectly reasonable result. Security is important, but when there’s an immediate conflicting requirement, security takes a back seat. I don’t think this is a problem of security literacy, or of awareness, or of training. It’s a consequence of our natural proclivity to take risks when the rewards are great.

Given the option, I would choose the security threat, too.

In the IT world, we need to recognize this reality. We need to build security that’s flexible and adaptable, that can respond to and mitigate security breaches, and can maintain security even in the face of business executives who would deliberately bypass security protection measures to achieve the biggest deal of their lives.

This essay previously appeared on the Resilient Systems blog.
https://www.resilientsystems.com/blog-post/…

http://betanews.com/2015/11/20/…


Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <https://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Bruce Schneier is an internationally renowned security technologist, called a “security guru” by The Economist. He is the author of 12 books—including “Liars and Outliers: Enabling the Trust Society Needs to Survive”—as well as hundreds of articles, essays, and academic papers. His influential newsletter “Crypto-Gram” and his blog “Schneier on Security” are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press. Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation’s Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Chief Technology Officer at Resilient Systems, Inc. See <https://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of Resilient Systems, Inc.

Copyright (c) 2015 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.