Crypto-Gram

February 15, 2018

by Bruce Schneier
CTO, IBM Resilient
schneier@schneier.com
https://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <https://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <https://www.schneier.com/crypto-gram/archives/2018/…>. These same essays and news items appear in the “Schneier on Security” blog at <https://www.schneier.com/>, along with a lively and intelligent comment section. An RSS feed is available.


In this issue:
      The Effects of the Spectre and Meltdown Vulnerabilities
      News
      After Section 702 Reauthorization
      Schneier News
      Cabinet of Secret Documents from Australia


The Effects of the Spectre and Meltdown Vulnerabilities

On January 3, the world learned about a series of major security vulnerabilities in modern microprocessors. Called Spectre and Meltdown, these vulnerabilities were discovered by several different researchers last summer, disclosed to the microprocessors’ manufacturers, and patched—at least to the extent possible.

This news isn’t really any different from the usual endless stream of security vulnerabilities and patches, but it’s also a harbinger of the sorts of security problems we’re going to be seeing in the coming years. These are vulnerabilities in computer hardware, not software. They affect virtually all high-end microprocessors produced in the last 20 years. Patching them requires large-scale coordination across the industry, and in some cases drastically affects the performance of the computers. And sometimes patching isn’t possible; the vulnerability will remain until the computer is discarded.

Spectre and Meltdown aren’t anomalies. They represent a new area to look for vulnerabilities and a new avenue of attack. They’re the future of security—and it doesn’t look good for the defenders.

Modern computers do lots of things at the same time. Your computer and your phone simultaneously run several applications—or apps. Your browser has several windows open. A cloud computer runs applications for many different computers. All of those applications need to be isolated from each other. For security, one application isn’t supposed to be able to peek at what another one is doing, except in very controlled circumstances. Otherwise, a malicious advertisement on a website you’re visiting could eavesdrop on your banking details, or the cloud service purchased by some foreign intelligence organization could eavesdrop on every other cloud customer, and so on. The companies that write browsers, operating systems, and cloud infrastructure spend a lot of time making sure this isolation works.

Both Spectre and Meltdown break that isolation, deep down at the microprocessor level, by exploiting performance optimizations that have been implemented for the past decade or so. Basically, microprocessors have become so fast that they spend a lot of time waiting for data to move in and out of memory. To increase performance, these processors guess what data they’re going to receive and execute instructions based on that. If the guess turns out to be correct, it’s a performance win. If it’s wrong, the microprocessors throw away what they’ve done without losing any time. This feature is called speculative execution.

Spectre and Meltdown attack speculative execution in different ways. Meltdown is more of a conventional vulnerability; the designers of the speculative-execution process made a mistake, so they just needed to fix it. Spectre is worse; it’s a flaw in the very concept of speculative execution. There’s no way to patch that vulnerability; the chips need to be redesigned in such a way as to eliminate it.
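
To make this more concrete, here is a minimal sketch in C of the kind of “bounds check bypass” gadget described in the Spectre paper (linked below). The array names and sizes are invented for the illustration, and the cache-timing measurement needed to actually recover a secret is omitted; the point is only to show how code that looks safely bounds-checked can still leak data speculatively.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Illustrative victim code, loosely after the "variant 1" example in
     * the Spectre paper. Names and sizes are made up for this sketch. */

    #define ARRAY1_SIZE 16

    uint8_t array1[ARRAY1_SIZE];   /* data the caller may legitimately index */
    uint8_t array2[256 * 4096];    /* probe array: one page per byte value */
    volatile uint8_t sink;

    void victim(size_t x)
    {
        /* The bounds check is architecturally correct: an out-of-range x
         * never changes the program's visible results. But if the branch
         * predictor has been trained to expect "taken," the processor may
         * run the body speculatively with x out of range, read a byte it
         * shouldn't, and touch a cache line whose address depends on that
         * byte. The speculative work is discarded, yet the cache footprint
         * remains and can be measured with a timing side channel
         * (flush+reload), which this sketch leaves out. */
        if (x < ARRAY1_SIZE)
            sink = array2[array1[x] * 4096];
    }

    int main(void)
    {
        memset(array1, 1, sizeof array1);
        memset(array2, 1, sizeof array2);

        /* Legitimate in-bounds calls; a real attack interleaves calls like
         * these (to train the branch predictor) with out-of-bounds indices
         * chosen to reach secrets elsewhere in the process's memory. */
        for (size_t i = 0; i < ARRAY1_SIZE; i++)
            victim(i);

        printf("victim ran; this sketch does not recover any secret\n");
        return 0;
    }

Meltdown abuses a related effect, a speculative read that crosses the user/kernel privilege boundary, which is why operating systems could mitigate it by unmapping kernel memory from user processes. Spectre-style gadgets like the one above can appear in any victim’s own code, which is part of why it is so much harder to fix.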

Since the announcement, manufacturers have been rolling out patches to these vulnerabilities to the extent possible. Operating systems have been patched so that attackers can’t make use of the vulnerabilities. Web browsers have been patched. Chips have been patched. From the user’s perspective, these are routine fixes. But several aspects of these vulnerabilities illustrate the sorts of security problems we’re only going to be seeing more of.

First, attacks against hardware, as opposed to software, will become more common. Last fall, vulnerabilities were discovered in Intel’s Management Engine, a remote-administration feature on its microprocessors. Like Spectre and Meltdown, they affected how the chips operate. Looking for vulnerabilities on computer chips is new. Now that researchers know this is a fruitful area to explore, security researchers, foreign intelligence agencies, and criminals will be on the hunt.

Second, because microprocessors are fundamental parts of computers, patching requires coordination between many companies. Even when manufacturers like Intel and AMD can write a patch for a vulnerability, computer makers and application vendors still have to customize and push the patch out to the users. This makes it much harder to keep vulnerabilities secret while patches are being written. Spectre and Meltdown were announced prematurely because details were leaking and rumors were swirling. Situations like this give malicious actors more opportunity to attack systems before they’re guarded.

Third, these vulnerabilities will affect computers’ functionality. In some cases, the patches for Spectre and Meltdown result in significant reductions in speed. The press initially reported 30%, but that only seems true for certain servers running in the cloud. For your personal computer or phone, the performance hit from the patch is minimal. But as more vulnerabilities are discovered in hardware, patches will affect performance in noticeable ways.

And then there are the unpatchable vulnerabilities. For decades, the computer industry has kept things secure by finding vulnerabilities in fielded products and quickly patching them. Now there are cases where that doesn’t work. Sometimes it’s because computers are in cheap products that don’t have a patch mechanism, like many of the DVRs and webcams that are vulnerable to the Mirai (and other) botnets—groups of Internet-connected devices sabotaged for coordinated digital attacks. Sometimes it’s because a computer chip’s functionality is so core to a computer’s design that patching it effectively means turning the computer off. This, too, is becoming more common.

Increasingly, everything is a computer: not just your laptop and phone, but your car, your appliances, your medical devices, and global infrastructure. These computers are and always will be vulnerable, but Spectre and Meltdown represent a new class of vulnerability. Unpatchable vulnerabilities in the deepest recesses of the world’s computer hardware are the new normal. They’re going to leave us all much more vulnerable in the future.

This essay previously appeared on TheAtlantic.com.
https://www.theatlantic.com/technology/archive/2018/…

https://www.nytimes.com/2018/01/03/business/…
https://zeltser.com/…
https://blog.barkly.com/meltdown-spectre-bugs-explained

Research papers:
https://meltdownattack.com/meltdown.pdf
https://spectreattack.com/spectre.pdf

Intel Management Engine flaw:
https://www.wired.com/story/…

Early story on Spectre/Meltdown:
https://www.theregister.co.uk/2018/01/02/…

Performance effects of patches:
https://www.forbes.com/sites/brookecrothers/2018/01/…

Everything is a computer:
https://www.theatlantic.com/technology/archive/2017/…


News

Jim Risen writes a long and interesting article about his battles with the US government and the “New York Times” to report government secrets.
https://theintercept.com/2018/01/03/…

Interesting article by Major General Hao Yeli, Chinese People’s Liberation Army (ret.), a senior advisor at the China International Institute for Strategic Society, Vice President of China Institute for Innovation and Development Strategy, and the Chair of the Guanchao Cyber Forum.
http://cco.ndu.edu/Portals/96/Documents/prism/…

Student cracks the Inca knot code.
https://www.bostonglobe.com/metro/massachusetts/…

Interesting research: “Long-term market implications of data breaches, not,” by Russell Lange and Eric W. Burger. The market isn’t going to fix this. If we want better security, we need to regulate the market.
http://www.tandfonline.com/doi/full/10.1080/…
https://s2erc.georgetown.edu/sites/s2erc/files/…
http://ceur-ws.org/Vol-1816/paper-18.pdf

The EFF and Lookout are reporting on a new piece of spyware operating out of Lebanon. It primarily targets mobile devices compromised by fake versions of secure messaging clients like Signal and WhatsApp.
https://www.eff.org/press/releases/…
https://blog.lookout.com/dark-caracal-mobile-apt
https://info.lookout.com/rs/051-ESQ-475/images/…
https://www.theregister.co.uk/2018/01/18/…
https://www.theverge.com/2018/1/18/16905464/…
https://www.forbes.com/sites/thomasbrewster/2018/01/…

Kaspersky Labs is reporting on a new piece of sophisticated malware for Android: Skygofree.
https://securelist.com/…
https://arstechnica.com/information-technology/2018/…
https://boingboing.net/2018/01/17/hacking-team-2-0.html

Several new strains of malware hijack cryptocurrency mining. So far it hasn’t been very profitable, but it—or some later version—eventually will be.
https://arstechnica.com/information-technology/2018/…

Detecting drone surveillance with traffic analysis. The details have to do with the way drone video is compressed; a toy sketch of the idea follows the links below.
https://www.wired.com/story/…
https://arxiv.org/pdf/1801.03074.pdf
https://www.youtube.com/watch?v=4icQwducz68
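
The core idea, as I understand it, is that encrypted drone video still leaks information through its bitrate: video codecs send more data when the scene changes, so if you physically modulate the scene (say, by flashing a light in a known on/off pattern) and the intercepted stream’s per-second byte counts track that pattern, the drone is probably watching you. Here is a toy sketch of that correlation test in C. It’s my illustration of the general idea, not the researchers’ code, and all the numbers are invented; their paper is linked just above.

    #include <math.h>
    #include <stdio.h>

    /* Toy illustration: does the intercepted stream's bitrate track a
     * known on/off stimulus? All values below are invented. */

    static double pearson(const double *x, const double *y, int n)
    {
        double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx  += x[i];        sy  += y[i];
            sxx += x[i] * x[i]; syy += y[i] * y[i];
            sxy += x[i] * y[i];
        }
        double cov = sxy - sx * sy / n;
        double vx  = sxx - sx * sx / n;
        double vy  = syy - sy * sy / n;
        return cov / sqrt(vx * vy);
    }

    int main(void)
    {
        /* Known physical stimulus: a light flashed on/off, one sample per second. */
        double stimulus[] = { 1, 0, 1, 0, 1, 1, 0, 0, 1, 0 };
        /* Observed kilobytes per second of the encrypted video stream. */
        double kbytes[]   = { 410, 285, 395, 300, 420, 405, 290, 280, 415, 295 };
        int n = (int)(sizeof stimulus / sizeof stimulus[0]);

        double r = pearson(stimulus, kbytes, n);
        printf("correlation = %.2f %s\n", r,
               r > 0.8 ? "(bitrate tracks the stimulus: likely being watched)"
                       : "(no strong relationship)");
        return 0;
    }

A real detector has to cope with noise, encoder buffering, and timing jitter; this sketch skips all of that and keeps only the correlation step.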

A new vulnerability in WhatsApp has been discovered:
https://www.wired.com/story/…
Matthew Green has a good description:
https://blog.cryptographyengineering.com/2018/01/10/…
Here’s the research paper.
https://eprint.iacr.org/2017/713.pdf
Commentary from Moxie Marlinspike, the developer of the protocol.
https://news.ycombinator.com/item?id=16117487

It’s really hard to estimate the cost of an insecure Internet. Studies are all over the map. A methodical study by RAND is the best work I’ve seen at trying to put a number on this. The results are, well, all over the map. “The resulting values are highly sensitive to input parameters; for instance, the global cost of cyber crime has direct gross domestic product (GDP) costs of $275 billion to $6.6 trillion and total GDP costs (direct plus systemic) of $799 billion to $22.5 trillion (1.1 to 32.4 percent of GDP).”
https://www.rand.org/pubs/research_reports/RR2299.html
Here’s RAND’s risk calculator, if you want to play with the parameters yourself.
https://www.rand.org/pubs/tools/TL281.html
Note: I was an advisor to the project.

Separately, Symantec has published a new cybercrime report with its own statistics.
https://www.symantec.com/content/dam/symantec/docs/…

In November, the company Strava released an anonymous data-visualization map showing all the fitness activity by everyone using the app. In late January, someone realized that it could be used to locate secret military bases: just look for repeated fitness activity in the middle of nowhere.
https://www.theguardian.com/world/2018/jan/28/…
https://twitter.com/jack_dot_bin/status/…
https://labs.strava.com/heatmap/

Local residents are opposing adding an elevator to a subway station because terrorists might use it to detonate a bomb. No, really. There’s no actual threat analysis, only fear.
https://www.nytimes.com/2018/01/22/nyregion/…
In 2005, I coined the term “movie-plot threat” to denote a threat scenario that caused undue fear solely because of its specificity. Longtime readers of this blog will remember my annual Movie-Plot Threat Contests. I ended the contest in 2015 because I thought the meme had played itself out. Clearly there’s more work to be done.

According to this story, Israeli scientists released some information to the public they shouldn’t have.
https://www.haaretz.com/israel-news/…
https://archive.is/7iwNu
http://archive.is/4JEiB
Israeli officials have managed to ensure that the Haaretz article doesn’t contain any actual information about what was released. I have reason to believe the information is related to Internet security. Does anyone know more?
https://www.schneier.com/blog/archives/2018/01/…

Brian Krebs is reporting sophisticated jackpotting attacks against US ATMs. The attacker gains physical access to the ATM, plants malware using specialized electronics, and then later returns and forces the machine to dispense all the cash it has inside.
https://krebsonsecurity.com/2018/01/…

Stuxnet famously used legitimate digital certificates to sign its malware. A research paper from last year found that the practice is much more common than previously thought.
http://www.umiacs.umd.edu/~tdumitra/papers/CCS-2017.pdf
https://arstechnica.com/information-technology/2017/…

A CNN reporter found some sensitive—but, technically, not classified—documents about Super Bowl security in the front pocket of an airplane seat.
https://www.usatoday.com/story/news/nation/2018/02/…

The “Guardian” is reporting that “every NHS trust assessed for cyber security vulnerabilities has failed to meet the standard required.”
https://www.theguardian.com/technology/2018/feb/05/…
https://www.theregister.co.uk/2018/02/06/…
https://www.recode.net/2017/6/27/15881666/…

A water utility in Europe has been infected by cryptocurrency mining software. This is a relatively new attack: hackers compromise computers and force them to mine cryptocurrency for them. This is the first time I’ve seen it infect SCADA systems, though. It seems that this mining software is benign, and doesn’t affect the performance of the hacked computer. (A smart virus doesn’t kill its host.) But that’s not going to always be the case.
http://www.eweek.com/security/…
https://thehackernews.com/2018/01/…

In “The House that Spied on Me,” Kashmir Hill outfits her home to be as “smart” as possible and writes about the results.
https://gizmodo.com/…

Internet security threats at the Olympics:
https://www.lawfareblog.com/…

There has already been one attack:
https://www.reuters.com/article/…

Nice profile of Mordechai Guri, who researches a variety of clever ways to steal data from air-gapped computers.
https://www.wired.com/story/…
https://boingboing.net/2018/02/07/…


After Section 702 Reauthorization

For over a decade, civil libertarians have been fighting government mass surveillance of innocent Americans over the Internet. We’ve just lost an important battle. On January 18, President Trump signed the renewal of Section 702, and domestic mass surveillance became effectively a permanent part of US law.

Section 702 was initially passed in 2008, as an amendment to the Foreign Intelligence Surveillance Act of 1978. As the title of that law says, it was billed as a way for the NSA to spy on non-Americans located outside the United States. It was supposed to be an efficiency and cost-saving measure: the NSA was already permitted to tap communications cables located outside the country, and it was already permitted to tap communications cables from one foreign country to another that passed through the United States. Section 702 allowed it to tap those cables from inside the United States, where it was easier. It also allowed the NSA to request surveillance data directly from Internet companies under a program called PRISM.

The problem is that this authority also gave the NSA the ability to collect foreign communications and data in a way that inherently and intentionally swept up Americans’ communications as well, without a warrant. Other law enforcement agencies are allowed to ask the NSA to search those communications, the NSA can give their contents to the FBI and other agencies, and those agencies can then lie about their origins in court.

In 1978, after Watergate had revealed the Nixon administration’s abuses of power, we erected a wall between intelligence and law enforcement that prevented precisely this kind of sharing of surveillance data under any authority less restrictive than the Fourth Amendment. Weakening that wall is incredibly dangerous, and the NSA should never have been given this authority in the first place.

Arguably, it never was. The NSA had been doing this type of surveillance illegally for years, something that was first made public in 2006. Section 702 was secretly used as a way to paper over that illegal collection, but nothing in the text of the later amendment gives the NSA this authority. We didn’t know that the NSA was using this law as the statutory basis for this surveillance until Edward Snowden showed us in 2013.

Civil libertarians have been battling this law in both Congress and the courts ever since it was proposed, and the NSA’s domestic surveillance activities even longer. What this most recent vote tells me is that we’ve lost that fight.

Section 702 was passed under George W. Bush in 2008, reauthorized under Barack Obama in 2012, and now reauthorized again under Trump. In all three cases, congressional support was bipartisan. It has survived multiple lawsuits by the Electronic Frontier Foundation, the ACLU, and others. It has survived the revelations by Snowden that it was being used far more extensively than Congress or the public believed, and numerous public reports of violations of the law. It has even survived Trump’s belief that he was being personally spied on by the intelligence community, as well as any congressional fears that Trump could abuse the authority in the coming years. And though this extension lasts only six years, it’s inconceivable to me that it will ever be repealed at this point.

So what do we do? If we can’t fight this particular statutory authority, where’s the new front on surveillance? There are, it turns out, reasonable modifications that target surveillance more generally, and not in terms of any particular statutory authority. We need to look at US surveillance law more generally.

First, we need to strengthen the minimization procedures to limit incidental collection. Since the Internet was developed, all the world’s communications travel around in a single global network. It’s impossible to collect only foreign communications, because they’re invariably mixed in with domestic communications. This is called “incidental” collection, but that’s a misleading name. It’s collected knowingly, and searched regularly. The intelligence community needs much stronger restrictions on which American communications channels it can access without a court order, and rules that require it to delete the data if it inadvertently collects any. More importantly, “collection” should be defined as the point at which the NSA takes a copy of the communications, and not later when it searches its databases.

Second, we need to limit how other law enforcement agencies can use incidentally collected information. Today, those agencies can query a database of incidental collection on Americans. The NSA can legally pass information to those other agencies. This has to stop. Data collected by the NSA under its foreign surveillance authority should not be used as a vehicle for domestic surveillance.

The most recent reauthorization modified this lightly, forcing the FBI to obtain a court order when querying the 702 data for a criminal investigation. There are still exceptions and loopholes, though.

Third, we need to end what’s called “parallel construction.” Today, when a law enforcement agency uses evidence found in this NSA database to arrest someone, it doesn’t have to disclose that fact in court. It can reconstruct the evidence in some other manner once it knows about it, and then pretend it learned of it that way. This right to lie to the judge and the defense is corrosive to liberty, and it must end.

Pressure to reform the NSA will probably first come from Europe. Already, European Union courts have pointed to warrantless NSA surveillance as a reason to keep Europeans’ data out of US hands. Right now, there is a fragile agreement between the EU and the United States—called “Privacy Shield”—that requires American companies to maintain certain safeguards for international data flows. NSA surveillance goes against that, and it’s only a matter of time before EU courts start ruling this way. That’ll have significant effects on both government and corporate surveillance of Europeans and, by extension, the entire world.

Further pressure will come from the increased surveillance coming from the Internet of Things. When your home, car, and body are awash in sensors, privacy from both governments and corporations will become increasingly important. Sooner or later, society will reach a tipping point where it’s all too much. When that happens, we’re going to see significant pushback against surveillance of all kinds. That’s when we’ll get new laws that revise all government authorities in this area: a clean sweep for a new world, one with new norms and new fears.

It’s possible that a federal court will rule on Section 702. Although there have been many lawsuits challenging the legality of what the NSA is doing and the constitutionality of the 702 program, no court has ever ruled on those questions. The Bush and Obama administrations successfully argued that defendants don’t have legal standing to sue. That is, they have no right to sue because they don’t know they’re being targeted. If any of the lawsuits can get past that, things might change dramatically.

Meanwhile, much of this is the responsibility of the tech sector. This problem exists primarily because Internet companies collect and retain so much personal data and allow it to be sent across the network with minimal security. Since the government has abdicated its responsibility to protect our privacy and security, these companies need to step up: Minimize data collection. Don’t save data longer than absolutely necessary. Encrypt what has to be saved. Well-designed Internet services will safeguard users, regardless of government surveillance authority.
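
As a rough sketch of what “minimize, expire, and encrypt” might look like in code, here is an illustration using libsodium’s secretbox API. The record layout, field names, and 30-day retention period are all invented for the example, and a real service would also need careful key management, which this leaves out.

    #include <sodium.h>
    #include <stdio.h>
    #include <string.h>
    #include <time.h>

    /* Illustrative stored record: only the ciphertext, its nonce, and an
     * expiry time are kept -- not the plaintext, and nothing the service
     * doesn't actually need. */
    struct stored_record {
        unsigned char nonce[crypto_secretbox_NONCEBYTES];
        unsigned char ciphertext[256 + crypto_secretbox_MACBYTES];
        size_t        ciphertext_len;
        time_t        expires_at;            /* purge after this time */
    };

    #define RETENTION_SECONDS (30 * 24 * 3600)   /* 30 days, for the example */

    int main(void)
    {
        if (sodium_init() < 0) {
            fprintf(stderr, "libsodium init failed\n");
            return 1;
        }

        /* In a real service the key lives in a key-management system,
         * not next to the data; generated here only for the sketch. */
        unsigned char key[crypto_secretbox_KEYBYTES];
        crypto_secretbox_keygen(key);

        /* Minimize: store only the fields the service actually needs. */
        const char *needed = "user=1234;last_login=2018-02-15";

        struct stored_record rec;
        randombytes_buf(rec.nonce, sizeof rec.nonce);
        crypto_secretbox_easy(rec.ciphertext,
                              (const unsigned char *)needed, strlen(needed),
                              rec.nonce, key);
        rec.ciphertext_len = strlen(needed) + crypto_secretbox_MACBYTES;
        rec.expires_at = time(NULL) + RETENTION_SECONDS;

        /* Expire: on each access, or in a periodic sweep, anything past its
         * retention window is wiped rather than archived "just in case." */
        if (time(NULL) > rec.expires_at) {
            sodium_memzero(&rec, sizeof rec);
            printf("record purged\n");
        } else {
            printf("stored %zu encrypted bytes; purge due at %ld\n",
                   rec.ciphertext_len, (long)rec.expires_at);
        }
        return 0;
    }

None of this is a substitute for collecting less in the first place, but it illustrates the direction.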

For the rest of us concerned about this, it’s important not to give up hope. Everything we do to keep the issue in the public eye—and not just when the authority comes up for reauthorization again in 2024—hastens the day when we will reaffirm our rights to privacy in the digital age.

This essay previously appeared in the “Washington Post.”
https://www.washingtonpost.com/news/posteverything/…

https://www.politico.com/story/2018/01/19/…
https://www.wired.com/story/…

“Incidental collection”:
https://www.eff.org/pages/Incidental-collection

“Parallel construction:”
https://theintercept.com/2017/11/30/…

2006 surveillance abuses:
http://www.nytimes.com/2006/04/13/us/…

2013 surveillance abuses from Snowden:
https://www.theguardian.com/world/2013/aug/09/…

702 compliance violations:
https://www.newamerica.org/oti//…

Trump’s beliefs:
http://www.washingtonexaminer.com/…

Privacy Shield:
https://www.privacyshield.gov/


Schneier News

I spoke at Columbia University’s School of International and Public Affairs with Jason Healey and Merit Janow on 2/8:
http://isoc-ny.org/p2/9891


Cabinet of Secret Documents from Australia

This story of leaked Australian government secrets is unlike any other I’ve heard:

It begins at a second-hand shop in Canberra, where ex-government furniture is sold off cheaply.

The deals can be even cheaper when the items in question are two heavy filing cabinets to which no-one can find the keys.

They were purchased for small change and sat unopened for some months until the locks were attacked with a drill.

Inside was the trove of documents now known as The Cabinet Files.

The thousands of pages reveal the inner workings of five separate governments and span nearly a decade.

Nearly all the files are classified, some as “top secret” or “AUSTEO”, which means they are to be seen by Australian eyes only.

Yes, that really happened. The person who bought and opened the filing cabinets contacted the Australian Broadcasting Corp, which is now publishing a bunch of the documents.

There’s lots of interesting (and embarrassing) stuff in the documents, although most of it is local politics. I am more interested in the government’s reaction to the incident: they’re pushing for a law making it illegal for the press to publish government secrets it received through unofficial channels. From the coverage of the proposed law (links below):

“The one thing I would point out about the legislation that does concern me particularly is that classified information is an element of the offence,” he said.

“That is to say, if you’ve got a filing cabinet that is full of classified information … that means all the Crown has to prove if they’re prosecuting you is that it is classified—nothing else.

“They don’t have to prove that you knew it was classified, so knowledge is beside the point.”

[…]

Many groups have raised concerns, including media organisations who say they unfairly target journalists trying to do their job.

But really anyone could be prosecuted just for possessing classified information, regardless of whether they know about it.

That might include, for instance, if you stumbled across a folder of secret files in a regular skip bin while walking home and handed it over to a journalist.

This illustrates a fundamental misunderstanding of the threat. The Australian Broadcasting Corp gets its funding from the government, and was very restrained in what it published. It waited months before publishing while it coordinated with the Australian government. It allowed the government to secure the files, and then returned them. From the government’s perspective, it was the best possible media outlet to receive this information. If the government makes it illegal for the Australian press to publish this sort of material, the next time it will be sent to the BBC, the Guardian, the New York Times, or WikiLeaks. And since people no longer read their news from newspapers sold in stores but on the Internet, the result will be just as many people reading the stories, with far fewer redactions.

The proposed law is older than this leak, but the leak is giving it new life. The Australian opposition party is being cagey on whether they will support the law. They don’t want to appear weak on national security, so I’m not optimistic.

http://www.abc.net.au/news/2018-01-31/…
https://www.npr.org/sections/thetwo-way/2018/01/31/…
https://www.theguardian.com/commentisfree/2017/dec/…

The new law:
http://www.abc.net.au/news/2018-02-02/…
https://www.theaustralian.com.au/business/media/…

How ABC worked with the Australian government:
https://www.theguardian.com/australia-news/2018/feb/…
https://www.theguardian.com/media/2018/feb/02/…

Australia backed down on the new law:
https://www.theguardian.com/australia-news/2018/feb/…

A great political cartoon:
https://www.fairfaxstatic.com.au/content/dam/images/…


Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <https://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Bruce Schneier is an internationally renowned security technologist, called a “security guru” by The Economist. He is the author of 12 books—including “Liars and Outliers: Enabling the Trust that Society Needs to Thrive”—as well as hundreds of articles, essays, and academic papers. His influential newsletter “Crypto-Gram” and his blog “Schneier on Security” are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press. Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation’s Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and CTO of IBM Resilient and Special Advisor to IBM Security. See <https://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of IBM Resilient.

Copyright (c) 2018 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.