August 15, 2015
by Bruce Schneier
CTO, Resilient Systems, Inc.
A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.
For back issues, or to subscribe, visit <https://www.schneier.com/crypto-gram.html>.
You can read this issue on the web at <https://www.schneier.com/crypto-gram/archives/2015/…>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively and intelligent comment section. An RSS feed is available.
In this issue:
- Moving Crypto-Gram: An Update
- Backdoors Won’t Solve Comey’s Going Dark Problem
- Another Salvo in the Second Crypto War (of Words)
- Cosa Nostra Dead Drops
- Bizarre High-Tech Kidnapping
- Fugitive Located by Spotify
- Schneier News
- Intimidating Military Personnel by Targeting Their Families
- Stagefright Vulnerability in Android Phones
Moving Crypto-Gram: An Update
Last month I announced that the Crypto-Gram mailing list would be moving to a new home on Dreamhost, and you’d all have to reconfirm your subscriptions in order to keep receiving Crypto-Gram. That didn’t happen because—at the last minute—I found out the new host had bugs in its bounce processing system, and they admitted the issues would probably never be fixed.
I still plan to move the list, but I’m not ready to announce a date or a destination yet. When the move happens, you won’t have to do anything to confirm that you still want to be subscribed—at least not right away. (I might still have to make some of you reconfirm your subscriptions later, in order to clean up the list and reduce problems with spam blocking services, but I’ll avoid it if I can.)
I’m sorry for all the confusion, and I’m especially sorry that the list’s subscribe and unsubscribe features were down for an extended period during the move that didn’t happen. They’re back up now, and you can unsubscribe by using the link at the very end of this e-mail. If you’re reading this after the list has moved, and find the link doesn’t work, you can always e-mail firstname.lastname@example.org and I’ll make sure you get removed.
Backdoors Won’t Solve Comey’s Going Dark Problem
At the Aspen Security Forum two weeks ago, James Comey (and others) explicitly talked about the “going dark” problem, describing the specific scenario they are concerned about. Maybe others have heard the scenario before, but it was a first for me. It centers on ISIL operatives abroad and ISIL-inspired terrorists here in the US. The FBI knows who the Americans are and can get a court order to carry out surveillance on their communications, but cannot eavesdrop on the conversations, because they are encrypted. They can get the metadata, so they know who is talking to whom, but they can’t find out what’s being said.
“ISIL’s M.O. is to broadcast on Twitter, get people to follow them, then move them to Twitter Direct Messaging” to evaluate if they are a legitimate recruit, he said. “Then they’ll move them to an encrypted mobile-messaging app so they go dark to us.”
The FBI can get court-approved access to Twitter exchanges, but not to encrypted communication, Comey said. Even when the FBI demonstrates probable cause and gets a judicial order to intercept that communication, it cannot break the encryption for technological reasons, according to Comey.
If this is what Comey and the FBI are actually concerned about, they’re getting bad advice—because their proposed solution won’t solve the problem. Comey wants communications companies to give them the capability to eavesdrop on conversations without the conversants’ knowledge or consent; that’s the “backdoor” we’re all talking about. But the problem isn’t that most encrypted communications platforms are securely encrypted, or even that some are—the problem is that there exists at least one securely encrypted communications platform on the planet that ISIL can use.
Imagine that Comey got what he wanted. Imagine that iMessage and Facebook and Skype and everything else US-made had his backdoor. The ISIL operative would tell his potential recruit to use something else, something secure and non-US-made. Maybe an encryption program from Finland, or Switzerland, or Brazil. Maybe Mujahedeen Secrets. Maybe anything. (Sure, some of these will have flaws, and they’ll be identifiable by their metadata, but the FBI already has the metadata, and the better software will rise to the top.) As long as there is *something* that the ISIL operative can move them to, some software that the American can download and install on their phone or computer, or hardware that they can buy from abroad, the FBI still won’t be able to eavesdrop.
And by pushing these ISIL operatives to non-US platforms, the FBI loses access to the metadata it otherwise would have.
Convincing US companies to install backdoors isn’t enough; in order to solve this going dark problem, the FBI has to ensure that an American can only use backdoored software. And the only way to do that is to prohibit the use of non-backdoored software, which is the sort of thing that the UK’s David Cameron said he wanted for his country in January:
But the question is are we going to allow a means of communications which it simply isn’t possible to read. My answer to that question is: no, we must not.
And that, of course, is impossible. Jonathan Zittrain explained why. And Cory Doctorow outlined what trying would entail:
For David Cameron’s proposal to work, he will need to stop Britons from installing software that comes from software creators who are out of his jurisdiction. The very best in secure communications are already free/open source projects, maintained by thousands of independent programmers around the world. They are widely available, and thanks to things like cryptographic signing, it is possible to download these packages from any server in the world (not just big ones like Github) and verify, with a very high degree of confidence, that the software you’ve downloaded hasn’t been tampered with.
This, then, is what David Cameron is proposing:
* All Britons’ communications must be easy for criminals, voyeurs and foreign spies to intercept.
* Any firms within reach of the UK government must be banned from producing secure software.
* All major code repositories, such as Github and Sourceforge, must be blocked.
* Search engines must not answer queries about web-pages that carry secure software.
* Virtually all academic security work in the UK must cease—security research must only take place in proprietary research environments where there is no onus to publish one’s findings, such as industry R&D and the security services.
* All packets in and out of the country, and within the country, must be subject to Chinese-style deep-packet inspection and any packets that appear to originate from secure software must be dropped.
* Existing walled gardens (like iOS and games consoles) must be ordered to ban their users from installing secure software.
* Anyone visiting the country from abroad must have their smartphones held at the border until they leave.
* Proprietary operating system vendors (Microsoft and Apple) must be ordered to redesign their operating systems as walled gardens that only allow users to run software from an app store, which will not sell or give secure software to Britons.
* Free/open source operating systems—that power the energy, banking, ecommerce, and infrastructure sectors—must be banned outright.
As extreme as it reads, without all of that, the ISIL operative would be able to communicate securely with his potential American recruit. And all of this is not going to happen.
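Doctorow’s point about cryptographic signing (that a package fetched from any mirror can be checked against a digest published by its authors) can be sketched minimally. The package bytes and digest below are hypothetical; a real release would also sign the digest file with the project’s release key, a step this sketch only notes in a comment:

```python
import hashlib

def matches_published_digest(data: bytes, published_sha256: str) -> bool:
    # Compare the download against the digest the project published.
    # In real release workflows the digest file itself is signed (e.g.,
    # with the project's GPG release key), so a hostile mirror cannot
    # substitute both the package and a matching digest.
    return hashlib.sha256(data).hexdigest() == published_sha256

# Hypothetical package contents and the digest its authors published:
package = b"example release tarball"
published = hashlib.sha256(package).hexdigest()

print(matches_published_digest(package, published))            # True
print(matches_published_digest(package + b"\x00", published))  # False
```

This is why blocking “big” repositories accomplishes so little: the bits can come from anywhere, and verification happens on the recipient’s machine.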
Last week, former NSA director Mike McConnell, former DHS secretary Michael Chertoff, and former deputy defense secretary William Lynn published a Washington Post op-ed opposing backdoors in encryption software. They wrote:
Today, with almost everyone carrying a networked device on his or her person, ubiquitous encryption provides essential security. If law enforcement and intelligence organizations face a future without assured access to encrypted communications, they will develop technologies and techniques to meet their legitimate mission goals.
I believe this is true. Already one is being talked about in the academic literature: lawful hacking.
Perhaps the FBI’s reluctance to accept this is based on their belief that all encryption software comes from the US, and therefore is under their influence. Back in the 1990s, during the first Crypto Wars, the US government had a similar belief. To convince them otherwise, George Washington University surveyed the cryptography market in 1999 and found that there were over 500 companies in 70 countries manufacturing or distributing non-US cryptography products. Maybe we need a similar study today.
This essay previously appeared on Lawfare.
Aspen Security Forum:
Comey’s remarks at the forum:
Identifying encryption programs from the metadata:
What Cameron wants:
Washington Post op-ed:
The First Crypto Wars:
George Washington University survey from 1999:
Another Salvo in the Second Crypto War (of Words)
Prosecutors from New York, London, Paris, and Madrid wrote an op-ed in yesterday’s New York Times in favor of backdoors in cell phone encryption. There are a number of flaws in their argument, ranging from how easy it is to get data off an encrypted phone to the dangers of designing a backdoor in the first place, but all of that has been said before. And since anecdote can be more persuasive than data, the op-ed started with one:
In June, a father of six was shot dead on a Monday afternoon in Evanston, Ill., a suburb 10 miles north of Chicago. The Evanston police believe that the victim, Ray C. Owens, had also been robbed. There were no witnesses to his killing, and no surveillance footage either.
With a killer on the loose and few leads at their disposal, investigators in Cook County, which includes Evanston, were encouraged when they found two smartphones alongside the body of the deceased: an iPhone 6 running on Apple’s iOS 8 operating system, and a Samsung Galaxy S6 Edge running on Google’s Android operating system. Both devices were passcode protected.
You can guess the rest. A judge issued a warrant, but neither Apple nor Google could unlock the phones. “The homicide remains unsolved. The killer remains at large.”
The Intercept researched the example, and it seems to be real. The phones belonged to the victim, and…
According to Commander Joseph Dugan of the Evanston Police Department, investigators were able to obtain records of the calls to and from the phones, but those records did not prove useful. By contrast, interviews with people who knew Owens suggested that he communicated mainly through text messages—the kind that travel as encrypted data—and had made plans to meet someone shortly before he was shot.
The information on his phone was not backed up automatically on Apple’s servers—apparently because he didn’t use wi-fi, which backups require.
But Dugan also wasn’t as quick to lay the blame solely on the encrypted phones. “I don’t know if getting in there, getting the information, would solve the case,” he said, “but it definitely would give us more investigative leads to follow up on.”
This is the first actual example I’ve seen illustrating the value of a backdoor. Unlike the increasingly common example of an ISIL handler abroad communicating securely with a radicalized person in the US, it’s an example where a backdoor might have helped. I say “might have,” because the Galaxy S6 is not encrypted by default, which means the victim deliberately turned the encryption on. If the native smartphone encryption had been backdoored, we don’t know if the victim would have turned it on nevertheless, or if he would have employed a different, non-backdoored, app.
The authors’ other examples are much sloppier:
Between October and June, 74 iPhones running the iOS 8 operating system could not be accessed by investigators for the Manhattan district attorney’s office—despite judicial warrants to search the devices. The investigations that were disrupted include the attempted murder of three individuals, the repeated sexual abuse of a child, a continuing sex trafficking ring and numerous assaults and robberies.
In France, smartphone data was vital to the swift investigation of the Charlie Hebdo terrorist attacks in January, and the deadly attack on a gas facility at Saint-Quentin-Fallavier, near Lyon, in June. And on a daily basis, our agencies rely on evidence lawfully retrieved from smartphones to fight sex crimes, child abuse, cybercrime, robberies or homicides.
We’ve heard that 74 number before. It’s over nine months, in an office that handles about 100,000 cases a year: less than 0.1% of the time. Details about those cases would be useful, so we can determine if encryption was just an impediment to investigation, or resulted in a criminal going free. The government needs to do a better job of presenting empirical data to support its case for backdoors. That they’re unable to do so suggests very strongly that an empirical analysis wouldn’t favor the government’s case.
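As a back-of-the-envelope check on that figure (assuming the office’s caseload is spread evenly across the year):

```python
# 74 inaccessible iPhones over roughly nine months, in an office
# handling about 100,000 cases per year.
cases_per_year = 100_000
cases_in_nine_months = cases_per_year * 9 / 12   # ~75,000 cases
fraction = 74 / cases_in_nine_months

print(f"{fraction:.4%}")   # about 0.0987%, i.e. under 0.1% of cases
```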
As to the Charlie Hebdo case, it’s not clear how much of that vital smartphone data was actual data, and how much of it was unable-to-be-encrypted metadata. I am reminded of the examples that then-FBI-Director Louis Freeh would give during the First Crypto Wars in the 1990s. The big one used to illustrate the dangers of encryption was Mafia boss John Gotti. But the surveillance that convicted him was a room bug, not a wiretap. Given that the examples from FBI Director James Comey’s “going dark” speech last year were bogus, skepticism in the face of anecdote seems prudent.
So much of this “going dark” versus the “golden age of surveillance” debate depends on where you start from. Referring to that first Evanston example and the inability to get evidence from the victim’s phones, the op-ed authors write: “Until very recently, this situation would not have occurred.” That’s utter nonsense. From the beginning of time until very recently, this was the only situation that could have occurred. Objects in the vicinity of an event were largely mute about the past. Few things, save for eyewitnesses, could ever reach back in time and produce evidence. Even 15 years ago, the victim’s cell phone would have had no evidence on it that couldn’t have been obtained elsewhere, and that’s if the victim had been carrying a cell phone at all.
For most of human history, surveillance has been expensive. Over the last couple of decades, it has become incredibly cheap and almost ubiquitous. That a few bits and pieces are becoming expensive again isn’t a cause for alarm.
The essay originally appeared on the Lawfare blog:
Getting data off an iPhone:
The dangers of designing back doors:
The Evanston case:
How the ISIL Example is flawed:
Galaxy S6 encryption:
That “74” number:
The government’s data doesn’t support the arguments:
Cell phone data in the Charlie Hebdo shooting:
John Gotti surveillance:
Comey’s “going dark” speech:
The bogus examples from that speech:
Comey’s “going dark” essay:
The golden age of surveillance:
The declining cost of surveillance:
The ubiquitousness of surveillance:
Excellent parody/commentary: “When Curtains Block Justice.”
The ProxyHam project (and associated Def Con talk) has been canceled under mysterious circumstances. No one seems to know anything, and conspiracy theories abound.
How to build your own ProxyHam:
Micah Lee has a good tutorial on installing and using secure chat.
Google secures photos using public but unguessable URLs. It’s a perfectly valid security measure, although unsettling to some.
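The security of an unguessable URL comes entirely from entropy: the random path segment must be long enough that enumerating the URL space is infeasible. A minimal sketch, with a hypothetical host name:

```python
import secrets

# token_urlsafe(24) encodes 24 random bytes (192 bits) as 32 URL-safe
# characters; with roughly 10^57 possibilities, guessing a valid URL
# by enumeration is hopeless in practice.
token = secrets.token_urlsafe(24)
url = f"https://photos.example.com/p/{token}"   # hypothetical host

print(len(token))   # 32
print(url.startswith("https://photos.example.com/p/"))   # True
```

The caveat, and the source of the unease, is that such a URL is a bearer token: anyone who sees it (in a browser history, a referrer header, a shared screenshot) can use it.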
The—depending on who is doing the reporting—cheating, affair, adultery, or infidelity site Ashley Madison has been hacked. The hackers are threatening to expose all of the company’s documents, including internal e-mails and details of its 37 million customers.
In an essay/review of a book on UK intelligence officer and Soviet spy Kim Philby, Malcolm Gladwell makes an interesting observation.
Preventing book theft in the Middle Ages:
Hackers can remotely take control of the Uconnect system in cars just by knowing the car’s IP address. They can disable the brakes, turn on the AC, blast music, and disable the transmission.
In related news, there’s a Senate bill to improve car security standards. Honestly, I’m not sure our security technology is enough to prevent this sort of thing if the car’s controls are attached to the Internet.
A worker in Amazon’s packaging department in India figured out how to deliver electronics to himself.
This is an interesting article that looks at Hacking Team’s purchasing of zero-day (0day) vulnerabilities from a variety of sources. Lots of details in the article. This was made possible by the organizational doxing of Hacking Team by some unknown individuals or group.
Michael Chertoff speaks out against backdoors.
Commentary, and former Director of the National Counterterrorism Center Michael Leiter’s comments.
New research: “All Your Biases Belong To Us: Breaking RC4 in WPA-TKIP and TLS,” by Mathy Vanhoef and Frank Piessens.
It’s common wisdom that the NSA was unable to intercept phone calls from Khalid al-Mihdhar in San Diego to bin Laden in Yemen because of legal restrictions. This has been used to justify the NSA’s massive phone metadata collection programs. James Bamford argues that there were no legal restrictions, and that the NSA screwed up.
New paper: “‘…no one can hack my mind’: Comparing Expert and Non-Expert Security Practices,” by Iulia Ion, Rob Reeder, and Sunny Consolvo.
FireEye has a detailed report of a sophisticated piece of Russian malware: HAMMERTOSS. It uses some clever techniques to hide.
John Mueller has a good essay on how the ISIS threat is overblown.
Brink’s sells an Internet-enabled smart safe called the CompuSafe Galileo. Despite being sold as a more secure safe, it’s wildly insecure.
The details all sound familiar. The computer industry learned its lessons over a decade ago. Before then, it ignored security vulnerabilities, threatened researchers, and generally behaved very badly. I expect the same things to happen with Internet-of-Things companies.
A Kentucky man shot down a drone that was hovering in his backyard. He was arrested, but what is the law?
New research can identify a person by reading their thermal signature in complete darkness and then matching it with ordinary photographs.
Nicholas Weaver has an excellent essay on iPhone security. In it, he explains how Apple could enable surveillance on iMessage and FaceTime.
There’s a persistent rumor going around that Apple is in the secret FISA Court, fighting a government order to make its platform more surveillance-friendly—and they’re losing. This might explain Apple CEO Tim Cook’s somewhat sudden vehemence about privacy. I have not found any confirmation of the rumor.
Good fictional account of an average computer user and how people understand and view security.
Related: “Real World Use Cases for High-Risk Users.”
Before Edward Snowden told us so much about NSA surveillance, before Mark Klein told us a little, even before 9/11, Duncan Campbell broke the story of ECHELON. This is his story of that story. It’s a fascinating read.
(Yes, it turns out that NSA mass surveillance didn’t start after 9/11.)
Interesting research detecting betrayal in the game of Diplomacy by analyzing interplayer messages.
Back when I was in high school, I briefly published a postal Diplomacy zine.
Local police are trying to convince drug dealers to turn each other in by pointing out that it reduces competition.
The British Museum wants help breaking a code on a 13th-century sword.
Good Q&A with Cynthia Dwork on algorithmic bias.
Cosa Nostra Dead Drops
Good operational security is hard, and often uses manual technologies:
Investigators described how Messina Denaro, 53, disdains telecommunications and relies on handwritten notes, or “pizzini,” to relay orders. The notes were wadded tight, covered in tape and hidden under rocks or dug into soil until go-betweens retrieved them. The messages were ordered destroyed after being read.
That’s a classic dead drop.
Bizarre High-Tech Kidnapping
This is a story of a very high-tech kidnapping:
FBI court filings unsealed last week showed how Denise Huskins’ kidnappers used anonymous remailers, image sharing sites, Tor, and other people’s Wi-Fi to communicate with the police and the media, scrupulously scrubbing meta data from photos before sending. They tried to use computer spyware and a DropCam to monitor the aftermath of the abduction and had a Parrot radio-controlled drone standing by to pick up the ransom by remote control.
The story also demonstrates just how effective the FBI is at tracing cell phone usage these days. They had a blocked call from the kidnappers to the victim’s cell phone. First they served AT&T with a search warrant to get the actual calling number. After learning that it was an AT&T prepaid Tracfone, they went back to AT&T to find out where the burner was bought, what its serial numbers were, and where the calls were made from.
The FBI reached out to Tracfone, which was able to tell the agents that the phone was purchased from a Target store in Pleasant Hill on March 2 at 5:39 pm. Target provided the bureau with a surveillance-cam photo of the buyer: a white male with dark hair and medium build. AT&T turned over records showing the phone had been used within 650 feet of a cell site in South Lake Tahoe.
The criminal complaint borders on surreal. Were it an episode of CSI:Cyber, you would never believe it.
Fugitive Located by Spotify
The latest in identification by data:
Webber said a tipster had spotted recent activity from Nunn on the Spotify streaming service and alerted law enforcement. He scoured the Internet for other evidence of Nunn and Barr’s movements, eventually filling out 12 search warrants for records at different technology companies. Those searches led him to an IP address that traced Nunn to Cabo San Lucas, Webber said.
Nunn, he said, had been avidly streaming television shows and children’s programs on various online services, giving the sheriff’s department a hint to the couple’s location.
Schneier News
There’s a new Tumblr: “Meerkats that Look Like Bruce Schneier.” I have nothing to do with it.
I’m speaking—remotely via Skype—at LinuxCon in Seattle on August 18, 2015.
I’m speaking at CloudSec in Singapore on August 25, 2015.
I’m speaking at MindTheSec in São Paulo, Brazil, on August 27, 2015.
I’m speaking on the future of privacy at a public seminar sponsored by the Institute for Future Studies, in Stockholm, Sweden, on September 21, 2015.
I’m speaking at Next Generation Threats 2015 in Stockholm, Sweden, on September 22, 2015.
I’m speaking at Next Generation Threats 2015 in Gothenburg, Sweden, on September 23, 2015.
I’m speaking at Free and Safe in Cyberspace in Brussels on September 24, 2015.
I’ll be on a panel at Privacy. Security. Risk. 2015 in Las Vegas on September 30, 2015.
I’m speaking at the Privacy + Security Forum, October 21-23, 2015, at The Marvin Center in Washington, DC.
I’m speaking at the Boston Book Festival on October 24, 2015.
I’m speaking at the 4th Annual Cloud Security Congress EMEA in Berlin on November 17, 2015.
Intimidating Military Personnel by Targeting Their Families
This FBI alert is interesting:
(U//FOUO) In May 2015, the wife of a US military member was approached in front of her home by two Middle-Eastern males. The men stated that she was the wife of a US interrogator. When she denied their claims, the men laughed. The two men left the area in a dark-colored, four-door sedan with two other Middle-Eastern males in the vehicle. The woman had observed the vehicle in the neighborhood on previous occasions.
(U//FOUO) Similar incidents in Wyoming have been reported to the FBI throughout June 2015. On numerous occasions, family members of military personnel were confronted by Middle-Eastern males in front of their homes. The males have attempted to obtain personal information about the military member and family members through intimidation. The family members have reported feeling scared.
The report says nothing about whether these are isolated incidents, a trend, or part of a larger operation. But it has gotten me thinking about the new ways military personnel can be intimidated. More and more military personnel live here and work there, remotely as drone pilots, intelligence analysts, and so on, and their military and personal lives intertwine to a degree we have not seen before. There will be some interesting security repercussions from that.
Stagefright Vulnerability in Android Phones
The Stagefright vulnerability for Android phones is a bad one. It’s exploitable via a text message (whether it requires any user interaction depends on the particular phone’s auto-download settings), it runs at an elevated privilege (again, the severity depends on the particular phone; on some phones it’s full privilege), and it’s trivial to weaponize. Imagine a worm that infects a phone and then immediately sends a copy of itself to everyone on that phone’s contact list.
The worst part of this is that it’s an Android exploit, so most phones won’t be patched anytime soon—if ever. (The people who discovered the bug alerted Google in April. Google has sent patches to its phone manufacturer partners, but most of them have not sent the patch to Android phone users.)
Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <https://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.
Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Bruce Schneier is an internationally renowned security technologist, called a “security guru” by The Economist. He is the author of 12 books—including “Liars and Outliers: Enabling the Trust Society Needs to Survive”—as well as hundreds of articles, essays, and academic papers. His influential newsletter “Crypto-Gram” and his blog “Schneier on Security” are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press. Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation’s Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Chief Technology Officer at Resilient Systems, Inc. See <https://www.schneier.com>.
Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of Resilient Systems, Inc.
Copyright (c) 2015 by Bruce Schneier.