Blog: October 2014 Archives
The Intercept has published the complete manuals for Hacking Team's attack software. This follows a detailed report on Hacking Team's products from August. Hacking Team sells computer and cell phone hacking capabilities to the governments of Azerbaijan, Colombia, Egypt, Ethiopia, Hungary, Italy, Kazakhstan, Korea, Malaysia, Mexico, Morocco, Nigeria, Oman, Panama, Poland, Saudi Arabia, Sudan, Thailand, Turkey, UAE, and Uzbekistan...and probably others as well.
This is important. The NSA's capabilities are not unique to the NSA. They're not even unique to countries like the US, UK, China, Russia, France, Germany, and Israel. They're available for purchase by any totalitarian country that wants to spy on foreign governments or its own citizens. By ensuring an insecure Internet for everyone, the NSA enables companies like Hacking Team to thrive.
Worry about Ebola (or anything) manifests physically as what's known as a fight, flight, or freeze response. Biological systems ramp up or down to focus the body's resources on the threat at hand. Heart rate and blood pressure increase, immune function is suppressed (after an initial burst), brain chemistry changes, and the normal functioning of the digestive system is interrupted, among other effects. Like fear itself, these changes are protective in the short term. But when they persist, the changes prompted by chronic stress -- defined as stress beyond the normal hassles of life, lasting at least one to two weeks -- are associated with increased risk of cardiovascular disease (the leading cause of death in America); increased likelihood and severity of clinical depression (suicide is the 10th leading cause of death in America); depressed memory formation and recall; impaired fertility; reduced bone growth; and gastrointestinal disorders.
Perhaps most insidious of all, by suppressing our immune systems, chronic stress makes us more likely to catch infectious diseases, or suffer more -- or die -- from diseases that a healthy immune system would be better able to control. The fear of Ebola may well have an impact on how many people get sick or die from influenza this flu season, and on how severely. (The CDC reports that, either directly or indirectly, influenza kills between 3,000 and 49,000 people per year.)
There is no question that America's physical, economic, and social health is far more at risk from the fear of Ebola than from the virus itself.
EDITED TO ADD (10/30): The State of Louisiana is prohibiting researchers who have recently been to Ebola-infected countries from attending a conference on tropical medicine. So now we're at a point where our fear of Ebola is inhibiting scientific research into treating and curing Ebola.
Turning to the crime section of the Chapman Survey on American Fears, the team discovered findings that not only surprised them, but also those who work in fields pertaining to crime.
"What we found when we asked a series of questions pertaining to fears of various crimes is that a majority of Americans not only fear crimes such as child abduction, gang violence, sexual assaults, and others; but they also believe these crimes (and others) have increased over the past 20 years," said Dr. Edward Day, who led this portion of the research and analysis. "When we looked at statistical data from police and FBI records, it showed crime has actually decreased in America in the past 20 years. Criminologists often get angry responses when we try to tell people the crime rate has gone down."
Despite evidence to the contrary, Americans do not feel like the United States is becoming a safer place. The Chapman Survey on American Fears asked respondents how they think the prevalence of several crimes today compares with 20 years ago. In all cases, the clear majority of respondents were pessimistic, and in all cases Americans believe crime has at least remained steady. Crimes specifically asked about were: child abduction, gang violence, human trafficking, mass riots, pedophilia, school shootings, serial killing, and sexual assault.
EDITED TO ADD (11/13): Direct link to the data as well as the survey methodology.
The latest version of Apple's OS automatically syncs your files to iCloud Drive, even files you choose to store locally. Apple encrypts your data, both in transit and in iCloud, with a key it knows. Apple, of course, complies with all government requests: FBI warrants, subpoenas, and National Security Letters -- as well as NSA PRISM and whatever-else-they-have demands.
EDITED TO ADD (10/28): See comments. This seems to be way overstated. I will look at this again when I have time, probably tomorrow.
EDITED TO ADD (10/28): This is a more nuanced discussion of this issue. At this point, it seems clear that there is a lot less here than described in the blog post below.
EDITED TO ADD (10/29): There is something here. It only affects unsaved documents, and not all applications. But the OS's main text editor is one of them. Yes, this feature has been in the OS for a while, but that's not a defense. It's both dangerous and poorly documented.
There's a report that the FBI has identified a second leaker:
The case in question involves an Aug. 5 story published by The Intercept, an investigative website co-founded by Glenn Greenwald, the reporter who first published sensitive NSA documents obtained from Snowden.
Headlined "Barack Obama's Secret Terrorist-Tracking System, by the Numbers," the story cited a classified government document showing that nearly half the people on the U.S. government's master terrorist screening database had "no recognized terrorist affiliation."
The story, co-authored by Jeremy Scahill and Ryan Devereaux, was accompanied by a document "obtained from a source in the intelligence community" providing details about the watch-listing system that were dated as late as August 2013, months after Snowden fled to Hong Kong and revealed himself as the leaker of thousands of top secret documents from the NSA.
Here's a physical attack against a credit card verification system. Basically, the attack disrupts the communications between the retail terminal and the system that identifies revoked credit cards. Since retailers generally default to accepting cards when the system doesn't work, the attack is generally successful.
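The reason the attack works is the fail-open default described above. Here is a minimal sketch of that decision logic (names are hypothetical; no real terminal works exactly like this):

```python
# Sketch of fail-open authorization: when the revocation check is
# unreachable, the terminal approves rather than declines.
def authorize(card_number, revoked_lookup):
    """revoked_lookup(card) returns True if the card is revoked;
    it raises ConnectionError if the verification system is unreachable."""
    try:
        if revoked_lookup(card_number):
            return "DECLINE"  # card is known to be revoked
        return "APPROVE"
    except ConnectionError:
        # Fail-open: retailers would rather eat an occasional bad charge
        # than turn away every customer during an outage -- which is
        # exactly what disrupting the link exploits.
        return "APPROVE"

def jammed_lookup(card_number):
    raise ConnectionError("link to verification system disrupted")

print(authorize("4111-0000-REVOKED", jammed_lookup))  # prints APPROVE
```

Flipping the `except` branch to `"DECLINE"` (fail-closed) defeats the attack, at the cost of halting all sales whenever the link is down.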
RC4 is an example of what I think of as a too-good-to-be-true cipher. It looks so simple. It is so simple. In classic cryptographic terms, it's a single rotor machine. It's a single self-modifying rotor, but it modifies itself very slowly. Even so, it's very hard to cryptanalyze. Even though the single rotor leaks information about its internal state with every output byte, its self-modifying structure always seems to stay ahead of analysis. But RC4 has been around for over 25 years, and the best attacks are at the edge of practicality. When I talk about what sorts of secret cryptographic advances the NSA might have, a practical RC4 attack is one of the possibilities.
Spritz is Rivest and Schuldt's redesign of RC4. It retains all of the problems that RC4 had. It's built on a 256-element array of bytes, making it less than ideal for modern 32-bit and 64-bit CPUs. It's not very fast. (It's 50% slower than RC4, which was already much slower than algorithms like AES and Threefish.) It has a long key setup. But it's a very clever design.
Here are the cores of RC4 and Spritz:
1: i = i + 1
2: j = j + S[i]
3: SWAP(S[i], S[j])
4: z = S[S[i] + S[j]]
5: Return z
1: i = i + w
2: j = k + S[j + S[i]]
2a: k = i + k + S[j]
3: SWAP(S[i], S[j])
4: z = S[j + S[i + S[z + k]]]
5: Return z
S is an 8-bit permutation. In theory, it can be any size, which is nice for analysis, but in practice, it's a 256-element array. RC4 has two pointers into the array: i and j. Spritz adds a third: k. The parameter w is basically a constant. It's always 1 in RC4, but can be any odd number in Spritz (odd because that means it's always relatively prime to 256). In both ciphers, i slowly walks around the array, and j -- or j and k -- bounce around wildly. Both have a single swap of two elements of the array. And both produce an output byte, z, a function of all the other parameters. In Spritz, the previous z is part of the calculation of the current z.
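For readers who want to run the RC4 half of that comparison, here is the standard algorithm in Python: the key schedule plus the core loop above, with the swap written out. (This is plain, public RC4, not Spritz.)

```python
def rc4(key: bytes):
    """Generator yielding the RC4 keystream for `key`."""
    # Key-scheduling algorithm (KSA): build the initial 256-byte permutation.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Output loop (PRGA): the "single self-modifying rotor" described above.
    i = j = 0
    while True:
        i = (i + 1) % 256             # step 1
        j = (j + S[i]) % 256          # step 2
        S[i], S[j] = S[j], S[i]       # step 3: the single swap
        yield S[(S[i] + S[j]) % 256]  # step 4: output byte z

ks = rc4(b"Key")
ct = bytes(p ^ next(ks) for p in b"Plaintext")
print(ct.hex().upper())  # BBF316E8D940AF0AD3 (well-known test vector)
```

Encrypting "Plaintext" under the key "Key" reproduces the widely published RC4 test vector, which is an easy way to check the implementation.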
That's the core. There are also functions for turning the key into the initial array permutation, using this as a stream cipher, using it as a hash function, and so on. It's basically a sponge function, so it has a lot of applications.
What's really interesting here is the way Rivest and Schuldt chose their various functions. They basically tried them all (given some constraints), and chose the ones with the best security properties. This is the sort of thing that can only be done with massive computing power.
I have always really liked RC4, and am happy to see a 21st-century redesign. I don't know what kind of use it'll get with its 8-bit word size, but surely there's a niche for it somewhere.
Interesting paper: Maya Embar, Louis M. McHough IV, and William R. Wesselman, "Printer watermark obfuscation," RIIT '14: Proceedings of the 3rd Annual Conference on Research in Information Technology:
Abstract: Most color laser printers manufactured and sold today add "invisible" information to make it easier to determine when a particular document was printed and exactly which printer was used. Some manufacturers have acknowledged the existence of the tracking information in their documentation while others have not. None of them have explained exactly how it works or the scope of the information that is conveyed. There are no laws or regulations that require printer companies to track printer users this way, and none that prevent them from ceasing this practice or providing customers a means to opt out of being tracked. The tracking information is coded by patterns of yellow dots that the printers add to every page they print. The details of the patterns vary by manufacturer and printer model.
EDITED TO ADD (11/14): List of printers and whether or not they display tracking dots (may not be up to date).
Susan Landau has a new paper on the NSA's increasing role in commercial cybersecurity. She argues that the NSA is the wrong organization to do this, and we need a more public and open government agency involved in commercial cybersecurity.
Last week, Adi Shamir gave a presentation at Black Hat Europe on using all-in-one printers to control computers on the other side of air gaps. There's no paper yet, but two publications reported on the talk:
Theoretically, if a malicious program is installed on an air-gapped computer by an unsuspecting user via, say, a USB thumb drive, attackers should have a hard time controlling the malicious program or stealing data through it because there is no Internet connection.
But the researchers found that if a multifunction printer is attached to such a computer, attackers could issue commands to a malicious program running on it by flashing visible or infrared light at the scanner lid when open.
The researchers observed that if a source of light is pointed repeatedly at the white coating on the inside of the scanner's lid during a scanning operation, the resulting image will have a series of white lines on darker background. Those lines correspond to the pulses of light hitting the lid and their thickness depends on the duration of the pulses, Shamir explained.
Using this observation, the researchers developed a Morse-code-like system that can be used to send pulses of light at different intervals and interpret the resulting lines as binary data: 1s and 0s. Malware running on an air-gapped system could be programmed to initiate a scanning operation at a certain time -- for example, during the night -- and then interpret the commands sent by attackers using the technique from far away.
Shamir estimated that several hundred bits of data can be sent during a single scan. That's enough to send small commands that can activate various functionality built into the malware.
This technique can be used to send commands into an air-gapped computer network, and to exfiltrate data from that network.
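No paper or code from the talk has been published, so the following is only my guess at the encoding, with invented names and units: bits become light pulses of two lengths, which show up as thin or thick white lines in the scanned image.

```python
# Illustrative sketch of the covert channel's framing (hypothetical;
# the actual encoding from the talk hasn't been published). A light
# pulse during a scan leaves a white line whose thickness is
# proportional to the pulse duration, so bits become short/long pulses.
SHORT, LONG = 1, 3  # pulse durations in arbitrary scan-line units

def encode(bits):
    """Turn a bit string into a list of pulse durations for the attacker's light."""
    return [LONG if b == "1" else SHORT for b in bits]

def decode(line_thicknesses, threshold=2):
    """Recover bits from the white-line thicknesses the malware measures
    in the scanned image."""
    return "".join("1" if t > threshold else "0" for t in line_thicknesses)

cmd = "101100"  # a small command for the malware
assert decode(encode(cmd)) == cmd
```

Several hundred bits per scan, as Shamir estimated, is plenty for short commands under a scheme like this.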
The Guardian has reported that the app Whisper tracks users, and then published a second article explaining what it knows after Whisper denied the story. Here's Whisper's denial; be sure to also read the first comment from Moxie Marlinspike.
There is a misconception that building a lawful intercept solution into a system requires a so-called "back door," one that foreign adversaries and hackers may try to exploit.
But that isn't true. We aren't seeking a back-door approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law. We are completely comfortable with court orders and legal process -- front doors that provide the evidence and information we need to investigate crime and prevent terrorist attacks.
Cyber adversaries will exploit any vulnerability they find. But it makes more sense to address any security risks by developing intercept solutions during the design phase, rather than resorting to a patchwork solution when law enforcement comes knocking after the fact. And with sophisticated encryption, there might be no solution, leaving the government at a dead end -- all in the name of privacy and network security.
I'm not sure why he believes he can have a technological means of access that somehow only works for people of the correct morality with the proper legal documents, but he seems to believe that's possible. As Jeffrey Vagle and Matt Blaze point out, there's no technical difference between Comey's "front door" and a "back door."
As in all of these sorts of speeches, Comey gave examples of crimes that could have been solved had only the police been able to decrypt the defendant's phone. Unfortunately, none of the three stories is true. The Intercept tracked down each story, and none of them is actually a case where encryption foiled an investigation, arrest, or conviction:
In the most dramatic case that Comey invoked -- the death of a 2-year-old Los Angeles girl -- not only was cellphone data a non-issue, but records show the girl's death could actually have been avoided had government agencies involved in overseeing her and her parents acted on the extensive record they already had before them.
In another case, of a Louisiana sex offender who enticed and then killed a 12-year-old boy, the big break had nothing to do with a phone: The murderer left behind his keys and a trail of muddy footprints, and was stopped nearby after his car ran out of gas.
And in the case of a Sacramento hit-and-run that killed a man and his girlfriend's four dogs, the driver was arrested in a traffic stop because his car was smashed up, and immediately confessed to involvement in the incident.
His poor examples, however, were reminiscent of one cited by Ronald T. Hosko, a former assistant director of the FBI's Criminal Investigative Division, in a widely cited -- and thoroughly debunked -- Washington Post opinion piece last month.
In that case, the Post was eventually forced to have Hosko rewrite the piece, with the following caveat appended:
Editors note: This story incorrectly stated that Apple and Google's new encryption rules would have hindered law enforcement's ability to rescue the kidnap victim in Wake Forest, N.C. This is not the case. The piece has been corrected.
Hadn't Comey found anything better since then? In a question-and-answer session after his speech, Comey both denied trying to use scare stories to make his point -- and admitted that he had launched a nationwide search for better ones, to no avail.
This is important. All the FBI talk about "going dark" and losing the ability to solve crimes is absolute bullshit. There is absolutely no evidence, either statistically or even anecdotally, that criminals are going free because of encryption.
So why are we even discussing the possibility of forcing companies to provide insecure encryption to their users and customers?
The EFF points out that companies are protected by law from being required to provide insecure security to make the FBI happy.
Sadly, I don't think this is going to go away anytime soon.
My first post on these new Crypto Wars is here.
The FBI claims that it found the Silk Road server by examining plain text Internet traffic to and from the Silk Road CAPTCHA, and that it visited the address using a regular browser and received the CAPTCHA page. But [Nicholas] Weaver says the traffic logs from the Silk Road server (PDF) that also were released by the government this week tell a different story.
"The server logs which the FBI provides as evidence show that, no, what happened is the FBI didn't see a leakage coming from that IP," he said. "What happened is they contacted that IP directly and got a PHPMyAdmin configuration page." See this PDF file for a look at that PHPMyAdmin page. Here is the PHPMyAdmin server configuration.
But this is hardly a satisfying answer to how the FBI investigators located the Silk Road servers. After all, if the FBI investigators contacted the PHPMyAdmin page directly, how did they know to do that in the first place?
"That's still the $64,000 question," Weaver said. "So both the CAPTCHA couldn't leak in that configuration, and the IP the government visited wasn't providing the CAPTCHA, but instead a PHPMyAdmin interface. Thus, the leaky CAPTCHA story is full of holes."
My guess is that the NSA provided the FBI with this information. We know that the NSA provides surveillance data to the FBI and the DEA, under the condition that they lie about where it came from in court.
NSA whistleblower William Binney explained how it's done:
...when you can't use the data, you have to go out and do a parallel construction, [which] means you use what you would normally consider to be investigative techniques, [and] go find the data. You have a little hint, though. NSA is telling you where the data is...
As usual, you can also use this squid post to talk about the security stories in the news that I haven't covered.
Commenting has been broken for the past few days. We hope to get it fixed on Monday.
ECI is a classification above Top Secret. It's for things that are so sensitive they're basically not written down, like the names of companies whose cryptography has been deliberately weakened by the NSA, or the names of agents who have infiltrated foreign IT companies.
As part of the Intercept story on the NSA's using agents to infiltrate foreign companies and networks, it published a list of ECI compartments. It's just a list of code names and three-letter abbreviations, along with the group inside the NSA that is responsible for them. The descriptions of what they all mean would never be in a computer file, so it's only of value to those of us who like code names.
This designation is why there have been no documents in the Snowden archive listing specific company names. They're all referred to by these ECI code names.
EDITED TO ADD (11/10): Another compilation of NSA's organizational structure.
This is a creepy story. A woman has her phone seized by the Drug Enforcement Administration and gives them permission to look at her phone. Without her knowledge or consent, they steal photos off of the phone (the article says they were "racy") and use them to set up a fake Facebook page in her name.
The woman sued the government over this. Extra creepy was the government's defense in court: "Defendants admit that Plaintiff did not give express permission for the use of photographs contained on her phone on an undercover Facebook page, but state the Plaintiff implicitly consented by granting access to the information stored in her cell phone and by consenting to the use of that information to aid in an ongoing criminal investigations [sic]."
The article was edited to say: "Update: Facebook has removed the page and the Justice Department said it is reviewing the incident." So maybe this is just an overzealous agent and not official DEA policy.
But as Marcy Wheeler said, this is a good reason to encrypt your cell phone.
A few days ago, I saw this tweet: "Just a reminder that it is now *a full year* since Schneier cited it, and the FOXACID ops manual remains unpublished." It's true.
The citation is this:
According to a top-secret operational procedures manual provided by Edward Snowden, an exploit named Validator might be the default, but the NSA has a variety of options. The documentation mentions United Rake, Peddle Cheap, Packet Wrench, and Beach Head, all delivered from a FOXACID subsystem called Ferret Cannon.
Back when I broke the QUANTUM and FOXACID programs, I talked with the Guardian editors about publishing the manual. In the end, we decided not to, because the information in it wasn't useful to understanding the story. It's been a year since I've seen it, but I remember it being just what I called it: an operational procedures manual. It talked about what to type into which screens, and how to deal with error conditions. It didn't talk about capabilities, either technical or operational. I found it interesting, but it was hard to argue that it was necessary in order to understand the story.
It will probably never be published. I lost access to the Snowden documents soon after writing that essay -- Greenwald broke with the Guardian, and I have never been invited back by the Intercept -- and there's no one looking at the documents with an eye to writing about the NSA's technical capabilities and how to securely design systems to protect against government surveillance. Even though we now know that the same capabilities are being used by other governments and cyber criminals, there's much more interest in stories with political ramifications.
The latest Intercept article on the Snowden documents talks about the NSA's undercover operatives working in foreign companies. There are no specifics, although the countries China, Germany, and South Korea are mentioned. It's also hard to tell if the NSA has undercover operatives working in companies in those countries, or has undercover contractors visiting those companies. The document is dated 2004, although there's no reason to believe that the NSA has changed its behavior since then.
The most controversial revelation in Sentry Eagle might be a fleeting reference to the NSA infiltrating clandestine agents into "commercial entities." The briefing document states that among Sentry Eagle's most closely guarded components are "facts related to NSA personnel (under cover), operational meetings, specific operations, specific technology, specific locations and covert communications related to SIGINT enabling with specific commercial entities (A/B/C)."
It is not clear whether these "commercial entities" are American or foreign or both. Generally the placeholder "(A/B/C)" is used in the briefing document to refer to American companies, though on one occasion it refers to both American and foreign companies. Foreign companies are referred to with the placeholder "(M/N/O)." The NSA refused to provide any clarification to The Intercept.
That program is SENTRY OSPREY, which is a program under SENTRY EAGLE.
The document makes no other reference to NSA agents working under cover. It is not clear whether they might be working as full-time employees at the "commercial entities," or whether they are visiting commercial facilities under false pretenses.
Least fun job right now: being the NSA person who fielded the telephone call from the Intercept to clarify that (A/B/C)/(M/N/O) thing. "Hi. We're going public with SENTRY EAGLE next week. There's one thing in the document we don't understand, and we wonder if you could help us...." Actually, that's wrong. The person who fielded the phone call had no idea what SENTRY EAGLE was. The least fun job belongs to the person up the command chain who did.
Good essay by Molly Sauter: basically, there is no legal avenue for activism and protest on the Internet.
Also note Sauter's new book, The Coming Swarm.
Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World is finished. I submitted it to my publisher, Norton, this morning. In a few weeks, I'll get the copyedited manuscript back, and a few weeks after that, it'll go into production. Stacks of printed books will come out the other end in February, and the book will be published on March 9. There's already an Amazon page, but it's still pretty preliminary. And I expect the price to go down.
Books are both a meandering and clarifying process for me, and I figure out what I'm writing about as I write about it. Data and Goliath started out being about security and power in cyberspace, and ended up being about digital surveillance and what to do about it.
This is the table of contents:
Chapter 2: Data as Surveillance
Chapter 3: Analyzing our Data
Chapter 4: The Business of Surveillance
Chapter 5: Government Surveillance and Control
Chapter 6: Consolidation of Institutional Surveillance
Chapter 8: Commercial Fairness and Equality
Chapter 9: Business Competitiveness
Chapter 10: Privacy
Chapter 11: Security
Chapter 13: Solutions for Government
Chapter 14: Solutions for Corporations
Chapter 15: Solutions for the Rest of Us
Chapter 16: Social Norms and the Big Data Trade-off
Fundamentally, the issues surrounding mass surveillance are tensions between group interest and self-interest, a topic I covered in depth in Liars and Outliers. We're promised great benefits if we allow all of our data to be collected in one place; at the same time, it can be incredibly personal. I see this tension playing out in many areas: location data, social graphs, medical data, search histories. Figuring out the proper balances between group and self-interests, and ensuring that those balances are maintained, is the fundamental issue of the information age. It's how we are going to be judged by our descendants fifty years from now.
Anyway, the book is done and at the publisher. I'm happy with it; the manuscript is so tight you can bounce a quarter off of it. This is a complicated topic, and I think I distilled it down into 80,000 words that are both understandable by the lay reader and interesting to the policy wonk or technical geek. It's also an important topic, and I hope the book becomes a flash point for discussion and debate.
But that's not for another five months. You might think that's a long time, but in publishing that's incredibly fast. I convinced Norton to go with this schedule by stressing that the book becomes less timely every second it's not published. (An exaggeration, I know, but they bought it.) Now I just hope that nothing major happens between now and then to render the book obsolete.
For now, I want to get back to writing shorter pieces. Writing a book can be all-consuming, and I generally don't have time for anything else. Look at my essays. Last year, I wrote 59 essays. This year so far: 17. That's an effect of writing the book. Now that it's done, expect more essays on news websites and longer posts on this blog. It'll be good to be thinking about something else for a change.
If anyone works for a publication, and wants to write a review, conduct an interview, publish an excerpt, or otherwise help me get the word out about the book, please e-mail me and I will pass you on to Norton's publicity department. I think this book has a real chance of breaking out of my normal security market.
Last week, Apple announced that it is closing a serious security vulnerability in the iPhone. It used to be that the phone's encryption only protected a small amount of the data, and Apple had the ability to bypass security on the rest of it.
From now on, all the phone's data is protected. It can no longer be accessed by criminals, governments, or rogue employees. Access to it can no longer be demanded by totalitarian governments. A user's iPhone data is now more secure.
To hear US law enforcement respond, you'd think Apple's move heralded an unstoppable crime wave. See, the FBI had been using that vulnerability to get into people's iPhones. In the words of cyberlaw professor Orin Kerr, "How is the public interest served by a policy that only thwarts lawful search warrants?"
Ah, but that's the thing: You can't build a backdoor that only the good guys can walk through. Encryption protects against cybercriminals, industrial competitors, the Chinese secret police and the FBI. You're either vulnerable to eavesdropping by any of them, or you're secure from eavesdropping from all of them.
Backdoor access built for the good guys is routinely used by the bad guys. In 2005, some unknown group surreptitiously used the lawful-intercept capabilities built into the Greek cell phone system. The same thing happened in Italy in 2006.
In 2010, Chinese hackers subverted an intercept system Google had put into Gmail to comply with US government surveillance requests. Back doors in our cell phone system are currently being exploited by the FBI and unknown others.
This doesn't stop the FBI and Justice Department from pumping up the fear. Attorney General Eric Holder threatened us with kidnappers and sexual predators.
The former head of the FBI's criminal investigative division went even further, conjuring up kidnappers who are also sexual predators. And, of course, terrorists.
FBI Director James Comey claimed that Apple's move allows people to "place themselves beyond the law" and also invoked that now overworked "child kidnapper." John J. Escalante, chief of detectives for the Chicago police department, now holds the title of most hysterical: "Apple will become the phone of choice for the pedophile."
It's all bluster. Of the 3,576 major offenses for which warrants were granted for communications interception in 2013, exactly one involved kidnapping. And, more importantly, there's no evidence that encryption hampers criminal investigations in any serious way. In 2013, encryption foiled the police nine times, up from four in 2012 -- and the investigations proceeded in some other way.
This is why the FBI's scare stories tend to wither after public scrutiny. A former FBI assistant director wrote about a kidnapped man who would never have been found without the ability of the FBI to decrypt an iPhone, only to retract the point hours later because it wasn't true.
We've seen this game before. During the crypto wars of the 1990s, FBI Director Louis Freeh and others would repeatedly use the example of mobster John Gotti to illustrate why the ability to tap telephones was so vital. But the Gotti evidence was collected using a room bug, not a telephone tap. And those same scary criminal tropes were trotted out then, too. Back then we called them the Four Horsemen of the Infocalypse: pedophiles, kidnappers, drug dealers, and terrorists. Nothing has changed.
Strong encryption has been around for years. Both Apple's FileVault and Microsoft's BitLocker encrypt the data on computer hard drives. PGP encrypts e-mail. Off-the-Record encrypts chat sessions. HTTPS Everywhere encrypts your browsing. Android phones already come with encryption built-in. There are literally thousands of encryption products without back doors for sale, and some have been around for decades. Even if the US bans the stuff, foreign companies will corner the market because many of us have legitimate needs for security.
Law enforcement has been complaining about "going dark" for decades now. In the 1990s, they convinced Congress to pass a law requiring phone companies to ensure that phone calls would remain tappable even as they became digital. They tried and failed to ban strong encryption and mandate back doors for their use. The FBI tried and failed again to ban strong encryption in 2010. Now, in the post-Snowden era, they're about to try again.
We need to fight this. Strong encryption protects us from a panoply of threats. It protects us from hackers and criminals. It protects our businesses from competitors and foreign spies. It protects people in totalitarian governments from arrest and detention. This isn't just me talking: The FBI also recommends you encrypt your data for security.
As for law enforcement? The recent decades have given them an unprecedented ability to put us under surveillance and access our data. Our cell phones provide them with a detailed history of our movements. Our call records, e-mail history, buddy lists, and Facebook pages tell them who we associate with. The hundreds of companies that track us on the Internet tell them what we're thinking about. Ubiquitous cameras capture our faces everywhere. And most of us back up our iPhone data on iCloud, which the FBI can still get a warrant for. It truly is the golden age of surveillance.
Given everything that has made it easier for governments and others to intrude on our private lives, we need both technological security and legal restrictions to restore the traditional balance between government access and our security/privacy. More companies should follow Apple's lead and make encryption the easy-to-use default. And let's wait for some actual evidence of harm before we acquiesce to police demands for reduced security.
This essay previously appeared on CNN.com.
And a Washington Post editorial manages to say this:
How to resolve this? A police "back door" for all smartphones is undesirable--a back door can and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.
Because a "secure golden key" is completely different from a "back door."
EDITED TO ADD (10/7): Another essay.
EDITED TO ADD (10/12): Another essay.
Former NSA employee -- not technical director, as the link says -- explains how NSA bulk surveillance works, using some of the Snowden documents. Very interesting.
EDITED TO ADD (10/4): Apologies to Binney for downgrading his role at the NSA. He was not the technical director of the NSA, which is what I was thinking of, but he was a technical director at the NSA:
"In '97, I became the technical director of the geopolitical -- military geopolitical analysis and reporting shop for the world, which was about 6,000 people," Binney told Frontline.
Whatever the case, he does know what he's talking about when he talks about NSA surveillance.
The NSA is building a private cloud with its own security features:
As a result, the agency can now track every instance of every individual accessing what is in some cases a single word or name in a file. This includes when it arrived, who can access it, who did access it, downloaded it, copied it, printed it, forwarded it, modified it, or deleted it.
"All of this I can do in the cloud but--in many cases--it cannot be done in the legacy systems, many of which were created before such advanced data provenance technology existed." Had this ability all been available at the time, it is unlikely that U.S. soldier Bradley Manning would have succeeded in obtaining classified documents in 2010.
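The access tracking described above is, at its core, an append-only provenance log attached to every object. A toy sketch of the idea (a hypothetical schema, not the NSA's design):

```python
# Minimal sketch of data provenance: every operation on a document is
# appended to an immutable audit trail. Schema and names are invented
# for illustration only.
import datetime

class AuditedStore:
    def __init__(self):
        self._docs = {}
        self.log = []  # append-only provenance records

    def _record(self, user, doc_id, action):
        self.log.append({
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "user": user,
            "doc": doc_id,
            "action": action,
        })

    def put(self, user, doc_id, content):
        self._docs[doc_id] = content
        self._record(user, doc_id, "create")

    def read(self, user, doc_id):
        self._record(user, doc_id, "read")
        return self._docs[doc_id]

store = AuditedStore()
store.put("alice", "memo-1", "classified text")
store.read("bob", "memo-1")
assert [e["action"] for e in store.log] == ["create", "read"]
```

The point of the quote is that legacy systems can't retrofit this: provenance has to be recorded at the moment of access, which is easy in a freshly built cloud and nearly impossible to bolt onto decades-old file servers.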
Firechat is a secure wireless peer-to-peer chat app:
Firechat is theoretically resistant to the kind of centralized surveillance that the Chinese government (as well as western states, especially the US and the UK) is infamous for. Phones connect directly to one another, establish encrypted connections, and transact without sending messages to servers where they can be sniffed and possibly decoded.
"Selling spyware is not just reprehensible, it's a crime," Leslie Caldwell, assistant attorney general in the DOJ's Criminal Division, said in a statement. "Apps like StealthGenie are expressly designed for use by stalkers and domestic abusers who want to know every detail of a victim's personal life -- all without the victim's knowledge."
Photo of Bruce Schneier by Per Ervland.
Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.