Schneier on Security
A blog covering security and security technology.
In a request today to National Security Agency director Keith Alexander and Defense Secretary Chuck Hagel, the Electronic Privacy Information Center (EPIC) argues that the NSA's recently revealed domestic surveillance program is "unlawful" because the agency neglected to request public comments first. A federal appeals court previously ruled that such a comment period was necessary in a lawsuit involving airport body scanners.
This isn't an empty exercise. While it's unlikely that a judge will order the NSA to suspend the program pending public approval, the process will put pressure on Washington to subject the NSA to more oversight, and pressure the NSA into more transparency. We've used these tactics before. Two decades ago, EPIC launched a similar petition against the Clipper Chip, a process that eventually led to the Clinton administration and the FBI abandoning the effort. And EPIC's more recent action against TSA full-body scanners is one of the reasons we have privacy safeguards on the millimeter wave scanners they are still using.
The more people who sign this petition, the clearer the message it sends to Washington: a message that people care about the privacy of their telephone records, Internet transactions, and online communications. Secret judges should not be allowed to use secret interpretations of secret laws to authorize the NSA to engage in domestic surveillance. Sooner or later, a court is going to recognize that. Until then, the more noise the better.
Add your voice here. It just might work.
On his blog, Scott Adams suggests that it might be possible to identify sociopaths based on their interactions on social media.
My hypothesis is that science will someday be able to identify sociopaths and terrorists by their patterns of Facebook and Internet use. I'll bet normal people interact with Facebook in ways that sociopaths and terrorists couldn't duplicate.
Okay, but so what? Imagine you had such an amazingly accurate test...then what? Do we investigate those who test positive, even though there's no suspicion that they've actually done anything? Do we follow them around? Subject them to additional screening at airports? Throw them in jail because we know the streets will be safer because of it? Do we want to live in a Minority Report world?
The problem isn't just that such a system is wrong, it's that the mathematics of testing makes this sort of thing pretty ineffective in practice. It's called the "base rate fallacy." Suppose you have a test that's 90% accurate in identifying both sociopaths and non-sociopaths. If you assume that 4% of people are sociopaths, then the chance of someone who tests positive actually being a sociopath is only about 27%. (For every thousand people tested, 90% of the 40 sociopaths will test positive, but so will 10% of the 960 non-sociopaths.) You have to postulate a test with an amazing 99% accuracy -- only a 1% false positive rate -- even to have an 80% chance of someone testing positive actually being a sociopath.
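The arithmetic above is just Bayes' theorem, and it's worth working through once to see how badly the low base rate dominates. This sketch uses the numbers from the paragraph (4% base rate, 90% and 99% accuracy); the function name is mine, not anything from the post:

```python
# Base rate fallacy: probability that a positive test result is a true positive.

def p_true_given_positive(base_rate, sensitivity, specificity):
    """P(condition | positive test), via Bayes' theorem.

    base_rate:   prior probability that a random person has the condition
    sensitivity: P(test positive | has condition)
    specificity: P(test negative | does not have condition)
    """
    true_positives = base_rate * sensitivity
    false_positives = (1 - base_rate) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# A "90% accurate" test on a 4% base rate: 36 true positives vs. 96
# false positives per thousand people tested.
print(p_true_given_positive(0.04, 0.90, 0.90))  # ~0.27

# Even a 99% accurate test only gets you to about 80%.
print(p_true_given_positive(0.04, 0.99, 0.99))  # ~0.80
```

The takeaway is that for rare conditions, the false positives from the huge pool of normal people swamp the true positives, no matter how good the test looks in isolation.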
This fallacy isn't new. It's the same thinking that caused us to intern Japanese-Americans during World War II, stop people in their cars because they're black, and frisk them at airports because they're Muslim. It's the same thinking behind massive NSA surveillance programs like PRISM. It's one of the things that scares me about police DNA databases.
Many authors have written stories about thoughtcrime. Who has written about genecrime?
John Mueller and Mark Stewart ask the important questions about the NSA surveillance programs: why were they secret, what have they accomplished, and what do they cost?
Facebook (here), Apple (here), and Yahoo (here) have all released details of US government requests for data. They each say that they've turned over user data for about 10,000 people, although the time frames are different. The exact number isn't important; what's important is that it's much lower than the millions implied by the PRISM document.
Now the big question: do we believe them? If we don't, what would it take before we did believe them?
In an excellent essay about privacy and secrecy, law professor Daniel Solove makes an important point. There are two types of NSA secrecy being discussed. It's easy to confuse them, but they're very different.
Of course, if the government is trying to gather data about a particular suspect, keeping the specifics of surveillance efforts secret will decrease the likelihood of that suspect altering his or her behavior.
This distinction is also becoming important as Snowden keeps talking. There are a lot of articles about Edward Snowden cooperating with the Chinese government. I have no idea if this is true -- Snowden denies it -- or if these stories are part of an American smear campaign designed to change the debate from the NSA surveillance programs to the whistleblower's actions. (It worked against Assange.) In anticipation of the inevitable questions, I want to amend a previous statement: I consider Snowden a hero for whistleblowing on the existence and details of the NSA surveillance programs, but not for revealing specific operational secrets to the Chinese government. Charles Pierce wishes Snowden would stop talking. I agree; the more this story is about him, the less it is about the NSA. Stop giving interviews and let the documents do the talking.
Back to Daniel Solove, this excellent 2011 essay on the value of privacy is making the rounds again. And it should.
Many commentators had been using the metaphor of George Orwell's 1984 to describe the problems created by the collection and use of personal data. I contended that the Orwell metaphor, which focuses on the harms of surveillance (such as inhibition and social control) might be apt to describe law enforcement's monitoring of citizens. But much of the data gathered in computer databases is not particularly sensitive, such as one's race, birth date, gender, address, or marital status. Many people do not care about concealing the hotels they stay at, the cars they own or rent, or the kind of beverages they drink. People often do not take many steps to keep such information secret. Frequently, though not always, people's activities would not be inhibited if others knew this information.
The whole essay is worth reading, as is -- I hope -- my essay on the value of privacy from 2006.
I have come to believe that the solution to all of this is regulation. And it's not going to be the regulation of data collection; it's going to be the regulation of data use.
EDITED TO ADD (6/18): A good rebuttal to the "nothing to hide" argument.
Interesting speculation that the NSA is storing everyone's phone calls, and not just metadata. Definitely worth reading.
I expressed skepticism about this just a month ago. My assumption had always been that everyone's compressed voice calls are just too much data to move around and store. Now, I don't know.
There's a bit of a conspiracy-theory air to all of this speculation, but underestimating what the NSA will do is a mistake. General Alexander has told members of Congress that they can record the contents of phone calls. And they have the technical capability.
Earlier reports have indicated that the NSA has the ability to record nearly all domestic and international phone calls -- in case an analyst needed to access the recordings in the future. A Wired magazine article last year disclosed that the NSA has established "listening posts" that allow the agency to collect and sift through billions of phone calls through a massive new data center in Utah, "whether they originate within the country or overseas." That includes not just metadata, but also the contents of the communications.
I believe that, to the extent that the NSA is analyzing and storing conversations, they're doing speech-to-text as close to the source as possible and working with that. Even if you have to store the audio for conversations in foreign languages, or for snippets of conversations the conversion software is unsure of, it's a lot fewer bits to move around and deal with.
And, by the way, I hate the term "metadata." What's wrong with "traffic analysis," which is what we've always called that sort of thing?
There's one piece of blowback that isn't being discussed -- aside from the fact that Snowden has killed the chances of any liberal arts major getting a DoD job for at least a decade -- and that's how the massive NSA surveillance of the Internet affects the US's role in Internet governance.
Ron Deibert makes this point:
But there are unintended consequences of the NSA scandal that will undermine U.S. foreign policy interests -- in particular, the "Internet Freedom" agenda espoused by the U.S. State Department and its allies.
Writing about the new Internet nationalism, I talked about the ITU meeting in Dubai last fall, and the attempt of some countries to wrest control of the Internet from the US. That movement just got a huge PR boost. Now, when countries like Russia and Iran say the US is simply too untrustworthy to manage the Internet, no one will be able to argue.
We can't fight for Internet freedom around the world, then turn around and destroy it back home. Even if we don't see the contradiction, the rest of the world does.
It's a novel behavior.
As usual, you can also use this squid post to talk about the security stories in the news that I haven't covered.
On April 1, I announced the Sixth Annual Movie Plot Threat Contest:
I want a cyberwar movie-plot threat. (For those who don't know, a movie-plot threat is a scare story that would make a great movie plot, but is much too specific to build security policy around.) Not the Chinese attacking our power grid or shutting off 911 emergency services -- people are already scaring our legislators with that sort of stuff. I want something good, something no one has thought of before.
Submissions are in, and -- apologies that this is a month late, but I completely forgot about it -- here are the semifinalists.
Cast your vote by number; voting closes at the end of the month.
This paper (full paper behind paywall) -- from Environment International (2009) -- does a good job of separating fact from fiction:
Abstract: In recent years there has been an increased concern regarding the potential use of chemical and biological weapons for mass urban terror. In particular, there are concerns that ricin could be employed as such an agent. This has been reinforced by recent high profile cases involving ricin, and its use during the cold war to assassinate a high profile communist dissident. Nevertheless, despite these events, does it deserve such a reputation? Ricin is clearly toxic, though its level of risk depends on the route of entry. By ingestion, the pathology of ricin is largely restricted to the gastrointestinal tract where it may cause mucosal injuries; with appropriate treatment, most patients will make a full recovery. As an agent of terror, it could be used to contaminate an urban water supply, with the intent of causing lethality in a large urban population. However, a substantial mass of pure ricin powder would be required. Such an exercise would be impossible to achieve covertly and would not guarantee success due to variables such as reticulation management, chlorination, mixing, bacterial degradation and ultra-violet light. By injection, ricin is lethal; however, while parenteral delivery is an ideal route for assassination, it is not realistic for an urban population. Dermal absorption of ricin has not been demonstrated. Ricin is also lethal by inhalation. Low doses can lead to progressive and diffuse pulmonary oedema with associated inflammation and necrosis of the alveolar pneumocytes. However, the risk of toxicity is dependent on the aerodynamic equivalent diameter (AED) of the ricin particles. The AED, which is an indicator of the aerodynamic behaviour of a particle, must be of sufficiently low micron size as to target the human alveoli and thereby cause major toxic effects. To target a large population would also necessitate a quantity of powder in excess of several metric tons. 
The technical and logistical skills required to formulate such a mass of powder to the required size is beyond the ability of terrorists who typically operate out of a kitchen in a small urban dwelling or in a small ill-equipped laboratory. Ricin as a toxin is deadly but as an agent of bioterror it is unsuitable and therefore does not deserve the press attention and subsequent public alarm that has been created.
Ray Wang makes an important point about trust and our data:
This is the paradox. The companies contending to win our trust to manage our digital identities all seem to have complementary (or competing) business models that breach that trust by selling our data.
...and by turning it over to the government.
If the government demanded that we all carry tracking devices 24/7, we would rebel. Yet we all carry cell phones. If the government demanded that we deposit copies of all of our messages to each other with the police, we'd declare their actions unconstitutional. Yet we all use Gmail and Facebook messaging and SMS. If the government demanded that we give them access to all the photographs we take, and that we identify all of the people in them and tag them with locations, we'd refuse. Yet we do exactly that on Flickr and other sites.
Ray Ozzie was right when he said that we got what we asked for when we told the government we were scared and that they should do whatever they wanted to make us feel safer. But we also got what we asked for when we traded our privacy for convenience, trusting these corporations to look out for our best interests.
We're living in a world of feudal security. And if you watch Game of Thrones, you know that feudalism benefits the powerful -- at the expense of the peasants.
Last night, I was on All In with Chris Hayes (parts one and two). One of the things we talked about after the show was over is how technological solutions only work around the margins. That's not a cause for despair. Think about technological solutions to murder. Yes, they exist -- wearing a bullet-proof vest, for example -- but they're not really viable. The way we protect ourselves from murder is through laws. This is how we're also going to protect our privacy.
EDITED TO ADD (6/18): The Onion nailed it back in 2011.