Entries Tagged "intelligence"

Page 11 of 25

NSA Secrecy and Personal Privacy

In an excellent essay about privacy and secrecy, law professor Daniel Solove makes an important point. There are two types of NSA secrecy being discussed. It’s easy to confuse them, but they’re very different.

Of course, if the government is trying to gather data about a particular suspect, keeping the specifics of surveillance efforts secret will decrease the likelihood of that suspect altering his or her behavior.

But secrecy at the level of an individual suspect is different from keeping the very existence of massive surveillance programs secret. The public must know about the general outlines of surveillance activities in order to evaluate whether the government is achieving the appropriate balance between privacy and security. What kind of information is gathered? How is it used? How securely is it kept? What kind of oversight is there? Are these activities even legal? These questions can’t be answered, and the government can’t be held accountable, if surveillance programs are completely classified.

This distinction is also becoming important as Snowden keeps talking. There are a lot of articles about Edward Snowden cooperating with the Chinese government. I have no idea if this is true—Snowden denies it—or if it’s part of an American smear campaign designed to change the debate from the NSA surveillance programs to the whistleblower’s actions. (It worked against Assange.) In anticipation of the inevitable questions, I want to change a previous assessment statement: I consider Snowden a hero for whistleblowing on the existence and details of the NSA surveillance programs, but not for revealing specific operational secrets to the Chinese government. Charles Pierce wishes Snowden would stop talking. I agree; the more this story is about him the less it is about the NSA. Stop giving interviews and let the documents do the talking.

Back to Daniel Solove, this excellent 2011 essay on the value of privacy is making the rounds again. And it should.

Many commentators had been using the metaphor of George Orwell’s 1984 to describe the problems created by the collection and use of personal data. I contended that the Orwell metaphor, which focuses on the harms of surveillance (such as inhibition and social control) might be apt to describe law enforcement’s monitoring of citizens. But much of the data gathered in computer databases is not particularly sensitive, such as one’s race, birth date, gender, address, or marital status. Many people do not care about concealing the hotels they stay at, the cars they own or rent, or the kind of beverages they drink. People often do not take many steps to keep such information secret. Frequently, though not always, people’s activities would not be inhibited if others knew this information.

I suggested a different metaphor to capture the problems: Franz Kafka’s The Trial, which depicts a bureaucracy with inscrutable purposes that uses people’s information to make important decisions about them, yet denies the people the ability to participate in how their information is used. The problems captured by the Kafka metaphor are of a different sort than the problems caused by surveillance. They often do not result in inhibition or chilling. Instead, they are problems of information processing—the storage, use, or analysis of data—rather than information collection. They affect the power relationships between people and the institutions of the modern state. They not only frustrate the individual by creating a sense of helplessness and powerlessness, but they also affect social structure by altering the kind of relationships people have with the institutions that make important decisions about their lives.

The whole essay is worth reading, as is—I hope—my essay on the value of privacy from 2006.

I have come to believe that the solution to all of this is regulation. And it’s not going to be the regulation of data collection; it’s going to be the regulation of data use.

EDITED TO ADD (6/18): A good rebuttal to the “nothing to hide” argument.

Posted on June 18, 2013 at 11:02 AM

Evidence that the NSA Is Storing Voice Content, Not Just Metadata

Interesting speculation that the NSA is storing everyone’s phone calls, and not just metadata. Definitely worth reading.

I expressed skepticism about this just a month ago. My assumption had always been that everyone’s compressed voice calls are just too much data to move around and store. Now, I don’t know.

There’s a bit of a conspiracy-theory air to all of this speculation, but underestimating what the NSA will do is a mistake. General Alexander has told members of Congress that they can record the contents of phone calls. And they have the technical capability.

Earlier reports have indicated that the NSA has the ability to record nearly all domestic and international phone calls—in case an analyst needed to access the recordings in the future. A Wired magazine article last year disclosed that the NSA has established “listening posts” that allow the agency to collect and sift through billions of phone calls through a massive new data center in Utah, “whether they originate within the country or overseas.” That includes not just metadata, but also the contents of the communications.

William Binney, a former NSA technical director who helped to modernize the agency’s worldwide eavesdropping network, told the Daily Caller this week that the NSA records the phone calls of 500,000 to 1 million people who are on its so-called target list, and perhaps even more. “They look through these phone numbers and they target those and that’s what they record,” Binney said.

Brewster Kahle, a computer engineer who founded the Internet Archive, has vast experience storing large amounts of data. He created a spreadsheet this week estimating that the cost to store all domestic phone calls a year in cloud storage for data-mining purposes would be about $27 million per year, not counting the cost of extra security for a top-secret program and security clearances for the people involved.
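Kahle’s spreadsheet isn’t reproduced here, but the shape of the estimate is easy to sketch. The following back-of-envelope calculation uses my own illustrative assumptions for call volume, codec bitrate, and bulk storage pricing, not Kahle’s actual figures; it happens to land in the same tens-of-millions ballpark.

```python
# Back-of-envelope estimate of storing a year of domestic phone calls.
# All three parameters are assumptions for illustration, not Kahle's numbers.

CALL_MINUTES_PER_DAY = 3e9        # assumed US domestic call volume
CODEC_BITRATE_BPS = 8_000         # assumed compressed-voice bitrate (~8 kbps)
STORAGE_COST_PER_GB_MONTH = 0.03  # assumed bulk cloud storage price (USD)

bytes_per_minute = CODEC_BITRATE_BPS * 60 / 8        # 60 KB per call-minute
bytes_per_year = CALL_MINUTES_PER_DAY * 365 * bytes_per_minute
gigabytes = bytes_per_year / 1e9
annual_cost = gigabytes * STORAGE_COST_PER_GB_MONTH * 12

print(f"{bytes_per_year / 1e15:.1f} PB per year")     # ~65.7 PB
print(f"${annual_cost / 1e6:.0f} million per year")   # ~$24 million
```

The point of the exercise is that the totals are petabytes, not exabytes, and the cost is a rounding error in an intelligence budget.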

I believe that, to the extent that the NSA is analyzing and storing conversations, they’re doing speech-to-text as close to the source as possible and working with that. Even if you have to store the audio for conversations in foreign languages, or for snippets of conversations the conversion software is unsure of, it’s a lot fewer bits to move around and deal with.
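To see why speech-to-text near the source matters so much, compare the bytes needed to keep a minute of conversation as compressed audio versus as a transcript. The bitrate, speech rate, and word length below are rough illustrative assumptions, not measured values.

```python
# Rough comparison: one minute of conversation as compressed audio
# versus as a text transcript. All parameters are illustrative assumptions.

AUDIO_BITRATE_BPS = 8_000   # assumed compressed-voice codec (~8 kbps)
WORDS_PER_MINUTE = 150      # typical conversational speech rate
BYTES_PER_WORD = 6          # average English word plus a space

audio_bytes = AUDIO_BITRATE_BPS * 60 // 8       # bytes of audio per minute
text_bytes = WORDS_PER_MINUTE * BYTES_PER_WORD  # bytes of transcript per minute
ratio = audio_bytes / text_bytes

print(f"audio: {audio_bytes} B/min, text: {text_bytes} B/min, ~{ratio:.0f}x smaller")
```

Under these assumptions a transcript is roughly two orders of magnitude smaller than even aggressively compressed audio, which is why storing text and keeping audio only for the uncertain snippets is the obvious engineering choice.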

And, by the way, I hate the term “metadata.” What’s wrong with “traffic analysis,” which is what we’ve always called that sort of thing?

Posted on June 18, 2013 at 5:57 AM

Prosecuting Snowden

Edward Snowden broke the law by releasing classified information. This isn’t under debate; it’s something everyone with a security clearance knows. It’s written in plain English on the documents you have to sign when you get a security clearance, and it’s part of the culture. The law is there for a good reason, and secrecy has an important role in military defense.

But before the Justice Department prosecutes Snowden, there are some other investigations that ought to happen.

We need to determine whether these National Security Agency programs are themselves legal. The administration has successfully barred anyone from bringing a lawsuit challenging these laws, on the grounds of national secrecy. Now that we know those arguments are without merit, it’s time for those court challenges.

It’s clear that some of the NSA programs exposed by Snowden violate the Constitution and others violate existing laws. Others hold the opposite view. The courts need to decide.

We need to determine whether classifying these programs is legal. Keeping things secret from the people is a very dangerous practice in a democracy, and the government is permitted to do so only under very specific circumstances. Reading the documents leaked so far, I don’t see anything that needs to be kept secret. The argument that exposing these documents helps the terrorists doesn’t even pass the laugh test; there’s nothing here that changes anything any potential terrorist would do or not do. But in any case, now that the documents are public, the courts need to rule on the legality of their secrecy.

And we need to determine how we treat whistle-blowers in this country. We have whistle-blower protection laws that apply in some cases, particularly when exposing fraud and other illegal behavior. NSA officials have repeatedly lied to Congress about the existence and details of these programs.

Only after all of these legal issues have been resolved should any prosecution of Snowden move forward. Because only then will we know the full extent of what he did, and how much of it is justified.

I believe that history will hail Snowden as a hero—his whistle-blowing exposed a surveillance state and a secrecy machine run amok. I’m less optimistic about how the present day will treat him, and hope that the debate right now is less about the man and more about the government he exposed.

This essay was originally published on the New York Times Room for Debate blog, as part of a series of essays on the topic.

EDITED TO ADD (6/13): There’s a big discussion of this on Reddit.

Posted on June 12, 2013 at 6:16 AM

"The Global Cyber Game"

This 127-page report was just published by the UK Defence Academy. I have not read it yet, but it looks really interesting.

Executive Summary: This report presents a systematic way of thinking about cyberpower and its use by a variety of global players. The urgency of addressing cyberpower in this way is a consequence of the very high value of the Internet and the hazards of its current militarization.

Cyberpower and cyber security are conceptualized as a ‘Global Game’ with a novel ‘Cyber Gameboard’ consisting of a nine-cell grid. The horizontal direction on the grid is divided into three columns representing aspects of information (i.e. cyber): connection, computation and cognition. The vertical direction on the grid is divided into three rows representing types of power: coercion, co-option, and cooperation. The nine cells of the grid represent all the possible combinations of power and information, that is, forms of cyberpower.

The Cyber Gameboard itself is also an abstract representation of the surface of cyberspace, or C-space as defined in this report. C-space is understood as a networked medium capable of conveying various combinations of power and information to produce effects in physical or ‘flow space,’ referred to as F-space in this report. Game play is understood as the projection via C-space of a cyberpower capability existing in any one cell of the gameboard to produce an effect in F-space vis-a-vis another player in any other cell of the gameboard. By default, the Cyber Game is played either actively or passively by all those using network connected computers. The players include states, businesses, NGOs, individuals, non-state political groups, and organized crime, among others. Each player is seen as having a certain level of cyberpower when its capability in each cell is summed across the whole board. In general states have the most cyberpower.
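The gameboard the summary describes is simple enough to sketch as a data structure: nine cells, one per combination of power type and information aspect, with a player's total cyberpower being the sum across cells. The scores below are hypothetical placeholders, not anything from the report.

```python
# Toy representation of the report's nine-cell Cyber Gameboard:
# rows are types of power, columns are aspects of information.
# Capability scores are hypothetical placeholders.
from itertools import product

POWER_TYPES = ["coercion", "co-option", "cooperation"]   # rows
INFO_ASPECTS = ["connection", "computation", "cognition"]  # columns

# One player's capability in each of the nine cells (all placeholder zeros)
gameboard = {(power, info): 0 for power, info in product(POWER_TYPES, INFO_ASPECTS)}

# A player's overall cyberpower is its capability summed across the board
total_cyberpower = sum(gameboard.values())

print(len(gameboard))  # 9 cells
```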

The possible future path of the game is depicted by two scenarios, N-topia and N-crash. These are the stakes for which the Cyber Game is played. N-topia represents the upside potential of the game, in which the full value of a globally connected knowledge society is realized. N-crash represents the downside potential, in which militarization and fragmentation of the Internet cause its value to be substantially destroyed. Which scenario eventuates will be determined largely by the overall pattern of play of the Cyber Game.

States have a high level of responsibility for determining the outcome. The current pattern of play is beginning to resemble traditional state-on-state geopolitical conflict. This puts the civil Internet at risk, and civilian cyber players are already getting caught in the crossfire. As long as the civil Internet remains undefended and easily permeable to cyber attack it will be hard to achieve the N-topia scenario.

Defending the civil Internet in depth, and hardening it by re-architecting will allow its full social and economic value to be realized but will restrict the potential for espionage and surveillance by states. This trade-off is net positive and in accordance with the espoused values of Western-style democracies. It does however call for leadership based on enlightened self-interest by state players.

Posted on May 22, 2013 at 12:05 PM

Intelligence Analysis and the Connect-the-Dots Metaphor

The FBI and the CIA are being criticized for not keeping better track of Tamerlan Tsarnaev in the months before the Boston Marathon bombings. How could they have ignored such a dangerous person? How do we reform the intelligence community to ensure this kind of failure doesn’t happen again?

It’s an old song by now, one we heard after the 9/11 attacks in 2001 and after the Underwear Bomber’s failed attack in 2009. The problem is that connecting the dots is a bad metaphor, and focusing on it makes us more likely to implement useless reforms.

Connecting the dots in a coloring book is easy and fun. They’re right there on the page, and they’re all numbered. All you have to do is move your pencil from one dot to the next, and when you’re done, you’ve drawn a sailboat. Or a tiger. It’s so simple that 5-year-olds can do it.

But in real life, the dots can only be numbered after the fact. With the benefit of hindsight, it’s easy to draw lines from a Russian request for information to a foreign visit to some other piece of information that might have been collected.

In hindsight, we know who the bad guys are. Before the fact, there are an enormous number of potential bad guys.

How many? We don’t know. But we know that the no-fly list had 21,000 people on it last year. The Terrorist Identities Datamart Environment, also known as the watch list, has 700,000 names on it.

We have no idea how many potential “dots” the FBI, CIA, NSA and other agencies collect, but it’s easily in the millions. It’s easy to work backwards through the data and see all the obvious warning signs. But before a terrorist attack, when there are millions of dots—some important but the vast majority unimportant—uncovering plots is a lot harder.

Rather than thinking of intelligence as a simple connect-the-dots picture, think of it as a million unnumbered pictures superimposed on top of each other. Or a random-dot stereogram. Is it a sailboat, a puppy, two guys with pressure-cooker bombs, or just an unintelligible mess of dots? You try to figure it out.

It’s not a matter of not enough data, either.

Piling more data onto the mix makes it harder, not easier. The best way to think of it is a needle-in-a-haystack problem; the last thing you want to do is increase the amount of hay you have to search through. The television show Person of Interest is fiction, not fact.
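The haystack problem is really a base-rate problem, and the arithmetic is worth doing once. With hypothetical numbers—a large monitored population, a tiny number of real plotters, and an implausibly accurate detector—the flagged innocents still swamp the flagged plotters by orders of magnitude.

```python
# Base-rate sketch of the needle-in-a-haystack problem.
# All four numbers are hypothetical, chosen only for illustration.

POPULATION = 300_000_000    # people generating "dots"
ACTUAL_PLOTTERS = 1_000     # hypothetical number of real plotters
TRUE_POSITIVE_RATE = 0.99   # assumed: detector catches 99% of plotters
FALSE_POSITIVE_RATE = 0.01  # assumed: detector flags 1% of innocents

true_positives = ACTUAL_PLOTTERS * TRUE_POSITIVE_RATE
false_positives = (POPULATION - ACTUAL_PLOTTERS) * FALSE_POSITIVE_RATE
precision = true_positives / (true_positives + false_positives)

print(f"flagged innocents: {false_positives:,.0f}")
print(f"chance a flagged person is a plotter: {precision:.4%}")
```

Even with a 99%-accurate detector, almost every flag points at an innocent person—and adding more hay (more data) only makes the ratio worse.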

There’s a name for this sort of logical fallacy: hindsight bias. First explained by psychologists Daniel Kahneman and Amos Tversky, it’s surprisingly common. Since what actually happened is so obvious once it happens, we overestimate how obvious it was before it happened.

We actually misremember what we once thought, believing that we knew all along that what happened would happen. It’s a surprisingly strong tendency, one that has been observed in countless laboratory experiments and real-world examples of behavior. And it’s what all the post-Boston-Marathon bombing dot-connectors are doing.

Before we start blaming agencies for failing to stop the Boston bombers, and before we push “intelligence reforms” that will shred civil liberties without making us any safer, we need to stop seeing the past as a bunch of obvious dots that need connecting.

Kahneman, a Nobel prize winner, wisely noted: “Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.” Kahneman calls it “the illusion of understanding,” explaining that the past seems so understandable only because we cast it as simple, inevitable stories and leave out the rest.

Nassim Taleb, an expert on risk engineering, calls this tendency the “narrative fallacy.” We humans are natural storytellers, and the world of stories is much more tidy, predictable and coherent than the real world.

Millions of people behave strangely enough to warrant the FBI’s notice, and almost all of them are harmless. It is simply not possible to find every plot beforehand, especially when the perpetrators act alone and on impulse.

We have to accept that there always will be a risk of terrorism, and that when the occasional plot succeeds, it’s not necessarily because our law enforcement systems have failed.

This essay previously appeared on CNN.

EDITED TO ADD (5/7): The hindsight bias was actually first discovered by Baruch Fischhoff: “Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty,” Journal of Experimental Psychology: Human Perception and Performance, 1(3), 1975, pp. 288-299.

Posted on May 7, 2013 at 6:10 AM

Interesting Article on Libyan Internet Intelligence Gathering

This is worth reading, for the insights it provides on how a country goes about monitoring its citizens in the information age: a combination of targeted attacks and wholesale surveillance.

I’ll just quote one bit, this list of Western companies that helped:

Amesys, with its Eagle system, was just one of Libya’s partners in repression. A South African firm called VASTech had set up a sophisticated monitoring center in Tripoli that snooped on all inbound and outbound international phone calls, gathering and storing 30 million to 40 million minutes of mobile and landline conversations each month. ZTE Corporation, a Chinese firm whose gear powered much of Libya’s cell phone infrastructure, is believed to have set up a parallel Internet monitoring system for External Security: Photos from the basement of a makeshift surveillance site, obtained from Human Rights Watch, show components of its ZXMT system, comparable to Eagle. American firms likely bear some blame, as well. On February 15, just prior to the revolution, regime officials reportedly met in Barcelona with officials from Narus, a Boeing subsidiary, to discuss Internet-filtering software. And the Human Rights Watch photos also clearly show a manual for a satellite phone monitoring system sold by a subsidiary of L-3 Communications, a defense conglomerate based in New York.

Posted on June 5, 2012 at 6:07 AM

The Explosive from the Latest Foiled Al Qaeda Underwear Bomb Plot

Interesting:

Although the plot was disrupted before a particular airline was targeted and tickets were purchased, al Qaeda’s continued attempts to attack the U.S. speak to the organization’s persistence and willingness to refine specific approaches to killing. Unlike Abdulmutallab’s bomb, the new device contained lead azide, an explosive often used as a detonator. If the new underwear bomb had been used, the bomber would have ignited the lead azide, which would have triggered a more powerful explosive, possibly military-grade explosive pentaerythritol tetranitrate (PETN).

Lead azide and PETN were key components in a 2010 plan to detonate two bombs sent from Yemen and bound for Chicago—one in a cargo aircraft and the other in the cargo hold of a passenger aircraft. In that plot, al-Qaeda hid bombs in printer cartridges, allowing them to slip past cargo handlers and airport screeners. Both bombs contained far more explosive material than the 80 grams of PETN that Abdulmutallab smuggled onto his Northwest Airlines flight.

With the latest device, al Asiri appears to have been able to improve on the underwear bomb supplied to Abdulmutallab, says Joan Neuhaus Schaan, a fellow in homeland security and terrorism for Rice University’s James A. Baker III Institute for Public Policy.

The interview is also interesting, and I am especially pleased to see this last answer:

What has been the most effective means of disrupting terrorism attacks?
As with bombs that were being sent from Yemen to Chicago as cargo, this latest plot was discovered using human intelligence rather than screening procedures and technologies. These plans were disrupted because of proactive mechanisms put in place to stop terrorism rather than defensive approaches such as screening.

Posted on May 25, 2012 at 6:43 AM

A Foiled Terrorist Plot

We don’t know much, but here are my predictions:

  1. There’s a lot more hyperbole to this story than reality.
  2. The explosive would have either 1) been caught by pre-9/11 security, or 2) not been caught by post-9/11 security.
  3. Nonetheless, it will be used to justify more invasive airport security.

Posted on May 8, 2012 at 1:14 PM

