Entries Tagged "FBI"


Protecting E-Mail from Eavesdropping

In the wake of the Snowden NSA documents, reporters have been asking me whether encryption can solve the problem. Leaving aside the fact that much of what the NSA is collecting can’t be encrypted by the user—telephone metadata, e-mail headers, phone calling records, e-mail you’re reading from a phone or tablet or cloud provider, anything you post on Facebook—it’s hard to give good advice.

In theory, an e-mail encryption program will protect you, but the reality is much more complicated.

  • The program has to be vulnerability-free. If there is some back door in the program that bypasses, or weakens, the encryption, it’s not secure. It’s very difficult, almost impossible, to verify that a program is vulnerability-free.
  • The user has to choose a secure password. Luckily, there’s advice on how to do this (a minimal sketch of one approach follows this list).
  • The password has to be managed securely. The user can’t store it in a file somewhere. If he’s worried about what happens after the FBI has arrested him and searched his house, he shouldn’t write it on a piece of paper, either.
  • Actually, he should understand the threat model he’s operating under. Is it the NSA trying to eavesdrop on everything, or an FBI investigation that specifically targets him—or a targeted attack, like dropping a Trojan on his computer, that bypasses e-mail encryption entirely?
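A minimal sketch of the password point above, assuming Python 3 and a local word list (the eff_large_wordlist.txt path is a hypothetical placeholder): generate a random multi-word passphrase with the secrets module rather than inventing one yourself.

    # Sketch: generate a six-word random passphrase, diceware-style.
    # The word-list path is a hypothetical placeholder (e.g., the EFF large word list).
    import secrets

    WORDLIST_PATH = "eff_large_wordlist.txt"

    def generate_passphrase(num_words=6):
        with open(WORDLIST_PATH) as f:
            # take the last token on each line, so "11111 abacus" and "abacus" both work
            words = [line.split()[-1] for line in f if line.strip()]
        # secrets.choice uses the OS's cryptographic random source, unlike random.choice
        return " ".join(secrets.choice(words) for _ in range(num_words))

    print(generate_passphrase())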

This is simply too much for the poor reporter, who wants an easy-to-transcribe answer.

We’ve known how to send cryptographically secure e-mail since the early 1990s. Twenty years later, we’re still working on the security engineering of e-mail programs. And if the NSA is eavesdropping on encrypted e-mail, and if the FBI is decrypting messages from suspects’ hard drives, they’re both breaking the engineering, not the underlying cryptographic algorithms.

On the other hand, the two adversaries can be very different. The NSA has to process a ginormous amount of traffic. It’s the “drinking from a fire hose” problem; they cannot afford to devote a lot of time to decrypting everything, because they simply don’t have the computing resources. There’s just too much data to collect. In these situations, even a modest level of encryption is enough—until you are specifically targeted. This is why the NSA saves all encrypted data it encounters; it might want to devote cryptanalysis resources to it at some later time.
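For concreteness, the cryptography itself is the easy part, and has been since the early 1990s. Here is a minimal sketch of OpenPGP-encrypting a message with the python-gnupg wrapper (GnuPG must be installed locally; the key file and recipient address are hypothetical placeholders). Everything on the list above, from key verification to endpoint security, is the hard part this sketch leaves out.

    # Sketch: OpenPGP-encrypt a message body using the python-gnupg wrapper.
    # The public-key file and recipient address are hypothetical placeholders.
    import gnupg

    gpg = gnupg.GPG()  # uses the local GnuPG installation and keyring

    # Import the recipient's public key, obtained and verified out of band.
    with open("recipient_pubkey.asc") as f:
        gpg.import_keys(f.read())

    # always_trust=True skips the web-of-trust check; in real use you would
    # verify the key fingerprint and mark the key trusted instead.
    encrypted = gpg.encrypt("Meet at noon.", ["alice@example.com"], always_trust=True)

    if encrypted.ok:
        print(str(encrypted))  # ASCII-armored ciphertext, safe to paste into an e-mail body
    else:
        print("encryption failed:", encrypted.status)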

Posted on July 8, 2013 at 6:43 AM

New Details on Skype Eavesdropping

This article, on the cozy relationship between the commercial personal-data industry and the intelligence industry, has new information on the security of Skype.

Skype, the Internet-based calling service, began its own secret program, Project Chess, to explore the legal and technical issues in making Skype calls readily available to intelligence agencies and law enforcement officials, according to people briefed on the program who asked not to be named to avoid trouble with the intelligence agencies.

Project Chess, which has never been previously disclosed, was small, limited to fewer than a dozen people inside Skype, and was developed as the company had sometimes contentious talks with the government over legal issues, said one of the people briefed on the project. The project began about five years ago, before most of the company was sold by its parent, eBay, to outside investors in 2009. Microsoft acquired Skype in an $8.5 billion deal that was completed in October 2011.

A Skype executive denied last year in a blog post that recent changes in the way Skype operated were made at the behest of Microsoft to make snooping easier for law enforcement. It appears, however, that Skype figured out how to cooperate with the intelligence community before Microsoft took over the company, according to documents leaked by Edward J. Snowden, a former contractor for the N.S.A. One of the documents about the Prism program made public by Mr. Snowden says Skype joined Prism on Feb. 6, 2011.

Reread that Skype denial from last July, knowing that at the time the company knew that they were giving the NSA access to customer communications. Notice how it is precisely worded to be technically accurate, yet leave the reader with the wrong conclusion. This is where we are with all the tech companies right now; we can’t trust their denials, just as we can’t trust the NSA—or the FBI—when it denies programs, capabilities, or practices.

Back in January, we wondered whom Skype lets spy on its users. Now we know.

Posted on June 20, 2013 at 2:42 PM

Government Secrets and the Need for Whistle-blowers

Yesterday, we learned that the NSA received all calling records from Verizon customers for a three-month period starting in April. That’s everything except the voice content: who called whom, where they were, how long the call lasted—for millions of people, both Americans and foreigners. This “metadata” allows the government to track the movements of everyone during that period, and build a detailed picture of who talks to whom. It’s exactly the same data the Justice Department collected about AP journalists.
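As a toy illustration of how bare calling records expose who talks to whom, here is a sketch using the networkx library with a handful of invented (caller, callee) pairs; a real analysis would run the same kind of graph computation over millions of records.

    # Sketch: calling records alone ("metadata") reveal social structure.
    # The records are invented; no call content is involved anywhere.
    import networkx as nx

    call_records = [
        ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
        ("carol", "dave"), ("dave", "eve"), ("eve", "alice"),
    ]

    g = nx.Graph()
    g.add_edges_from(call_records)

    # Centrality scores pick out the people who tie the network together.
    for person, score in sorted(nx.betweenness_centrality(g).items(),
                                key=lambda kv: -kv[1]):
        print(f"{person}: {score:.2f}")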

The Guardian delivered this revelation after receiving a copy of a secret memo about this—presumably from a whistle-blower. We don’t know if the other phone companies handed data to the NSA too. We don’t know if this was a one-off demand or a continuously renewed demand; the order started a few days after the Boston bombers were captured by police.

We don’t know a lot about how the government spies on us, but we know some things. We know the FBI has issued tens of thousands of ultra-secret National Security Letters to collect all sorts of data on people—we believe on millions of people—and has been abusing them to spy on cloud-computer users. We know it can collect a wide array of personal data from the Internet without a warrant. We also know that the FBI has been intercepting cell-phone data, all but voice content, for the past 20 years without a warrant, and can use the microphone on some powered-off cell phones as a room bug—presumably only with a warrant.

We know that the NSA has many domestic-surveillance and data-mining programs with codenames like Trailblazer, Stellar Wind, and Ragtime—deliberately using different codenames for similar programs to stymie oversight and conceal what’s really going on. We know that the NSA is building an enormous computer facility in Utah to store all this data, as well as faster computer networks to process it all. We know the U.S. Cyber Command employs 4,000 people.

We know that the DHS is also collecting a massive amount of data on people, and that local police departments are running “fusion centers” to collect and analyze this data, and covering up its failures. This is all part of the militarization of the police.

Remember in 2003, when Congress defunded the decidedly creepy Total Information Awareness program? It didn’t die; it just changed names and split into many smaller programs. We know that corporations are doing an enormous amount of spying on behalf of the government.

We know all of this not because the government is honest and forthcoming, but mostly through three backchannels—inadvertent hints or outright admissions by government officials in hearings and court cases, information gleaned from government documents received under FOIA, and government whistle-blowers.

There’s much more we don’t know, and often what we know is obsolete. We know quite a bit about the NSA’s ECHELON program from a 2000 European investigation, and about the DHS’s plans for Total Information Awareness from 2002, but much less about how these programs have evolved. We can make inferences about the NSA’s Utah facility based on the theoretical amount of data from various sources, the cost of computation, and the power requirements from the facility, but those are rough guesses at best. For a lot of this, we’re completely in the dark.

And that’s wrong.

The U.S. government is on a secrecy binge. It overclassifies more information than ever. And we learn, again and again, that our government regularly classifies things not because they need to be secret, but because their release would be embarrassing.

Knowing how the government spies on us is important. Not only because so much of it is illegal—or, to be as charitable as possible, based on novel interpretations of the law—but because we have a right to know. Democracy requires an informed citizenry in order to function properly, and transparency and accountability are essential parts of that. That means knowing what our government is doing to us, in our name. That means knowing that the government is operating within the constraints of the law. Otherwise, we’re living in a police state.

We need whistle-blowers.

Leaking information without getting caught is difficult. It’s almost impossible to maintain privacy in the Internet Age. The WikiLeaks platform seems to have been secure—Bradley Manning was caught not because of a technological flaw, but because someone he trusted betrayed him—but the U.S. government seems to have successfully destroyed it as a platform. None of the spin-offs have risen to become viable yet. The New Yorker recently unveiled its Strongbox platform for leaking material, which is still new but looks good. This link contains the best advice on how to leak information to the press via phone, email, or the post office. The National Whistleblowers Center has a page on national-security whistle-blowers and their rights.

Leaking information is also very dangerous. The Obama Administration has embarked on a war on whistle-blowers, pursuing them—both legally and through intimidation—further than any previous administration has done. Mark Klein, Thomas Drake, and William Binney have all been persecuted for exposing technical details of our surveillance state. Bradley Manning has been treated cruelly and inhumanely—and possibly tortured—for his more-indiscriminate leaking of State Department secrets.

The Obama Administration’s actions against the Associated Press, its persecution of Julian Assange, and its unprecedented prosecution of Manning on charges of “aiding the enemy” demonstrate how far it’s willing to go to intimidate whistle-blowers—as well as the journalists who talk to them.

But whistle-blowing is vital, even more broadly than in government spying. It’s necessary for good government, and to protect us from abuse of power.

We need details on the full extent of the FBI’s spying capabilities. We don’t know what information it routinely collects on American citizens, what extra information it collects on those on various watch lists, and what legal justifications it invokes for its actions. We don’t know its plans for future data collection. We don’t know what scandals and illegal actions—either past or present—are currently being covered up.

We also need information about what data the NSA gathers, either domestically or internationally. We don’t know how much it collects surreptitiously, and how much it relies on arrangements with various companies. We don’t know how much it uses password cracking to get at encrypted data, and how much it exploits existing system vulnerabilities. We don’t know whether it deliberately inserts backdoors into systems it wants to monitor, either with or without the permission of the communications-system vendors.

And we need details about the sorts of analysis the organizations perform. We don’t know what they quickly cull at the point of collection, and what they store for later analysis—and how long they store it. We don’t know what sort of database profiling they do, how extensive their CCTV and surveillance-drone analysis is, how much they perform behavioral analysis, or how extensively they trace friends of people on their watch lists.

We don’t know how big the U.S. surveillance apparatus is today, either in terms of money and people or in terms of how many people are monitored or how much data is collected. Modern technology makes it possible to monitor vastly more people—yesterday’s NSA revelations demonstrate that they could easily surveil everyone—than could ever be done manually.

Whistle-blowing is the moral response to immoral activity by those in power. What’s important here are government programs and methods, not data about individuals. I understand I am asking for people to engage in illegal and dangerous behavior. Do it carefully and do it safely, but—and I am talking directly to you, person working on one of these secret and probably illegal programs—do it.

If you see something, say something. There are many people in the U.S. who will appreciate and admire you.

For the rest of us, we can help by protesting this war on whistle-blowers. We need to force our politicians not to punish them—to investigate the abuses and not the messengers—and to ensure that those unjustly persecuted can obtain redress.

Our government is putting its own self-interest ahead of the interests of the country. That needs to change.

This essay originally appeared on the Atlantic.

EDITED TO ADD (6/10): It’s not just phone records. Another secret program, PRISM, gave the NSA access to e-mails and private messages at Google, Facebook, Yahoo!, Skype, AOL, and others. And in a separate leak, we now know about the Boundless Informant NSA data mining system.

The leaker for at least some of this is Edward Snowden. I consider him an American hero.

EFF has a great timeline of NSA spying. And this and this contain some excellent speculation about what PRISM could be.

Someone needs to write an essay parsing all of the precisely worded denials. Apple has never heard the word “PRISM,” but could have known of the program under a different name. Google maintained that there is no government “back door,” but left open the possibility that the data could have been just handed over. Obama said that the government isn’t “listening to your telephone calls,” ignoring 1) the meta-data, 2) the fact that computers could be doing all of the listening, and 3) that speech-to-text results in phone calls being read and not listened to. And so on and on and on.

Here are people defending the programs. And here’s someone criticizing my essay.

Four more good essays.

I’m sure there are lots more things out there that should be read. Please include the links in comments. Not only essays I would agree with; intelligent opinions from the other sides are just as important.

EDITED TO ADD (6/10): Two essays discussing the policy issues.

My original essay is being discussed on Reddit.

EDITED TO ADD (6/11): Three more good articles: “The Irrationality of Giving Up This Much Liberty to Fight Terror,” “If the NSA Trusted Edward Snowden with Our Data, Why Should We Trust the NSA?” and “Using Metadata to Find Paul Revere.”

EDITED TO ADD (6/11): NSA surveillance reimagined as children’s books.

EDITED TO ADD (7/1): This essay has been translated into Russian and French.

EDITED TO ADD (10/2): This essay has also been translated into Finnish.

Posted on June 10, 2013 at 6:12 AM

The Problems with CALEA-II

The FBI wants a new law that will make it easier to wiretap the Internet. Although its claim is that the new law will only maintain the status quo, it’s really much worse than that. This law will result in less-secure Internet products and create a foreign industry in more-secure alternatives. It will impose costly burdens on affected companies. It will assist totalitarian governments in spying on their own citizens. And it won’t do much to hinder actual criminals and terrorists.

As the FBI sees it, the problem is that people are moving away from traditional communication systems like telephones onto computer systems like Skype. Eavesdropping on telephones used to be easy. The FBI would call the phone company, which would bring agents into a switching room and allow them to literally tap the wires with a pair of alligator clips and a tape recorder. In the 1990s, the government forced phone companies to provide an analogous capability on digital switches; but today, more and more communication happens over the Internet.

What the FBI wants is the ability to eavesdrop on everything. Depending on the system, this ranges from easy to impossible. E-mail systems like Gmail are easy. The mail resides in Google’s servers, and the company has an office full of people who respond to requests for lawful access to individual accounts from governments all over the world. Encrypted voice systems like Silent Circle are impossible to eavesdrop on—the calls are encrypted from one computer to the other, and there’s no central node to eavesdrop from. In those cases, the only way to make the system eavesdroppable is to add a backdoor to the user software. This is precisely the FBI’s proposal. Companies that refuse to comply would be fined $25,000 a day.
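To make the "no central node" point concrete, here is a minimal sketch of end-to-end encryption between two endpoints using the Python cryptography package. It is not Silent Circle's actual protocol (which also authenticates the key exchange); it only shows that a relay sitting between the two parties handles nothing but ciphertext.

    # Sketch: two endpoints derive a shared key; a relay in the middle sees only ciphertext.
    # Illustrative only -- a real protocol must also authenticate the key exchange.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def derive_key(own_private, peer_public):
        shared = own_private.exchange(peer_public)
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"e2e-demo").derive(shared)

    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()
    alice_key = derive_key(alice_priv, bob_priv.public_key())
    bob_key = derive_key(bob_priv, alice_priv.public_key())
    assert alice_key == bob_key  # both endpoints hold the key; the relay never does

    nonce = os.urandom(12)
    ciphertext = ChaCha20Poly1305(alice_key).encrypt(nonce, b"hello from alice", None)
    # The "server" only forwards (nonce, ciphertext); decryption happens at the far endpoint.
    print(ChaCha20Poly1305(bob_key).decrypt(nonce, ciphertext, None))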

The FBI believes it can have it both ways: that it can open systems to its eavesdropping, but keep them secure from anyone else’s eavesdropping. That’s just not possible. It’s impossible to build a communications system that allows the FBI surreptitious access but doesn’t allow similar access by others. When it comes to security, we have two options: We can build our systems to be as secure as possible from eavesdropping, or we can deliberately weaken their security. We have to choose one or the other.

This is an old debate, and one we’ve been through many times. The NSA even has a name for it: the equities issue. In the 1980s, the equities debate was about export control of cryptography. The government deliberately weakened U.S. cryptography products because it didn’t want foreign groups to have access to secure systems. Two things resulted: fewer Internet products with cryptography, to the insecurity of everybody, and a vibrant foreign security industry based on the unofficial slogan “Don’t buy the U.S. stuff—it’s lousy.”

In 1993, the debate was about the Clipper Chip. This was another deliberately weakened security product, an encrypted telephone. The FBI convinced AT&T to add a backdoor that allowed for surreptitious wiretapping. The product was a complete failure. Again, why would anyone buy a deliberately weakened security system?

In 1994, the Communications Assistance for Law Enforcement Act mandated that U.S. companies build eavesdropping capabilities into phone switches. These were sold internationally; some countries liked having the ability to spy on their citizens. Of course, so did criminals, and there were public scandals in Greece (2005) and Italy (2006) as a result.

In 2012, we learned that every phone switch sold to the Department of Defense had security vulnerabilities in its surveillance system. And just this May, we learned that Chinese hackers breached Google’s system for providing surveillance data for the FBI.

The new FBI proposal will fail in all these ways and more. The bad guys will be able to get around the eavesdropping capability, either by building their own security systems—not very difficult—or buying the more-secure foreign products that will inevitably be made available. Most of the good guys, who don’t understand the risks or the technology, will not know enough to bother and will be less secure. The eavesdropping functions will 1) result in more obscure—and less secure—product designs, and 2) be vulnerable to exploitation by criminals, spies, and everyone else. U.S. companies will be forced to compete at a disadvantage; smart customers won’t buy the substandard stuff when there are more-secure foreign alternatives. Even worse, there are lots of foreign governments who want to use these sorts of systems to spy on their own citizens. Do we really want to be exporting surveillance technology to the likes of China, Syria, and Saudi Arabia?

The FBI’s shortsighted agenda also works against the parts of the government that are still working to secure the Internet for everyone. Initiatives within the NSA, the DOD, and DHS to do everything from securing computer operating systems to enabling anonymous web browsing will all be harmed by this.

What to do, then? The FBI claims that the Internet is “going dark,” and that it’s simply trying to maintain the status quo of being able to eavesdrop. This characterization is disingenuous at best. We are entering a golden age of surveillance; there’s more electronic communications available for eavesdropping than ever before, including whole new classes of information: location tracking, financial tracking, and vast databases of historical communications such as e-mails and text messages. The FBI’s surveillance department has it better than ever. With regard to voice communications, yes, software phone calls will be harder to eavesdrop upon. (Although there are questions about Skype’s security.) That’s just part of the evolution of technology, and one that on balance is a positive thing.

Think of it this way: We don’t hand the government copies of our house keys and safe combinations. If agents want access, they get a warrant and then pick the locks or bust open the doors, just as a criminal would do. A similar system would work on computers. The FBI, with its increasingly non-transparent procedures and systems, has failed to make the case that this isn’t good enough.

Finally there’s a general principle at work that’s worth explicitly stating. All tools can be used by the good guys and the bad guys. Cars have enormous societal value, even though bank robbers can use them as getaway cars. Cash is no different. Both good guys and bad guys send e-mails, use Skype, and eat at all-night restaurants. But because society consists overwhelmingly of good guys, the good uses of these dual-use technologies greatly outweigh the bad uses. Strong Internet security makes us all safer, even though it helps the bad guys as well. And it makes no sense to harm all of us in an attempt to harm a small subset of us.

This essay originally appeared in Foreign Policy.

Posted on June 4, 2013 at 12:44 PM

Transparency and Accountability

As part of the fallout of the Boston bombings, we’re probably going to get some new laws that give the FBI additional investigative powers. As with the Patriot Act after 9/11, the debate over whether these new laws are helpful will be minimal, but the effects on civil liberties could be large. Even though most people are skeptical about sacrificing personal freedoms for security, it’s hard for politicians to say no to the FBI right now, and it’s politically expedient to demand that something be done.

If our leaders can’t say no—and there’s no reason to believe they can—there are two concepts that need to be part of any new counterterrorism laws, and investigative laws in general: transparency and accountability.

Long ago, we realized that simply trusting people and government agencies to always do the right thing doesn’t work, so we need to check up on them. In a democracy, transparency and accountability are how we do that. It’s how we ensure that we get both effective and cost-effective government. It’s how we prevent those we trust from abusing that trust, and protect ourselves when they do. And it’s especially important when security is concerned.

First, we need to ensure that the stuff we’re paying money for actually works and has a measurable impact. Law-enforcement organizations regularly invest in technologies that don’t make us any safer. The TSA, for example, could devote an entire museum to expensive but ineffective systems: puffer machines, body scanners, FAST behavioral screening, and so on. Local police departments have been wasting lots of post-9/11 money on unnecessary high-tech weaponry and equipment. The occasional high-profile success aside, police surveillance cameras have been shown to be a largely ineffective police tool.

Sometimes honest mistakes lead organizations to invest in these technologies. Sometimes there’s self-deception and mismanagement—and far too often lobbyists are involved. Given the enormous amount of security money post-9/11, you inevitably end up with an enormous amount of waste. Transparency and accountability are how we keep all of this in check.

Second, we need to ensure that law enforcement does what we expect it to do and nothing more. Police powers are invariably abused. Mission creep is inevitable, and it results in laws designed to combat one particular type of crime being used for an ever-widening array of crimes. Transparency is the only way we have of knowing when this is going on.

For example, that’s how we learned that the FBI is abusing National Security Letters. Traditionally, we use the warrant process to protect ourselves from police overreach. It’s not enough for the police to want to conduct a search; they also need to convince a neutral third party—a judge—that the search is in the public interest and will respect the rights of those searched. That’s accountability, and it’s the very mechanism that NSLs were exempted from.

When laws are broken, accountability is how we punish those who abused their power. It’s how, for example, we correct racial profiling by police departments. And it’s a lack of accountability that permits the FBI to get away with massive data collection until exposed by a whistleblower or noticed by a judge.

Third, transparency and accountability keep both law enforcement and politicians from lying to us. The Bush Administration lied about the extent of the NSA’s warrantless wiretapping program. The TSA lied about the ability of full-body scanners to save naked images of people. We’ve been lied to about the lethality of tasers, when and how the FBI eavesdrops on cell-phone calls, and about the existence of surveillance records. Without transparency, we would never know.

A decade ago, the FBI was heavily lobbying Congress for a law to give it new wiretapping powers: a law known as CALEA. One of its key justifications was that existing law didn’t allow it to perform speedy wiretaps during kidnapping investigations. It sounded plausible—and who wouldn’t feel sympathy for kidnapping victims?—but when civil-liberties organizations analyzed the actual data, they found that it was just a story; there were no instances of wiretapping in kidnapping investigations. Without transparency, we would never have known that the FBI was making up stories to scare Congress.

If we’re going to give the government any new powers, we need to ensure that there’s oversight. Sometimes this oversight is before action occurs. Warrants are a great example. Sometimes they’re after action occurs: public reporting, audits by inspector generals, open hearings, notice to those affected, or some other mechanism. Too often, law enforcement tries to exempt itself from this principle by supporting laws that are specifically excused from oversight…or by establishing secret courts that just rubber-stamp government wiretapping requests.

Furthermore, we need to ensure that mechanisms for accountability have teeth and are used.

As we respond to the threat of terrorism, we must remember that there are other threats as well. A society without transparency and accountability is the very definition of a police state. And while a police state might have a low crime rate—especially if you don’t define police corruption and other abuses of power as crime—and an even lower terrorism rate, it’s not a society that most of us would willingly choose to live in.

We already give law enforcement enormous power to intrude into our lives. We do this because we know they need this power to catch criminals, and we’re all safer thereby. But because we recognize that a powerful police force is itself a danger to society, we must temper this power with transparency and accountability.

This essay previously appeared on TheAtlantic.com.

Posted on May 14, 2013 at 5:48 AM

Is the U.S. Government Recording and Saving All Domestic Telephone Calls?

I have no idea if “former counterterrorism agent for the FBI” Tom Clemente knows what he’s talking about, but that’s certainly what he implies here:

More recently, two sources familiar with the investigation told CNN that Russell had spoken with Tamerlan after his picture appeared on national television April 18.

What exactly the two said remains under investigation, the sources said.

Investigators may be able to recover the conversation, said Tom Clemente, a former counterterrorism agent for the FBI.

“We certainly have ways in national security investigations to find out exactly what was said in that conversation,” he told CNN’s Erin Burnett on Monday, adding that “all of that stuff is being captured as we speak whether we know it or like it or not.”

“It’s not necessarily something that the FBI is going to want to present in court, but it may help lead the investigation and/or lead to questioning of her,” he said.

I’m very skeptical about Clemente’s comments. He left the FBI shortly after 9/11, and he didn’t have any special security clearances. My guess is that he is speaking more about what the NSA and FBI could potentially do, and not about what they are doing right now. And I don’t believe that the NSA could save every domestic phone call, not at this time. Possibly after the Utah data center is finished, but not now. They could be saving all the metadata now, but I’m skeptical about that too.

Other commentary.

EDITED TO ADD (5/7): Interesting comments. I think it’s worth going through the math. There are two possible ways to do this. The first is to collect, compress, transport, and store. The second is to collect, convert to text, transport, and store. So, what data rates, processing requirements, and storage sizes are we talking about?
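Here is one way to run that back-of-envelope calculation; every input figure below is an assumption chosen for round numbers (call volume, codec bitrate, transcript size), not a known value, and the point is only the rough order of magnitude.

    # Back-of-envelope only: every input is an assumption, not a known figure.
    CALL_MINUTES_PER_DAY = 3e9        # assumed US domestic call volume, in minutes per day
    CODEC_BITRATE_BPS = 8_000         # assumed low-bitrate voice codec, ~8 kbit/s
    TRANSCRIPT_BYTES_PER_MIN = 1_000  # assumed ~150 words/minute of speech as plain text

    def petabytes(nbytes):
        return nbytes / 1e15

    audio_per_day = CALL_MINUTES_PER_DAY * 60 * CODEC_BITRATE_BPS / 8
    text_per_day = CALL_MINUTES_PER_DAY * TRANSCRIPT_BYTES_PER_MIN

    print(f"compressed audio: {petabytes(audio_per_day):.2f} PB/day, "
          f"{petabytes(audio_per_day) * 365:.0f} PB/year")
    print(f"text transcripts: {petabytes(text_per_day):.3f} PB/day, "
          f"{petabytes(text_per_day) * 365:.1f} PB/year")

Under those assumptions, storing everything as compressed audio runs to tens of petabytes a year, while storing only text transcripts is one to two orders of magnitude smaller, which is why the convert-to-text path is the more plausible of the two.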

Posted on May 7, 2013 at 12:57 PM

Intelligence Analysis and the Connect-the-Dots Metaphor

The FBI and the CIA are being criticized for not keeping better track of Tamerlan Tsarnaev in the months before the Boston Marathon bombings. How could they have ignored such a dangerous person? How do we reform the intelligence community to ensure this kind of failure doesn’t happen again?

It’s an old song by now, one we heard after the 9/11 attacks in 2001 and after the Underwear Bomber’s failed attack in 2009. The problem is that connecting the dots is a bad metaphor, and focusing on it makes us more likely to implement useless reforms.

Connecting the dots in a coloring book is easy and fun. They’re right there on the page, and they’re all numbered. All you have to do is move your pencil from one dot to the next, and when you’re done, you’ve drawn a sailboat. Or a tiger. It’s so simple that 5-year-olds can do it.

But in real life, the dots can only be numbered after the fact. With the benefit of hindsight, it’s easy to draw lines from a Russian request for information to a foreign visit to some other piece of information that might have been collected.

In hindsight, we know who the bad guys are. Before the fact, there are an enormous number of potential bad guys.

How many? We don’t know. But we know that the no-fly list had 21,000 people on it last year. The Terrorist Identities Datamart Environment, also known as the watch list, has 700,000 names on it.

We have no idea how many potential “dots” the FBI, CIA, NSA and other agencies collect, but it’s easily in the millions. It’s easy to work backwards through the data and see all the obvious warning signs. But before a terrorist attack, when there are millions of dots—some important but the vast majority unimportant—uncovering plots is a lot harder.
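The arithmetic behind that is worth making explicit. A sketch with assumed numbers (the 700,000 watch-list figure is from above; the screen's accuracy and the count of actual plotters are invented) shows that even a very accurate screen flags almost entirely innocent people:

    # Base-rate arithmetic with assumed inputs: a 99%-accurate screen over a
    # large watch list still produces overwhelmingly false alarms.
    WATCH_LIST_SIZE = 700_000     # TIDE figure cited above
    ACTUAL_PLOTTERS = 10          # assumption: real attackers on the list
    TRUE_POSITIVE_RATE = 0.99     # assumption: screen catches 99% of real plotters
    FALSE_POSITIVE_RATE = 0.01    # assumption: screen wrongly flags 1% of the innocent

    true_hits = ACTUAL_PLOTTERS * TRUE_POSITIVE_RATE
    false_hits = (WATCH_LIST_SIZE - ACTUAL_PLOTTERS) * FALSE_POSITIVE_RATE
    precision = true_hits / (true_hits + false_hits)

    print(f"people flagged:          {true_hits + false_hits:,.0f}")
    print(f"actual plotters flagged: {true_hits:.0f}")
    print(f"chance a flagged person is a plotter: {precision:.2%}")

Under those assumptions, roughly seven thousand people get flagged in order to find ten, and any given flagged individual is overwhelmingly likely to be harmless.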

Rather than thinking of intelligence as a simple connect-the-dots picture, think of it as a million unnumbered pictures superimposed on top of each other. Or a random-dot stereogram. Is it a sailboat, a puppy, two guys with pressure-cooker bombs, or just an unintelligible mess of dots? You try to figure it out.

It’s not a matter of not enough data, either.

Piling more data onto the mix makes it harder, not easier. The best way to think of it is a needle-in-a-haystack problem; the last thing you want to do is increase the amount of hay you have to search through. The television show Person of Interest is fiction, not fact.

There’s a name for this sort of logical fallacy: hindsight bias. First explained by psychologists Daniel Kahneman and Amos Tversky, it’s surprisingly common. Since what actually happened is so obvious once it happens, we overestimate how obvious it was before it happened.

We actually misremember what we once thought, believing that we knew all along that what happened would happen. It’s a surprisingly strong tendency, one that has been observed in countless laboratory experiments and real-world examples of behavior. And it’s what all the post-Boston-Marathon bombing dot-connectors are doing.

Before we start blaming agencies for failing to stop the Boston bombers, and before we push “intelligence reforms” that will shred civil liberties without making us any safer, we need to stop seeing the past as a bunch of obvious dots that need connecting.

Kahneman, a Nobel Prize winner, wisely noted: “Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.” Kahneman calls it “the illusion of understanding,” explaining that the past is only so understandable because we cast it as simple, inevitable stories and leave out the rest.

Nassim Taleb, an expert on risk engineering, calls this tendency the “narrative fallacy.” We humans are natural storytellers, and the world of stories is much more tidy, predictable and coherent than the real world.

Millions of people behave strangely enough to warrant the FBI’s notice, and almost all of them are harmless. It is simply not possible to find every plot beforehand, especially when the perpetrators act alone and on impulse.

We have to accept that there always will be a risk of terrorism, and that when the occasional plot succeeds, it’s not necessarily because our law enforcement systems have failed.

This essay previously appeared on CNN.

EDITED TO ADD (5/7): The hindsight bias was actually first discovered by Baruch Fischhoff: “Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty,” Journal of Experimental Psychology: Human Perception and Performance, 1(3), 1975, pp. 288-299.

Posted on May 7, 2013 at 6:10 AM

The Public/Private Surveillance Partnership

Our government collects a lot of information about us. Tax records, legal records, license records, records of government services received—it’s all in databases that are increasingly linked and correlated. Still, there’s a lot of personal information the government can’t collect. Either they’re prohibited by law from asking without probable cause and a judicial order, or they simply have no cost-effective way to collect it. But the government has figured out how to get around the laws, and collect personal data that has been historically denied to them: ask corporate America for it.

It’s no secret that we’re monitored continuously on the Internet. Some of the company names you know, such as Google and Facebook. Others hide in the background as you move about the Internet. There are browser plugins that show you who is tracking you. One Atlantic editor found 105 companies tracking him during one 36-hour period. Add data from your cell phone (who you talk to, your location), your credit cards (what you buy, from whom you buy it), and the dozens of other times you interact with a computer daily, and we live in a surveillance state beyond the dreams of Orwell.

It’s all corporate data, compiled and correlated, bought and sold. And increasingly, the government is doing the buying. Some of this is collected using National Security Letters (NSLs). These give the government the ability to demand an enormous amount of personal data about people for very speculative reasons, with neither probable cause nor judicial oversight. Data on these secretive orders is obviously scant, but we know that the FBI has issued hundreds of thousands of them in the past decade—for reasons that go far beyond terrorism.

NSLs aren’t the only way the government can get at corporate data. Sometimes they simply purchase it, just as any other company might. Sometimes they can get it for free, from corporations that want to stay on the government’s good side.

CISPA, a bill currently wending its way through Congress, codifies this sort of practice even further. If signed into law, CISPA will allow the government to collect all sorts of personal data from corporations, without any oversight at all, and will protect corporations from lawsuits based on their handing over that data. Without hyperbole, it’s been called the death of the 4th Amendment. Right now, it’s mainly the FBI and the NSA who are getting this data, but all sorts of government agencies have administrative subpoena power.

Data on this scale has all sorts of applications. From finding tax cheaters by comparing data brokers’ estimates of income and net worth with what’s reported on tax returns, to compiling a list of gun owners from Web browsing habits, instant messaging conversations, and locations—did you have your iPhone turned on when you visited a gun store?—the possibilities are endless.
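A toy version of the tax-cheat example, with invented records and made-up field names, using the pandas library: join a data broker's income estimates against reported income and flag the large discrepancies.

    # Sketch with invented data: flag people whose broker-estimated income far
    # exceeds what they reported. All names, fields, and figures are made up.
    import pandas as pd

    broker = pd.DataFrame({"person": ["alice", "bob", "carol"],
                           "estimated_income": [250_000, 60_000, 95_000]})
    returns = pd.DataFrame({"person": ["alice", "bob", "carol"],
                            "reported_income": [80_000, 58_000, 94_000]})

    merged = broker.merge(returns, on="person")
    merged["ratio"] = merged["estimated_income"] / merged["reported_income"]
    print(merged[merged["ratio"] > 1.5])  # candidates for a closer look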

Government photograph databases form the basis of any police facial recognition system. They’re not very good today, but they’ll only get better. But the government no longer needs to collect photographs. Experiments demonstrate that the Facebook database of tagged photographs is surprisingly effective at identifying people. As more places follow Disney’s lead in fingerprinting people at its theme parks, the government will be able to use that to identify people as well.

In a few years, the whole notion of a government-issued ID will seem quaint. Between facial recognition, the unique signature from your smart phone, the RFID chips in your clothing and other items you own, and whatever new technologies will broadcast your identity, no one will have to ask to see ID. When you walk into a store, they’ll already know who you are. When you interact with a policeman, she’ll already have your personal information displayed on her Internet-enabled glasses.

Soon, governments won’t have to bother collecting personal data. We’re willingly giving it to a vast network of for-profit data collectors, and they’re more than happy to pass it on to the government without our knowledge or consent.

This essay previously appeared on TheAtlantic.com.

EDITED TO ADD: This essay has been translated into French.

Posted on May 3, 2013 at 6:15 AM

Random Links on the Boston Terrorist Attack

Encouraging poll data says that maybe Americans are starting to have realistic fears about terrorism, or at least are refusing to be terrorized.

Good essay by Scott Atran on terrorism and our reaction.

Reddit apologizes. I think this is a big story. The Internet is going to help in everything, including trying to identify terrorists. This will happen whether or not the help is needed, wanted, or even helpful. I think this took the FBI by surprise. (Here’s a good commentary on this sort of thing.)

Facial recognition software didn’t help. I agree with this, though; it will only get better.

EDITED TO ADD (4/25): “Hapless, Disorganized, and Irrational”: John Mueller and Mark Stewart describe the Boston—and most other—terrorists.

Posted on April 25, 2013 at 6:42 AM

