Entries Tagged "Boston Marathon bombings"


Metal Detectors at Sports Stadiums

Fans attending Major League Baseball games are being greeted in a new way this year: with metal detectors at the ballparks. Touted as a counterterrorism measure, they’re nothing of the sort. They’re pure security theater: They look good without doing anything to make us safer. We’re stuck with them because of a combination of buck passing, CYA thinking, and fear.

As a security measure, the new devices are laughable. The ballpark metal detectors are much more lax than the ones at an airport checkpoint. They aren’t very sensitive—people with phones and keys in their pockets are sailing through—and there are no X-ray machines. Bags get the same cursory search they’ve gotten for years. And fans wanting to avoid the detectors can opt for a “light pat-down search” instead.

There’s no evidence that this new measure makes anyone safer. A halfway competent ticketholder would have no trouble sneaking a gun into the stadium. For that matter, a bomb exploded at a crowded checkpoint would be no less deadly than one exploded in the stands. These measures will, at best, be effective at stopping the random baseball fan who’s carrying a gun or knife into the stadium. That may be a good idea, but unless there’s been a recent spate of fan shootings and stabbings at baseball games—and there hasn’t—this is a whole lot of time and money being spent to combat an imaginary threat.

But imaginary threats are the only ones baseball executives have to stop this season; there’s been no specific terrorist threat or actual intelligence to be concerned about. MLB executives forced this change on ballparks based on unspecified discussions with the Department of Homeland Security after the Boston Marathon bombing in 2013. Because, you know, that was also a sporting event.

This system of vague consultations and equally vague threats ensures that no single organization can be seen as responsible for the change. MLB can claim that the league and teams “work closely” with DHS. DHS can claim that it was MLB’s initiative. And both can safely relax because if something happens, at least they did something.

It’s an attitude I’ve seen before: “Something must be done. This is something. Therefore, we must do it.” Never mind if the something makes any sense or not.

In reality, this is CYA security, and it’s pervasive in post-9/11 America. It no longer matters if a security measure makes sense, if it’s cost-effective or if it mitigates any actual threats. All that matters is that you took the threat seriously, so if something happens you won’t be blamed for inaction. It’s security, all right—security for the careers of those in charge.

I’m not saying that these officials care only about their jobs and not at all about preventing terrorism, only that their priorities are skewed. They imagine vague threats, and come up with correspondingly vague security measures intended to address them. They experience none of the costs. They’re not the ones who have to deal with the long lines and confusion at the gates. They’re not the ones who have to arrive early to avoid the messes the new policies have caused around the league. And if fans spend more money at the concession stands because they’ve arrived an hour early and have had the food and drinks they tried to bring along confiscated, so much the better, from the team owners’ point of view.

I can hear the objections to this as I write. You don’t know these measures won’t be effective! What if something happens? Don’t we have to do everything possible to protect ourselves against terrorism?

That’s worst-case thinking, and it’s dangerous. It leads to bad decisions, bad design and bad security. A better approach is to realistically assess the threats, judge security measures on their effectiveness and take their costs into account. And the result of that calm, rational look will be the realization that there will always be places where we pack ourselves densely together, and that we should spend less time trying to secure those places and more time finding terrorist plots before they can be carried out.

So far, fans have been exasperated but mostly accepting of these new security measures. And this is precisely the problem—most of us don’t care all that much. Our options are to put up with these measures, or stay home. Going to a baseball game is not a political act, and metal detectors aren’t worth a boycott. But there’s an undercurrent of fear as well. If it’s in the name of security, we’ll accept it. As long as our leaders are scared of the terrorists, they’re going to continue the security theater. And we’re similarly going to accept whatever measures are forced upon us in the name of security. We’re going to accept the National Security Agency’s surveillance of every American, airport security procedures that make no sense and metal detectors at baseball and football stadiums. We’re going to continue to waste money overreacting to irrational fears.

We no longer need the terrorists. We’re now so good at terrorizing ourselves.

This essay previously appeared in the Washington Post.

Posted on April 15, 2015 at 6:58 AM

Analysis of the FBI's Failure to Stop the Boston Marathon Bombings

Detailed response and analysis of the inspectors general report on the Boston Marathon bombings:

Two opposite mistakes in an after-the-fact review of a terrorist incident are equally damaging. One is to fail to recognize the powerful difference between foresight and hindsight in evaluating how an investigative or intelligence agency should have behaved. After the fact, we know on whom we should have focused attention as a suspect, and we know what we should have protected as a target. With foresight alone, we know neither of these critically important clues to what happened and why. With hindsight, we can focus all of our attention narrowly; with foresight, we have to spread it broadly, as broadly as the imagination of our attackers may roam.

The second mistake is equally important. It is to confuse the fact that people in official positions, like others, will inevitably make mistakes in carrying out any complicated task, with the idea that no mistakes were really made. We can see mistakes with hindsight that can be avoided in the future by recognizing them clearly and designing solutions. After mistakes are made, nothing is more foolish than to hide them or pretend that they were not mistakes.

Posted on May 2, 2014 at 6:26 AM

The Story of the Bomb Squad at the Boston Marathon

This is interesting reading, but I’m left wanting more. What are the lessons here? How can we do this better next time? Clearly we won’t be able to anticipate bombings; even Israel can’t do that. We have to get better at responding.

Several years after 9/11, I conducted training with a military bomb unit charged with guarding Washington, DC. Our final exam was a nightmare scenario—a homemade nuke at the Super Bowl. Our job was to defuse it while the fans were still in the stands, there being no way to quickly and safely clear out 80,000 people. That scenario made two fundamental assumptions that are no longer valid: that there would be one large device and that we would find it before it detonated.

Boston showed that there’s another threat, one that looks a lot different. “We used to train for one box in a doorway. We went into a slower and less aggressive mode, meticulous, surgical. Now we’re transitioning to a high-speed attack, more maneuverable gear, no bomb suit until the situation has stabilized,” Gutzmer says. “We’re not looking for one bomber who places a device and leaves. We’re looking for an active bomber with multiple bombs, and we need to attack fast.”

A post-Boston final exam will soon look a lot different. Instead of a nuke at the Super Bowl, how about this: Six small bombs have already detonated, and now your job is to find seven more—among thousands of bags—while the bomber hides among a crowd of the fleeing, responding, wounded, and dead. Meanwhile the entire city overwhelms your backup with false alarms. Welcome to the new era of bomb work.

Posted on November 5, 2013 at 6:53 AM

Transparency and Accountability

As part of the fallout of the Boston bombings, we’re probably going to get some new laws that give the FBI additional investigative powers. As with the Patriot Act after 9/11, the debate over whether these new laws are helpful will be minimal, but the effects on civil liberties could be large. Even though most people are skeptical about sacrificing personal freedoms for security, it’s hard for politicians to say no to the FBI right now, and it’s politically expedient to demand that something be done.

If our leaders can’t say no—and there’s no reason to believe they can—there are two concepts that need to be part of any new counterterrorism laws, and investigative laws in general: transparency and accountability.

Long ago, we realized that simply trusting people and government agencies to always do the right thing doesn’t work, so we need to check up on them. In a democracy, transparency and accountability are how we do that. It’s how we ensure that we get both effective and cost-effective government. It’s how we prevent those we trust from abusing that trust, and protect ourselves when they do. And it’s especially important when security is concerned.

First, we need to ensure that the stuff we’re paying money for actually works and has a measurable impact. Law-enforcement organizations regularly invest in technologies that don’t make us any safer. The TSA, for example, could devote an entire museum to expensive but ineffective systems: puffer machines, body scanners, FAST behavioral screening, and so on. Local police departments have been wasting lots of post-9/11 money on unnecessary high-tech weaponry and equipment. The occasional high-profile success aside, police surveillance cameras have been shown to be a largely ineffective police tool.

Sometimes honest mistakes lead organizations to invest in these technologies. Sometimes there’s self-deception and mismanagement—and far too often lobbyists are involved. Given the enormous amount of post-9/11 security money, you inevitably end up with an enormous amount of waste. Transparency and accountability are how we keep all of this in check.

Second, we need to ensure that law enforcement does what we expect it to do and nothing more. Police powers are invariably abused. Mission creep is inevitable, and it results in laws designed to combat one particular type of crime being used for an ever-widening array of crimes. Transparency is the only way we have of knowing when this is going on.

For example, that’s how we learned that the FBI is abusing National Security Letters. Traditionally, we use the warrant process to protect ourselves from police overreach. It’s not enough for the police to want to conduct a search; they also need to convince a neutral third party—a judge—that the search is in the public interest and will respect the rights of those searched. That’s accountability, and it’s the very mechanism that NSLs were exempted from.

When laws are broken, accountability is how we punish those who abused their power. It’s how, for example, we correct racial profiling by police departments. And it’s a lack of accountability that permits the FBI to get away with massive data collection until exposed by a whistleblower or noticed by a judge.

Third, transparency and accountability keep both law enforcement and politicians from lying to us. The Bush Administration lied about the extent of the NSA’s warrantless wiretapping program. The TSA lied about the ability of full-body scanners to save naked images of people. We’ve been lied to about the lethality of tasers, when and how the FBI eavesdrops on cell-phone calls, and about the existence of surveillance records. Without transparency, we would never know.

Two decades ago, the FBI was heavily lobbying Congress for a law to give it new wiretapping powers: a law known as CALEA. One of its key justifications was that existing law didn’t allow it to perform speedy wiretaps during kidnapping investigations. It sounded plausible—and who wouldn’t feel sympathy for kidnapping victims?—but when civil-liberties organizations analyzed the actual data, they found that it was just a story; there were no instances of wiretapping in kidnapping investigations. Without transparency, we would never have known that the FBI was making up stories to scare Congress.

If we’re going to give the government any new powers, we need to ensure that there’s oversight. Sometimes this oversight is before action occurs. Warrants are a great example. Sometimes it’s after action occurs: public reporting, audits by inspectors general, open hearings, notice to those affected, or some other mechanism. Too often, law enforcement tries to exempt itself from this principle by supporting laws that are specifically excused from oversight…or by establishing secret courts that just rubber-stamp government wiretapping requests.

Furthermore, we need to ensure that mechanisms for accountability have teeth and are used.

As we respond to the threat of terrorism, we must remember that there are other threats as well. A society without transparency and accountability is the very definition of a police state. And while a police state might have a low crime rate—especially if you don’t define police corruption and other abuses of power as crime—and an even lower terrorism rate, it’s not a society that most of us would willingly choose to live in.

We already give law enforcement enormous power to intrude into our lives. We do this because we know they need this power to catch criminals, and we’re all safer thereby. But because we recognize that a powerful police force is itself a danger to society, we must temper this power with transparency and accountability.

This essay previously appeared on TheAtlantic.com.

Posted on May 14, 2013 at 5:48 AM

Is the U.S. Government Recording and Saving All Domestic Telephone Calls?

I have no idea if “former counterterrorism agent for the FBI” Tom Clemente knows what he’s talking about, but that’s certainly what he implies here:

More recently, two sources familiar with the investigation told CNN that Russell had spoken with Tamerlan after his picture appeared on national television April 18.

What exactly the two said remains under investigation, the sources said.

Investigators may be able to recover the conversation, said Tom Clemente, a former counterterrorism agent for the FBI.

“We certainly have ways in national security investigations to find out exactly what was said in that conversation,” he told CNN’s Erin Burnett on Monday, adding that “all of that stuff is being captured as we speak whether we know it or like it or not.”

“It’s not necessarily something that the FBI is going to want to present in court, but it may help lead the investigation and/or lead to questioning of her,” he said.

I’m very skeptical about Clemente’s comments. He left the FBI shortly after 9/11, and he didn’t have any special security clearances. My guess is that he is speaking more about what the NSA and FBI could potentially do, and not about what they are doing right now. And I don’t believe that the NSA could save every domestic phone call, not at this time. Possibly after the Utah data center is finished, but not now. They could be saving all the metadata now, but I’m skeptical about that too.

Other commentary.

EDITED TO ADD (5/7): Interesting comments. I think it’s worth going through the math. There are two possible ways to do this. The first is to collect, compress, transport, and store. The second is to collect, convert to text, transport, and store. So, what data rates, processing requirements, and storage sizes are we talking about?
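As a rough illustration of that math, here is a minimal back-of-the-envelope sketch of the two approaches. The call volume, codec rate, and transcript-size figures are my own assumptions for the sake of the exercise, not sourced statistics:

```python
# Back-of-the-envelope sizing of the two approaches described above:
#   (1) collect, compress, transport, and store the audio
#   (2) collect, convert to text, transport, and store the transcripts
# Every input figure is an assumption chosen for illustration, not a sourced statistic.

SECONDS_PER_MINUTE = 60

call_minutes_per_day = 3e9        # assumed: ~3 billion domestic call-minutes per day
audio_bytes_per_second = 1_000    # assumed: ~8 kbps speech codec, roughly 1 KB/s
text_bytes_per_minute = 1_000     # assumed: ~150 words/min at a few bytes per word

audio_per_day = call_minutes_per_day * SECONDS_PER_MINUTE * audio_bytes_per_second
text_per_day = call_minutes_per_day * text_bytes_per_minute

print(f"Compressed audio: ~{audio_per_day / 1e12:,.0f} TB/day, "
      f"~{audio_per_day * 365 / 1e15:,.0f} PB/year")
print(f"Text transcripts: ~{text_per_day / 1e12:,.1f} TB/day, "
      f"~{text_per_day * 365 / 1e15:,.1f} PB/year")
```

Under those assumptions, the audio path lands in the tens of petabytes per year and the text path in the low petabytes, which is why what the agencies can actually store and process is the interesting question.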

Posted on May 7, 2013 at 12:57 PM

Intelligence Analysis and the Connect-the-Dots Metaphor

The FBI and the CIA are being criticized for not keeping better track of Tamerlan Tsarnaev in the months before the Boston Marathon bombings. How could they have ignored such a dangerous person? How do we reform the intelligence community to ensure this kind of failure doesn’t happen again?

It’s an old song by now, one we heard after the 9/11 attacks in 2001 and after the Underwear Bomber’s failed attack in 2009. The problem is that connecting the dots is a bad metaphor, and focusing on it makes us more likely to implement useless reforms.

Connecting the dots in a coloring book is easy and fun. They’re right there on the page, and they’re all numbered. All you have to do is move your pencil from one dot to the next, and when you’re done, you’ve drawn a sailboat. Or a tiger. It’s so simple that 5-year-olds can do it.

But in real life, the dots can only be numbered after the fact. With the benefit of hindsight, it’s easy to draw lines from a Russian request for information to a foreign visit to some other piece of information that might have been collected.

In hindsight, we know who the bad guys are. Before the fact, there are an enormous number of potential bad guys.

How many? We don’t know. But we know that the no-fly list had 21,000 people on it last year. The Terrorist Identities Datamart Environment, also known as the watch list, has 700,000 names on it.

We have no idea how many potential “dots” the FBI, CIA, NSA and other agencies collect, but it’s easily in the millions. It’s easy to work backwards through the data and see all the obvious warning signs. But before a terrorist attack, when there are millions of dots—some important but the vast majority unimportant—uncovering plots is a lot harder.

Rather than thinking of intelligence as a simple connect-the-dots picture, think of it as a million unnumbered pictures superimposed on top of each other. Or a random-dot stereogram. Is it a sailboat, a puppy, two guys with pressure-cooker bombs, or just an unintelligible mess of dots? You try to figure it out.

It’s not a matter of not enough data, either.

Piling more data onto the mix makes it harder, not easier. The best way to think of it is a needle-in-a-haystack problem; the last thing you want to do is increase the amount of hay you have to search through. The television show Person of Interest is fiction, not fact.
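To make the haystack point concrete, here is a minimal base-rate sketch. The prevalence, hit-rate, and false-alarm figures are purely illustrative assumptions; the point is only that when the target is rare, even a very accurate detector buries its real hits under false alarms:

```python
# Illustrative base-rate arithmetic: why adding "hay" overwhelms even a good detector.
# All figures are assumptions chosen for illustration only.

population = 300_000_000      # assumed number of people generating "dots"
actual_plotters = 300         # assumed real plotters hidden in that population
true_positive_rate = 0.99     # assumed: the detector flags 99% of real plotters
false_positive_rate = 0.001   # assumed: it wrongly flags 0.1% of everyone else

flagged_real = actual_plotters * true_positive_rate
flagged_innocent = (population - actual_plotters) * false_positive_rate
precision = flagged_real / (flagged_real + flagged_innocent)

print(f"Total people flagged:     {flagged_real + flagged_innocent:,.0f}")
print(f"Real plotters among them: {flagged_real:,.0f}")
print(f"Chance a flag is real:    {precision:.2%}")
```

Collecting more data widens the pool of dots without changing the number of real plotters, which only drives that last number further down.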

There’s a name for this sort of logical fallacy: hindsight bias. First explained by psychologists Daniel Kahneman and Amos Tversky, it’s surprisingly common. Since what actually happened is so obvious once it happens, we overestimate how obvious it was before it happened.

We actually misremember what we once thought, believing that we knew all along that what happened would happen. It’s a surprisingly strong tendency, one that has been observed in countless laboratory experiments and real-world examples of behavior. And it’s what all the post-Boston-Marathon bombing dot-connectors are doing.

Before we start blaming agencies for failing to stop the Boston bombers, and before we push “intelligence reforms” that will shred civil liberties without making us any safer, we need to stop seeing the past as a bunch of obvious dots that need connecting.

Kahneman, a Nobel Prize winner, wisely noted: “Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.” Kahneman calls it “the illusion of understanding,” explaining that the past is only so understandable because we have cast it as simple, inevitable stories and left out the rest.

Nassim Taleb, an expert on risk engineering, calls this tendency the “narrative fallacy.” We humans are natural storytellers, and the world of stories is much more tidy, predictable and coherent than the real world.

Millions of people behave strangely enough to warrant the FBI’s notice, and almost all of them are harmless. It is simply not possible to find every plot beforehand, especially when the perpetrators act alone and on impulse.

We have to accept that there always will be a risk of terrorism, and that when the occasional plot succeeds, it’s not necessarily because our law enforcement systems have failed.

This essay previously appeared on CNN.

EDITED TO ADD (5/7): The hindsight bias was actually first discovered by Baruch Fischhoff: “Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty,” Journal of Experimental Psychology: Human Perception and Performance, 1(3), 1975, pp. 288-299.

Posted on May 7, 2013 at 6:10 AM

More Links on the Boston Terrorist Attacks

Max Abrahms has two sensible essays.

Probably the ultimate in security theater: Williams-Sonoma stops selling pressure cookers in the Boston area “out of respect.” They say it’s temporary. (I bought a Williams-Sonoma pressure cooker last Christmas; I wonder if I’m now on a list.)

A tragedy: Sunil Tripathi, whom Reddit and other sites wrongly identified as one of the bombers, was found dead in the Providence River. I hope it’s not a suicide.

And worst of all, New York Mayor Bloomberg scares me more than the terrorists ever could:

In the wake of the Boston Marathon bombings, Mayor Michael Bloomberg said Monday the country’s interpretation of the Constitution will “have to change” to allow for greater security to stave off future attacks.

“The people who are worried about privacy have a legitimate worry,” Mr. Bloomberg said during a press conference in Midtown. “But we live in a complex world where you’re going to have to have a level of security greater than you did back in the olden days, if you will. And our laws and our interpretation of the Constitution, I think, have to change.”

Terrorism’s effectiveness doesn’t come from the terrorist acts; it comes from our reactions to it. We need leaders who aren’t terrorized.

EDITED TO ADD (4/29): Only indirectly related, but the Kentucky Derby is banning “removable lens cameras” for security reasons.

EDITED TO ADD (4/29): And a totally unscientific CNN opinion poll: 57% answer no to the question “Is it justifiable to violate certain civil liberties in the name of national security?”

EDITED TO ADD (4/29): It seems that Sunil Tripathi died well before the Boston bombing. So while his family was certainly affected by the false accusations, he wasn’t.

EDITED TO ADD (4/29): On the difference between mass murder and terrorism:

What the United States means by terrorist violence is, in large part, “public violence some weirdo had the gall to carry out using a weapon other than a gun.”

EDITED TO ADD (5/14): On fear fatigue—and a good modeling of how to be indomitable. On the surprising dearth of terrorists. Why emergency medical response has improved since 9/11. What if the Boston bombers had been shooters instead. More on Williams-Sonoma: Shortly thereafter, they released a statement apologizing to anyone who might be offended. Don’t be terrorized. “The new terrorism”—from 2011 (in five parts, and this is the first one). This is kind of wordy, but it’s an interesting essay on the nature of fear…and cats. Glenn Greenwald on reactions to the bombing. How a 20-year-old Saudi victim of the bombing was instantly, and baselessly, converted by the US media and government into a “suspect.” Four effective responses to terrorism. People being terrorized. On not letting the bad guys win. Resilience. More resilience. Why terrorism works. Data shows that terrorism has declined. Mass hysteria as a terrorist weapon.

Posted on April 29, 2013 at 10:27 AM

Random Links on the Boston Terrorist Attack

Encouraging poll data says that maybe Americans are starting to have realistic fears about terrorism, or at least are refusing to be terrorized.

Good essay by Scott Atran on terrorism and our reaction.

Reddit apologizes. I think this is a big story. The Internet is going to help in everything, including trying to identify terrorists. This will happen whether or not the help is needed, wanted, or even helpful. I think this took the FBI by surprise. (Here’s a good commentary on this sort of thing.)

Facial recognition software didn’t help. I agree with this, though; it will only get better.

EDITED TO ADD (4/25): “Hapless, Disorganized, and Irrational”: John Mueller and Mark Stewart describe the Boston—and most other—terrorists.

Posted on April 25, 2013 at 6:42 AM

The Police Now Like Amateur Photography

PhotographyIsNotACrime.com points out the obvious: after years of warning us that photography is suspicious, the police were happy to accept all of those amateur photographs and videos at the Boston Marathon.

Adding to the hypocrisy is that these same authorities will most likely start clamping down on citizens with cameras more than ever once the smoke clears and we once again become a nation of paranoids willing to give up our freedoms in exchange for some type of perceived security.

After all, that is exactly how it played out in the years after the 9/11 terrorist attacks where it became impossible to photograph buildings, trains or airplanes without drawing the suspicion of authorities as potential terrorists.

Posted on April 23, 2013 at 12:34 PM

The Boston Marathon Bomber Manhunt

I generally give the police a lot of tactical leeway in times like this. The very armed and very dangerous suspects warranted extraordinary treatment. They were perfectly capable of killing again, taking hostages, planting more bombs—and we didn’t know the extent of the plot or the group. That’s why I didn’t object to the massive police dragnet, the city-wide lockdown, and so on.

Ross Anderson has a different take:

…a million people were under virtual house arrest; the 19-year-old fugitive from justice happened to be a Muslim. Whatever happened to the doctrine that infringements of one liberty to protect another should be necessary and proportionate?

In the London bombings, four idiots killed themselves in the first incident with a few dozen bystanders, but the second four failed and ran for it when their bombs didn’t go off. It didn’t occur to anyone to lock down London. They were eventually tracked down and arrested, together with their support team. Digital forensics played a big role; the last bomber to be caught left the country and changed his SIM, but not his IMEI. It’s next to impossible for anyone to escape nowadays if the authorities try hard.

He has a point, although I’m not sure I agree with it.

Opinions?

EDITED TO ADD (4/20): This makes the argument very well. On the other hand, readers are rightfully pointing out that the lockdown was in response to the shooting of a campus police officer, a carjacking, a firefight, and a vehicle chase with thrown bombs: the sort of thing that pretty much only happens in the movies.

EDITED TO ADD (4/20): More commentary on this Slashdot thread.

Posted on April 20, 2013 at 8:19 AM
