Essays in the Category “Terrorism”
Imagine this: A terrorist hacks into a commercial airplane from the ground, takes over the controls from the pilots and flies the plane into the ground. It sounds like the plot of some "Die Hard" reboot, but it's actually one of the possible scenarios outlined in a new Government Accountability Office report on security vulnerabilities in modern airplanes.
It's certainly possible, but in the scheme of Internet risks I worry about, it's not very high. I'm more worried about the more pedestrian attacks against more common Internet-connected devices.
Security theater meets America's pastime.
Fans attending Major League Baseball games are being greeted in a new way this year: with metal detectors at the ballparks. Touted as a counterterrorism measure, they're nothing of the sort. They're pure security theater: They look good without doing anything to make us safer. We're stuck with them because of a combination of buck passing, CYA thinking and fear.
The nation can survive the occasional terrorist attack, but our freedoms can't survive an invulnerable leader like Keith Alexander operating within inadequate constraints.
Leaks from the whistleblower Edward Snowden have catapulted the NSA into newspaper headlines and demonstrated that it has become one of the most powerful government agencies in the country. From the secret court rulings that allow it to collect data on all Americans to its systematic subversion of the entire Internet as a surveillance platform, the NSA has amassed an enormous amount of power.
There are two basic schools of thought about how this came to pass. The first focuses on the agency's power.
NSA apologists say spying is only used for menaces like "weapons of mass destruction" and "terror." But those terms have been radically redefined.
One of the assurances I keep hearing about the U.S. government's spying on American citizens is that it's only used in cases of terrorism. Terrorism is, of course, an extraordinary crime, and its horrific nature is supposed to justify permitting all sorts of excesses to prevent it. But there's a problem with this line of reasoning: mission creep.
Terrorism causes fear, and we overreact to that fear. Our brains aren't very good at probability and risk analysis. We tend to exaggerate spectacular, strange and rare events, and downplay ordinary, familiar and common ones. We think rare risks are more common than they are, and we fear them more than probability indicates we should.
As part of the fallout of the Boston bombings, we're probably going to get some new laws that give the FBI additional investigative powers. As with the Patriot Act after 9/11, the debate over whether these new laws are helpful will be minimal, but the effects on civil liberties could be large. Even though most people are skeptical about sacrificing personal freedoms for security, it's hard for politicians to say no to the FBI right now, and it's politically expedient to demand that something be done.
If our leaders can't say no—and there's no reason to believe they can—there are two concepts that need to be part of any new counterterrorism laws, and investigative laws in general: transparency and accountability.
The FBI and the CIA are being criticized for not keeping better track of Tamerlan Tsarnaev in the months before the Boston Marathon bombings. How could they have ignored such a dangerous person? How do we reform the intelligence community to ensure this kind of failure doesn't happen again?
It's an old song by now, one we heard after the 9/11 attacks in 2001 and after the Underwear Bomber's failed attack in 2009.
It is easy to feel scared and powerless in the wake of attacks like those at the Boston Marathon. But it also plays into the perpetrators' hands.
As the details about the bombings in Boston unfold, it'd be easy to be scared. It'd be easy to feel powerless and demand that our elected leaders do something—anything—to keep us safe.
It'd be easy, but it'd be wrong.
A core, not side, effect of technology is its ability to magnify power and multiply force—for both attackers and defenders. One side creates ceramic handguns, laser-guided missiles, and new identity-theft techniques, while the other side creates anti-missile defense systems, fingerprint databases, and automatic facial recognition systems.
The problem is that it's not balanced: Attackers generally benefit from new security technologies before defenders do. They have a first-mover advantage.
Horrific events, such as the massacre in Aurora, can be catalysts for social and political change. Sometimes it seems that they're the only catalyst; recall how drastically our policies toward terrorism changed after 9/11 despite how moribund they were before.
The problem is that fear can cloud our reasoning, causing us to overreact and to focus too narrowly on the specifics. The key is to steer our desire for change during that time of fear.
ABSTRACT: The problem of securing biological research data is a difficult and complicated one. Our ability to secure data on computers is not robust enough to ensure the security of existing data sets. Lessons from cryptography illustrate that neither secrecy measures, such as deleting technical details, nor national solutions, such as export controls, will work.
Science and Nature have each published papers on the H5N1 virus in humans after considerable debate about whether the research results in those papers could help terrorists create a bioweapon (1, 2).
A Debate between Sam Harris and Bruce Schneier
Return to Part 1
A profile that encompasses "anyone who could conceivably be Muslim" needs to include almost everyone. Anything less and you're missing known Muslim airplane terrorist wannabes.
SH: It includes a lot of people, but I wouldn't say almost everyone. In fact, I just flew out of San Jose this morning and witnessed a performance of security theater so masochistic and absurd that, given our ongoing discussion, it seemed too good to be true.
A Debate between Sam Harris and Bruce Schneier
Introduction by Sam Harris
I recently wrote two articles in defense of "profiling" in the context of airline security (1 & 2), arguing that the TSA should stop doing secondary screenings of people who stand no reasonable chance of being Muslim jihadists. I knew this proposal would be controversial, but I seriously underestimated how inflamed the response would be. Had I worked for a newspaper or a university, I could well have lost my job over it.
One thing that united many of my critics was their admiration for Bruce Schneier.
The Department of Homeland Security is getting rid of the color-coded threat level system. It was introduced after 9/11, and was supposed to tell you how likely a terrorist attack might be. Except that it never did.
Attacks happened more often when the level was yellow ("significant risk") than when it was orange ("high risk").
A heavily edited version of this essay appeared in the New York Daily News.
Securing the Washington Monument from terrorism has turned out to be a surprisingly difficult job. The concrete fence around the building protects it from attacking vehicles, but there's no visually appealing way to house the airport-level security mechanisms the National Park Service has decided are a must for visitors. It is considering several options, but I think we should close the monument entirely. Let it stand, empty and inaccessible, as a monument to our fears.
A short history of airport security: We screen for guns and bombs, so the terrorists use box cutters. We confiscate box cutters and corkscrews, so they put explosives in their sneakers. We screen footwear, so they try to use liquids. We confiscate liquids, so they put PETN bombs in their underwear.
As the details of the Times Square car bomb attempt emerge in the wake of Faisal Shahzad's arrest Monday night, one thing has already been made clear: Terrorism is fairly easy. All you need is a gun or a bomb, and a crowded target. Guns are easy to buy. Bombs are easy to make.
In the wake of Saturday's failed Times Square car bombing, it's natural to ask how we can prevent this sort of thing from happening again. The answer is stop focusing on the specifics of what actually happened, and instead think about the threat in general.
Think about the security measures commonly proposed. Cameras won't help.
We'll spend millions on new technology, and terrorists will just adapt
People intent on preventing a Moscow-style terrorist attack against the New York subway system are proposing a range of expensive new underground security measures, some temporary and some permanent.
They should save their money -- and instead invest every penny they're considering pouring into new technologies into intelligence and old-fashioned policing.
Intensifying security at specific stations only works against terrorists who aren't smart enough to move to another station. Cameras are useful only if all the stars align: The terrorists happen to walk into the frame, the video feeds are being watched in real time and the police can respond quickly enough to be effective.
President Obama in his speech last week rightly focused on fixing the intelligence failures that resulted in Umar Farouk Abdulmutallab being ignored, rather than on technologies targeted at the details of his underwear-bomb plot. But while Obama's instincts are right, reforming intelligence for this new century and its new threats is a more difficult task than he might like.
We don't need new technologies, new laws, new bureaucratic overlords, or - for heaven's sake - new agencies. What prevents information sharing among intelligence organizations is the culture of the generation that built those organizations.
In the headlong rush to "fix" security after the Underwear Bomber's unsuccessful Christmas Day attack, there's far too little discussion about what worked and what didn't, and what will and will not make us safer in the future.
The security checkpoints worked. Because we screen for obvious bombs, Umar Farouk Abdulmutallab -- or, more precisely, whoever built the bomb -- had to construct a far less reliable bomb than he would have otherwise. Instead of using a timer or a plunger or a reliable detonation mechanism, as would any commercial user of PETN, he had to resort to an ad hoc and much more inefficient homebrew mechanism: one involving a syringe and 20 minutes in the lavatory and we don't know exactly what else.
The Underwear Bomber failed. And our reaction to the failed plot is failing as well, by focusing on the specifics of this made-for-a-movie plot rather than the broad threat. While our reaction is predictable, it's not going to make us safer.
We're going to beef up airport security, because Umar Farouk Abdulmutallab allegedly snuck a bomb through a security checkpoint.
There are two kinds of profiling. There's behavioral profiling based on how someone acts, and there's automatic profiling based on name, nationality, method of ticket purchase, and so on. The first one can be effective, but is very hard to do right. The second one makes us all less safe.
Last week's attempted terror attack on an airplane heading from Amsterdam to Detroit has given rise to a bunch of familiar questions.
How did the explosives get past security screening? What steps could be taken to avert similar attacks? Why wasn't there an air marshal on the flight?
We need to move beyond security measures that look good on television to those that actually work, argues Bruce Schneier.
Terrorism is rare, far rarer than many people think. It's rare because very few people want to commit acts of terrorism, and executing a terrorist plot is much harder than television makes it appear. The best defences against terrorism are largely invisible: investigation, intelligence, and emergency response. But even these are less effective at keeping us safe than our social and political policies, both at home and abroad.
A couple of years ago, the Department of Homeland Security hired a bunch of science fiction writers to come in for a day and think of ways terrorists could attack America. If our inability to prevent 9/11 marked a failure of imagination, as some said at the time, then who better than science fiction writers to inject a little imagination into counterterrorism planning?
I discounted the exercise at the time, calling it "embarrassing." I never thought that 9/11 was a failure of imagination. I thought, and still think, that 9/11 was primarily a confluence of three things: the dual failure of centralized coordination and local control within the FBI, and some lucky breaks on the part of the attackers.
Terrorists attacking our food supply is a nightmare scenario that has been given new life during the recent swine flu outbreak. Although it seems easy to do, understanding why it hasn't happened is important. GR Dalziel, at the Nanyang Technological University in Singapore, has written a report chronicling every confirmed case of malicious food contamination in the world since 1950: 365 cases in all, plus 126 additional unconfirmed cases. What he found demonstrates the reality of terrorist food attacks.
This essay also appeared in The Hindu, Brisbane Times, and The Sydney Morning Herald.
It regularly comes as a surprise to people that our own infrastructure can be used against us. And in the wake of terrorist attacks or plots, there are fear-induced calls to ban, disrupt or control that infrastructure. According to officials investigating the Mumbai attacks, the terrorists used images from Google Earth to help learn their way around.
It's not true that no one worries about terrorists attacking chemical plants. It's just that our politics seem to leave us unable to deal with the threat. Toxins such as ammonia, chlorine, propane and flammable mixtures are being produced or stored as a result of legitimate industrial processes. Chlorine gas is particularly toxic; in addition to bombing a plant, someone could hijack a chlorine truck or blow up a railcar.
Most counterterrorism policies fail, not because of tactical problems, but because of a fundamental misunderstanding of what motivates terrorists in the first place. If we're ever going to defeat terrorism, we need to understand what drives people to become terrorists.
Conventional wisdom holds that terrorism is inherently political, and that people become terrorists for political reasons. This is the "strategic" model of terrorism, and it's basically an economic model.
We spend far more effort defending our countries against specific movie-plot threats, rather than the real, broad threats. In the US during the months after the 9/11 attacks, we feared terrorists with scuba gear, terrorists with crop dusters and terrorists contaminating our milk supply. Both the UK and the US fear terrorists with small bottles of liquid. Our imaginations run wild with vivid specific threats.
We've opened up a new front on the war on terror. It's an attack on the unique, the unorthodox, the unexpected. It's a war on different. If you act different, you might find yourself investigated, questioned and even arrested -- even if you did nothing wrong, and had no intention of doing anything wrong.
It's not true that no one worries about terrorists attacking chemical plants, it's just that our politics seem to leave us unable to deal with the threat.
Toxins such as ammonia, chlorine, propane and flammable mixtures are constantly being produced or stored in the United States as a result of legitimate industrial processes. Chlorine gas is particularly toxic; in addition to bombing a plant, someone could hijack a chlorine truck or blow up a railcar. Phosgene is even more dangerous.
Two people are sitting in a room together: an experimenter and a subject. The experimenter gets up and closes the door, and the room becomes quieter. The subject is likely to believe that the experimenter's purpose in closing the door was to make the room quieter.
This is an example of correspondent inference theory.
The recently publicized terrorist plot to blow up John F. Kennedy International Airport, like so many of the terrorist plots over the past few years, is a study in alarmism and incompetence: on the part of the terrorists, our government and the press.
Terrorism is a real threat, and one that needs to be addressed by appropriate means. But allowing ourselves to be terrorized by wannabe terrorists and unrealistic plots -- and worse, allowing our essential freedoms to be lost by using them as an excuse -- is wrong.
The alleged plan, to blow up JFK's fuel tanks and a small segment of the 40-mile petroleum pipeline that supplies the airport, was ridiculous.
Everyone had a reaction to the horrific events of the Virginia Tech shootings. Some of those reactions were rational. Others were not.
A high school student was suspended for customizing a first-person shooter game with a map of his school.
Data mining for terrorists: It's an idea that just won't die. But it won't find any terrorists, it puts us at greater risk of crimes like identity theft, and it gives the police far too much power in a free society.
The first massive government program to collect dossiers on every American for data mining purposes was called Total Information Awareness. The public found the idea so abhorrent, and objected so forcefully, that Congress killed funding for the program in September 2003.
It's called "splash-and-grab," and it's a new way to rob convenience stores. Two guys walk into a store, and one comes up to the counter with a cup of hot coffee or cocoa. He pays for it, and when the clerk opens the cash drawer, he throws the coffee in the clerk's face. The other one grabs the cash drawer, and they both run.
Since 9/11, we've spent hundreds of billions of dollars defending ourselves from terrorist attacks. Stories about the ineffectiveness of many of these security measures are common, but less so are discussions of why they are so ineffective. In short: Much of our country's counterterrorism security spending is not designed to protect us from the terrorists, but instead to protect our public officials from criticism when another attack occurs.
Boston, Jan. 31: As part of a guerilla marketing campaign, a series of amateur-looking blinking signs depicting characters in Aqua Teen Hunger Force, a show on the Cartoon Network, were placed on bridges, near a medical center, underneath an interstate highway and in other crowded public places.
You've seen them: those large concrete blocks in front of skyscrapers, monuments and government buildings, designed to protect against car and truck bombs. They sprang up like weeds in the months after 9/11, but the idea is much older. The prettier ones doubled as planters; the uglier ones just stood there.
Form follows function.
On Aug. 16, two men were escorted off a plane headed for Manchester, England, because some passengers thought they looked either Asian or Middle Eastern, might have been talking Arabic, wore leather jackets, and looked at their watches -- and the passengers refused to fly with them on board.
The men were questioned for several hours and then released.
On Aug. 15, an entire airport terminal was evacuated because someone's cosmetics triggered a false positive for explosives. The same day, a Muslim man was removed from an airplane in Denver for reciting prayers.
It's easy to defend against what they planned last time, but it's shortsighted.
Hours-long waits in the security line. Ridiculous prohibitions on what you can carry onboard. Last week's foiling of a major terrorist plot and the subsequent airport security graphically illustrates the difference between effective security and security theater.
None of the airplane security measures implemented because of 9/11 -- no-fly lists, secondary screening, prohibitions against pocket knives and corkscrews -- had anything to do with last week's arrests.
For a while now, I have been writing about our penchant for "movie-plot threats" -- terrorist fears based on very specific attack scenarios.
Terrorists with crop-dusters, terrorists exploding baby carriages in subways, terrorists filling school buses with explosives -- these are all movie-plot threats. They're good for scaring people, but it's just silly to build national security policy around them.
But if we're going to worry about unlikely attacks, why can't they be exciting and innovative ones?
Better to Put People, Not Computers, in Charge of Investigating Potential Plots
Collecting information about every American's phone calls is an example of data mining. The basic idea is to collect as much information as possible on everyone, sift through it with massive computers, and uncover terrorist plots. It's a compelling idea, and convinces many. But it's wrong.
It seems like every time someone tests airport security, airport security fails. In tests between November 2001 and February 2002, screeners missed 70 percent of knives, 30 percent of guns and 60 percent of (fake) bombs. And recently, testers were able to smuggle bomb-making parts through airport security in 21 of 21 attempts. It makes you wonder why we're all putting our laptops in a separate bin and taking off our shoes.
Does it make sense to surrender management, including security, of six U.S. ports to a Dubai-based company? This question has set off a heated debate between the administration and Congress, as members of both parties condemned the deal.
Most of the rhetoric is political posturing, but there's an interesting security issue embedded in the controversy.
Sometimes it seems like the people in charge of homeland security spend too much time watching action movies. They defend against specific movie plots instead of against the broad threats of terrorism.
We all do it. Our imaginations run wild with detailed and specific threats.
In the post-9/11 world, there's much focus on connecting the dots. Many believe data mining is the crystal ball that will enable us to uncover future terrorist plots. But even in the most wildly optimistic projections, data mining isn't tenable for that purpose. We're not trading privacy for security; we're giving up privacy and getting no security in return.
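The arithmetic behind this argument is the base-rate problem: when the event being searched for is extremely rare, even a very accurate detector buries its true hits under false alarms. The numbers below are hypothetical, chosen only to illustrate the effect, not drawn from any real system.

```python
# Base-rate sketch with hypothetical numbers: even a detector that is
# 99% accurate on real plotters and wrong about only 0.1% of innocent
# people yields almost nothing but false positives when plotters are rare.
population = 300_000_000        # assumed people under surveillance
terrorists = 1_000              # assumed actual plotters (a generous guess)
true_positive_rate = 0.99       # fraction of real plotters the system flags
false_positive_rate = 0.001     # fraction of innocent people wrongly flagged

flagged_real = terrorists * true_positive_rate
flagged_innocent = (population - terrorists) * false_positive_rate

# Probability that a flagged person is actually a terrorist (precision):
precision = flagged_real / (flagged_real + flagged_innocent)

print(f"innocent people flagged: {flagged_innocent:,.0f}")
print(f"chance a flagged person is a real plotter: {precision:.2%}")
```

Under these assumptions roughly 300,000 innocent people are flagged for every thousand real plotters, and fewer than one flag in three hundred points at an actual terrorist, which is the sense in which we give up privacy and get no security in return.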
The World Series is no stranger to security. Fans try to sneak into the ballpark without tickets or with counterfeit tickets. Often foods and alcohol are prohibited from being brought into the ballpark, to enforce the monopoly of the high-priced concessions.
Violence is always a risk: both small fights and larger-scale riots that result from fans from both teams being in such close proximity -- like the one that almost happened during the sixth game of the American League Championship Series.
How would we know? An essay by one of the world's busiest security experts.
As I read the litany of terror threat warnings that the government has issued in the past three years, the thing that jumps out at me is how vague they are. The careful wording implies everything without actually saying anything. We hear "terrorists might try to bomb buses and rail lines in major U.S. cities."
Want to learn how to create and sustain psychosis on a national scale? Look carefully at the public statements made by the Department of Homeland Security.
Here are a few random examples: "Weapons of mass destruction, including those containing chemical, biological or radiological agents or materials, cannot be discounted." "At least one of these attacks could be executed by the end of the summer 2003." "These credible sources suggest the possibility of attacks against the homeland around the holiday season and beyond."
The DHS's threat warnings have been vague, indeterminate, and unspecific. The threat index goes from yellow to orange and back again, although no one is entirely sure what either level means.
If you're watching the Olympic games on television, you've already seen the unprecedented security surrounding the 2004 Games. You've seen shots of guards and soldiers, and gunboats and frogmen patrolling the harbors.
But there's a lot more security behind the scenes. Olympic press materials state that there is a system of 1250 infrared and high-resolution surveillance cameras mounted on concrete poles.
If you fly out of Logan Airport and don't want to take off your shoes for the security screeners and get your bags opened up, pay attention. The US government is testing its "Trusted Traveler" program, and Logan is the fourth test airport. Currently, only American Airlines frequent fliers are eligible, but if all goes well the program will be opened up to more people and more airports.
Participants provide their name, address, phone number, and birth date, a set of fingerprints, and a retinal scan.
Last Tuesday's bomb scare contains valuable security lessons, both good and bad, about how to achieve security in these dangerous times.
Ninety minutes after taking off from Sydney Airport, a flight attendant on a United Airlines flight bound for Los Angeles found an airsickness bag -- presumably unused -- in a lavatory with the letters "BOB" written on it.
The flight attendant decided that the letters stood for "Bomb On Board" and immediately alerted the captain, who decided the risk was serious enough to turn the plane around and land back in Sydney.
Even a moment's reflection is enough to realise that this is an extreme over-reaction to a non-existent threat.
Want to help fight terrorism? Want to be able to stop and detain suspicious characters? Or do you just want to ride your horse on ten miles of trails normally closed to the public? Then you might want to join the George Bush Intercontinental (IAH) Airport Rangers program.
As the U.S. Supreme Court decides three legal challenges to the Bush administration's legal maneuverings against terrorism, it is important to keep in mind how critical these cases are to our nation's security. Security is multifaceted; there are many threats from many different directions. It includes the security of people against terrorism, and also the security of people against tyrannical government.
Posturing, pontifications, and partisan politics aside, the one clear generalization that emerges from the 9/11 hearings is that information--timely, accurate, and free-flowing--is critical in our nation's fight against terrorism. Our intelligence and law-enforcement agencies need this information to better defend our nation, and our citizens need this information to better debate massive financial expenditures for anti-terrorist measures, changes in law that aid law enforcement and diminish civil liberties, and the upcoming Presidential election.
The problem is that the current administration has consistently used terrorism information for political gain. Again and again, the Bush administration has exaggerated the terrorist threat for political purposes. They're embarked on a re-election strategy that involves a scared electorate voting for the party that is perceived to be better able to protect them.
Every day, some 82,000 foreign visitors set foot in the US with a visa, and since early this year, most of them have been fingerprinted and photographed in the name of security. But despite the money spent, the inconveniences suffered, and the international ill will caused, these new measures, like most instituted in the wake of September 11, are mostly ineffectual.
Terrorist attacks are very rare. So rare, in fact, that the odds of being the victim of one in an industrialized country are almost nonexistent.
The fact that U.S. intelligence agencies can't tell terrorists from children on passenger jets does little to inspire confidence.
Security can fail in two different ways. It can fail to work in the presence of an attack: a burglar alarm that a burglar successfully defeats. But security can also fail to work correctly when there's no attack: a burglar alarm that goes off even if no one is there.
In 2004 the United States will spend many billions of dollars on security. Unfortunately, most of that money is thrown out the window: this buildup brings no real protection.
BY BRUCE SCHNEIER
September 11, 2001 left behind a trauma. Since the terrorist attacks, Americans have needed to feel more secure.
In September 2002, JetBlue Airways secretly turned over data about 1.5 million of its passengers to a company called Torch Concepts, under contract with the Department of Defense.
Torch Concepts merged this data with Social Security numbers, home addresses, income levels and automobile records that it purchased from another company, Acxiom Corp. All this was to test a profiling system designed to automatically assign each person a terrorist threat ranking.
A joint congressional intelligence inquiry has concluded that 9/11 could have been prevented if our nation's intelligence agencies shared information better and coordinated more effectively. This is both a trite platitude and a profound prescription.
Intelligence is easy to understand after the fact. With the benefit of hindsight, it's easy to draw lines from people in flight school here, to secret meetings in foreign countries there, over to interesting tips from informants, and maybe to INS records.
THERE'S considerable confusion between the concepts of secrecy and security, and it is causing a lot of bad security and some surprising political arguments. Secrecy is not the same as security, and most of the time secrecy contributes to a false feeling of security instead of to real security.
Last month, the SQL Slammer worm ravaged the Internet, infecting some 75,000 servers in roughly ten minutes and disrupting services as diverse as the 911 network in Seattle and many of Bank of America's 13,000 ATMs. The worm took advantage of a software vulnerability in a Microsoft database management program, one that allowed a malicious piece of software to take control of the computer.
The events of 11 September offer a rare chance to rethink public security.
Appalled by the events of 11 September, many Americans have declared so loudly that they are willing to give up civil liberties in the name of security that this trade-off seems to be a fait accompli. Article after article in the popular media debates the 'balance' of privacy and security -- are various types of increase in security worth the consequent losses to privacy and civil liberty? Rarely do I see discussion about whether this linkage is valid.
Security and privacy are not two sides of an equation.