Entries Tagged "risks"
I’ve already written about the DHS’s database of top terrorist targets and how dumb it is. Important sites are not on the list, and unimportant ones are. The reason is pork, of course; states get security money based on this list, so every state wants to make sure it has enough sites on it. And over the past five years, states with Republican congressmen got more money than states without.
Here’s another article on this general topic, centering around an obscure quantity: the square root of terrorist intent:
The Department of Homeland Security is the home of many mysteries. There is, of course, the color-coded system for gauging the threat of an attack. And there is the department’s database of national assets to protect against a terrorist threat, which includes Old MacDonald’s Petting Zoo in Woodville, Ala., and the Apple and Pork Festival in Clinton, Ill.
And now Jim O’Brien, the director of the Office of Emergency Management and Homeland Security in Clark County, Nev., has discovered another hard-to-fathom DHS notion: a mathematical value purporting to represent the square root of terrorist intent. The figure appears deep in the mind-numbingly complex risk-assessment formulas that the department used in 2006 to decide the likelihood that a place is or will become a terrorist target — an all-important estimate outside the Beltway, because greater slices of the federal anti-terrorism pie go to the locations with the highest scores. Overall, the department awarded $711 million in high-risk urban counterterrorism grants last year.
As O’Brien reviewed the risk-assessment formulas — a series of calculations that runs into the billions — he found himself unable to account for several factors, the terrorist-intent notion principal among them. “I have a Ph.D. I think I understand formulas,” he says. “Take the square root of terrorist intent? Now, give me a break.” The whole notion, O’Brien says, is a contradiction in terms: “How can you quantify what somebody is thinking?”
Other designations for variables in the formula are equally befuddling, O’Brien says, such as the “attractiveness factor,” which seeks to establish how terrorists might prefer one sort of target over another, and the “chatter factor,” which tries to gauge the intent of potential terror plotters based on communication intercepts.
“One man’s garbage is another man’s treasure,” he says. “So I don’t know how you measure attractiveness.” The chatter factor, meanwhile, leaves O’Brien entirely in the dark: “I’m not sure what that means.”
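Since DHS never published the 2006 formula in full, here is a purely hypothetical sketch, with invented variable names, weights, and inputs, of what a score built from a “square root of terrorist intent,” an “attractiveness factor,” and a “chatter factor” might look like. It also shows why square-rooting a fractional “intent” guess drew ridicule: the square root of a number between 0 and 1 is *larger* than the number itself.

```python
import math

def toy_risk_score(consequence, vulnerability, intent, attractiveness, chatter):
    """Hypothetical risk score in the spirit of the formulas O'Brien
    describes. Every input is an invented 0-to-1 "estimate"; nothing
    here reflects the actual (unpublished) DHS calculation."""
    # Square-rooting a 0-to-1 "intent" estimate inflates small values:
    # sqrt(0.04) = 0.2, so even a near-zero guess at what terrorists
    # are thinking gets a five-fold boost before it is multiplied in.
    threat = math.sqrt(intent) * attractiveness * chatter
    return consequence * vulnerability * threat

# A site with a near-zero "intent" guess still earns a non-trivial score,
# which is exactly the kind of opacity O'Brien is complaining about.
score = toy_risk_score(consequence=0.9, vulnerability=0.5,
                       intent=0.04, attractiveness=0.7, chatter=0.6)
print(score)
```

The sketch makes O’Brien’s objection concrete: the arithmetic is trivial, but the inputs (what somebody is thinking, how “attractive” a target is) are unmeasurable, so the precision of the formula is an illusion.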
What I said last time still applies:
We’re never going to get security right if we continue to make it a parody of itself.
I’ve written repeatedly about the difference between perceived and actual risk, and how it explains many seemingly perverse security trade-offs. Here’s a Los Angeles Times op-ed that does the same. The author is Daniel Gilbert, a psychology professor at Harvard. (I recently finished his book Stumbling on Happiness, which is not a self-help book but a book about how the brain works. Strongly recommended.)
The op-ed is about the public’s reaction to the risks of global warming and terrorism, but the points he makes are much more general. He gives four reasons why some risks are perceived to be more or less serious than they actually are:
- We over-react to intentional actions, and under-react to accidents, abstract events, and natural phenomena.
That’s why we worry more about anthrax (with an annual death toll of roughly zero) than influenza (with an annual death toll of a quarter-million to a half-million people). Influenza is a natural accident, anthrax is an intentional action, and the smallest action captures our attention in a way that the largest accident doesn’t. If two airplanes had been hit by lightning and crashed into a New York skyscraper, few of us would be able to name the date on which it happened.
- We over-react to things that offend our morals.
When people feel insulted or disgusted, they generally do something about it, such as whacking each other over the head, or voting. Moral emotions are the brain’s call to action.
He doesn’t say it, but it’s reasonable to assume that we under-react to things that don’t.
- We over-react to immediate threats and under-react to long-term threats.
The brain is a beautifully engineered get-out-of-the-way machine that constantly scans the environment for things out of whose way it should right now get. That’s what brains did for several hundred million years — and then, just a few million years ago, the mammalian brain learned a new trick: to predict the timing and location of dangers before they actually happened.
Our ability to duck that which is not yet coming is one of the brain’s most stunning innovations, and we wouldn’t have dental floss or 401(k) plans without it. But this innovation is in the early stages of development. The application that allows us to respond to visible baseballs is ancient and reliable, but the add-on utility that allows us to respond to threats that loom in an unseen future is still in beta testing.
- We under-react to changes that occur slowly and over time.
The human brain is exquisitely sensitive to changes in light, sound, temperature, pressure, size, weight and just about everything else. But if the rate of change is slow enough, the change will go undetected. If the low hum of a refrigerator were to increase in pitch over the course of several weeks, the appliance could be singing soprano by the end of the month and no one would be the wiser.
It’s interesting to compare this to what I wrote in Beyond Fear (pages 26-27) about perceived vs. actual risk:
- People exaggerate spectacular but rare risks and downplay common risks. They worry more about earthquakes than they do about slipping on the bathroom floor, even though the latter kills far more people than the former. Similarly, terrorism causes far more anxiety than common street crime, even though the latter claims many more lives. Many people believe that their children are at risk of being given poisoned candy by strangers at Halloween, even though there has been no documented case of this ever happening.
- People have trouble estimating risks for anything not exactly like their normal situation. Americans worry more about the risk of mugging in a foreign city, no matter how much safer it might be than where they live back home. Europeans routinely perceive the U.S. as being full of guns. Men regularly underestimate how risky a situation might be for an unaccompanied woman. The risks of computer crime are generally believed to be greater than they are, because computers are relatively new and the risks are unfamiliar. Middle-class Americans can be particularly naïve and complacent; their lives are incredibly secure most of the time, so their instincts about the risks of many situations have been dulled.
- Personified risks are perceived to be greater than anonymous risks. Joseph Stalin said, “A single death is a tragedy, a million deaths is a statistic.” He was right; large numbers have a way of blending into each other. The final death toll from 9/11 was less than half of the initial estimates, but that didn’t make people feel less at risk. People gloss over statistics of automobile deaths, but when the press writes page after page about nine people trapped in a mine — complete with human-interest stories about their lives and families — suddenly everyone starts paying attention to the dangers with which miners have contended for centuries. Osama bin Laden represents the face of Al Qaeda, and has served as the personification of the terrorist threat. Even if he were dead, it would serve the interests of some politicians to keep him “alive” for his effect on public opinion.
- People underestimate risks they willingly take and overestimate risks in situations they can’t control. When people voluntarily take a risk, they tend to underestimate it. When they have no choice but to take the risk, they tend to overestimate it. Terrorists are scary because they attack arbitrarily, and from nowhere. Commercial airplanes are perceived as riskier than automobiles, because the controls are in someone else’s hands — even though they’re much safer per passenger mile. Similarly, people overestimate even more those risks that they can’t control but think they, or someone, should. People worry about airplane crashes not because we can’t stop them, but because we think as a society we should be capable of stopping them (even if that is not really the case). While we can’t really prevent criminals like the two snipers who terrorized the Washington, DC, area in the fall of 2002 from killing, most people think we should be able to.
- Last, people overestimate risks that are being talked about and remain an object of public scrutiny. News, by definition, is about anomalies. Endless numbers of automobile crashes hardly make news like one airplane crash does. The West Nile virus outbreak in 2002 killed very few people, but it worried many more because it was in the news day after day. AIDS kills about 3 million people per year worldwide — about three times as many people each day as died in the terrorist attacks of 9/11. If a lunatic goes back to the office after being fired and kills his boss and two coworkers, it’s national news for days. If the same lunatic shoots his ex-wife and two kids instead, it’s local news…maybe not even the lead story.
Good essay on perceived vs. actual risk. The hook is Mayor Daley of Chicago demanding a no-fly-zone over Chicago in the wake of the New York City airplane crash.
Other politicians (with the spectacular and notable exception of New York City Mayor Michael Bloomberg) and self-appointed “experts” are jumping on the tragic accident — repeat, accident — in New York to sound off again about the “danger” of light aircraft, and how they must be regulated, restricted, banned.
OK, for all of those ranting about “threats” from GA aircraft, we’ll believe that you’re really serious about controlling “threats” when you call for:
- Banning all vans within cities. A small panel van was used in the first World Trade Center attack. The bomb, which weighed 1,500 pounds, killed six and injured 1,042.
- Banning all box trucks from cities. Timothy McVeigh’s rented Ryder truck carried a 5,000-pound bomb that killed 168 in Oklahoma City.
- Banning all semi-trailer trucks. They can carry bombs weighing more than 50,000 pounds.
- Banning newspapers on subways. That’s how the terrorists hid packages of sarin nerve gas in the Tokyo subway system. They killed 12.
- Banning backpacks on all buses and subways. That’s how the terrorists got the bombs into the London subway system. They killed 52.
- Banning all cell phones on trains. That’s how they detonated the bombs in backpacks placed on commuter trains in Madrid. They killed 191.
- Banning all small pleasure boats on public waterways. That’s how terrorists attacked the USS Cole, killing 17.
- Banning all heavy or bulky clothing in all public places. That’s how suicide bombers hide their murderous charges. Thousands killed.
Number of people killed by a terrorist attack using a GA aircraft? Zero.
Number of people injured by a terrorist attack using a GA aircraft? Zero.
Property damage from a terrorist attack using a GA aircraft? None.
So Mr. Mayor (and Mr. Governor, Ms. Senator, Mr. Congressman, and Mr. “Expert”), if you’re truly serious about “protecting” the public, advocate all of the bans I’ve listed above. Using the “logic” you apply to general aviation aircraft, you’re forced to conclude that newspapers, winter coats, cell phones, backpacks, trucks, and boats all pose much greater risks to the public.
So be consistent in your logic. If you are dead set on restricting a personal transportation system that carries more passengers than any single airline, reaches more American cities than all the airlines combined, provides employment for 1.3 million American citizens and $160 billion in business “to protect the public,” then restrict or control every other transportation system that the terrorists have demonstrated they can use to kill.
And, on the same topic, why it doesn’t make sense to ban small aircraft from cities as a terrorism defense.
From yesterday’s New York Times, “Ten Ways to Avoid the Next 9/11”:
If we are fortunate, we will open our newspapers this morning knowing that there have been no major terrorist attacks on American soil in nearly five years. Did we just get lucky?
The Op-Ed page asked 10 people with experience in security and counterterrorism to answer the following question: What is one major reason the United States has not suffered a major attack since 2001, and what is the one thing you would recommend the nation do in order to avoid attacks in the future?
Actually, they asked more than 10, myself included. But some of us were cut because they didn’t have enough space. This was my essay:
Despite what you see in the movies and on television, it’s actually very difficult to execute a major terrorist act. It’s hard to organize, plan, and execute an attack, and it’s all too easy to slip up and get caught. Combine that with our intelligence work tracking terrorist cells and interdicting terrorist funding, and you have a climate where major attacks are rare. In many ways, the success of 9/11 was an anomaly; there were many points where it could have failed. The main reason we haven’t seen another 9/11 is that it isn’t as easy as it looks.
Much of our counterterrorism effort is nothing more than security theater: ineffectual measures that look good. Forget the “war on terror”; the difficulty isn’t killing or arresting the terrorists, it’s finding them. Terrorism is a law enforcement problem, and needs to be treated as such. For example, none of our post-9/11 airline security measures would have stopped the London shampoo bombers. The lesson of London is that our best defense is intelligence and investigation. Rather than spending money on airline security, or sports stadium security — measures that require us to guess the plot correctly in order to be effective — we’re better off spending money on measures that are effective regardless of the plot.
Intelligence and investigation have kept us safe from terrorism in the past, and will continue to do so in the future. If the CIA and FBI had done a better job of coordinating and sharing data in 2001, 9/11 would have been another failed attempt. Coordination has gotten better, and those agencies are better funded — but it’s still not enough. Whenever you read about the billions being spent on national ID cards or massive data mining programs or new airport security measures, think about the number of intelligence agents that the same money could buy. That’s where we’re going to see the greatest return on our security investment.
Today is Ben Franklin’s 300th birthday. Among many other discoveries and inventions, Franklin worked out a way of protecting buildings from lightning strikes, by providing a conducting path to ground — outside a building — from one or more pointed rods high atop the structure. People tried this, and it worked. Franklin became a celebrity, not just among “electricians,” but among the general public.
An article in this month’s issue of Physics Today has a great 1769 quote by Franklin about lightning rods, and the reality vs. the feeling of security:
Those who calculate chances may perhaps find that not one death (or the destruction of one house) in a hundred thousand happens from that cause, and that therefore it is scarce worth while to be at any expense to guard against it. But in all countries there are particular situations of buildings more exposed than others to such accidents, and there are minds so strongly impressed with the apprehension of them, as to be very unhappy every time a little thunder is within their hearing; it may therefore be well to render this little piece of new knowledge as general and well understood as possible, since to make us safe is not all its advantage, it is some to make us easy. And as the stroke it secures us from might have chanced perhaps but once in our lives, while it may relieve us a hundred times from those painful apprehensions, the latter may possibly on the whole contribute more to the happiness of mankind than the former.
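Franklin’s closing argument is, in modern terms, an expected-value comparison: the material benefit of a lightning rod is tiny, but the psychological benefit accrues on every storm. A toy calculation (all numbers invented for illustration except his one-in-a-hundred-thousand figure) makes the structure of the argument explicit:

```python
# Illustrative expected-value reading of Franklin's 1769 argument.
# Only the 1-in-100,000 chance comes from Franklin; the other
# numbers are invented for the sketch.
p_destroyed = 1 / 100_000          # Franklin's estimate of lifetime lightning loss
house_value = 10_000               # hypothetical value of the house
expected_material_benefit = p_destroyed * house_value   # tiny

# But the rod also relieves "painful apprehensions" -- Franklin says
# "a hundred times" -- a benefit paid on every storm, not only in the
# rare event of a strike.
relief_occasions = 100             # storms endured without anxiety
value_per_relief = 1               # hypothetical value of peace of mind per storm
expected_psychological_benefit = relief_occasions * value_per_relief

print(expected_material_benefit, expected_psychological_benefit)
```

Under these (made-up) numbers the peace-of-mind term dwarfs the property-protection term, which is exactly Franklin’s point: making us feel safe can contribute more to happiness than making us safe.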