Entries Tagged "risk assessment"

Page 13 of 21

Teaching Risk Analysis in School

Good points:

“I regard myself as part of a movement we call risk literacy,” Professor Spiegelhalter told The Times. “It should be a basic component of discussion about issues in media, politics and in schools.

“We should essentially be teaching the ability to deconstruct the latest media story about a cancer risk or a wonder drug, so people can work out what it means. Really, that should be part of everyone’s language.”

As an aspect of science, risk was “as important as learning about DNA, maybe even more important,” he said. “The only problem is putting it on the curriculum: that can be the kiss of death. At the moment we can do it as part of maths outreach, maths inspiration, which is a real privilege because we can make it fun. It’s not teaching to an exam. But I actually think it should be in there, partly to make the curriculum more interesting.”

Reminds me of John Paulos’s Innumeracy.

Posted on January 26, 2009 at 1:55 PM

"The Cost of Fearing Strangers"

Excellent essay from the Freakonomics blog:

As we wrote in Freakonomics, most people are pretty terrible at risk assessment. They tend to overstate the risk of dramatic and unlikely events at the expense of more common and boring (if equally devastating) events. A given person might fear a terrorist attack and mad cow disease more than anything in the world, whereas in fact she’d be better off fearing a heart attack (and therefore taking care of herself) or salmonella (and therefore washing her cutting board thoroughly).

Why do we fear the unknown more than the known? That’s a larger question than I can answer here (not that I’m capable anyway), but it probably has to do with the heuristics—the shortcut guesses—our brains use to solve problems, and the fact that these heuristics rely on the information already stored in our memories.

And what gets stored away? Anomalies—the big, rare, “black swan” events that are so dramatic, so unpredictable, and perhaps world-changing, that they imprint themselves on our memories and con us into thinking of them as typical, or at least likely, whereas in fact they are extraordinarily rare.

Nothing I haven’t said before. Remember, if it’s in the news, don’t worry about it. The very definition of news is “something that almost never happens.” When something is so common that it’s no longer news—car crashes, domestic violence—that’s when you should worry about it.

Posted on January 19, 2009 at 6:19 AM

"Nut Allergy" Fear and Overreaction

Good article:

Nicholas Christakis, professor of medical sociology at Harvard Medical School, told the BMJ there was “a gross over-reaction to the magnitude of the threat” posed by food allergies, and particularly nut allergies.

In the US, serious allergic reactions to foods cause just 2,000 of more than 30 million hospitalisations a year and comparatively few deaths—150 a year from all food allergies combined.

In the UK there are around 10 deaths each year from food allergies.

Professor Christakis said the issue was not whether nut allergies existed or whether they could occasionally be serious. Nor was the issue whether reasonable preventative steps should be taken for the few children who had documented serious allergies, he argued.

“The issue is what accounts for the extreme responses to nut allergies.”

He said the number of US schools declaring themselves to be entirely “nut free”—banning staples like peanut butter, homemade baked goods and any foods without detailed ingredient labels—was rising, despite clear evidence that such restrictions were unnecessary.

“School entrances have signs admonishing visitors to wash their hands before entry to avoid [nut] contamination.”

He said these responses were extreme and had many of the hallmarks of mass psychogenic illness (MPI), previously known as epidemic hysteria.

Sound familiar?

Posted on December 19, 2008 at 6:56 AM

Online Age Verification

A discussion of a security trade-off:

Child-safety activists charge that some of the age-verification firms want to help Internet companies tailor ads for children. They say these firms are substituting one exaggerated threat—the menace of online sex predators—with a far more pervasive danger from online marketers like junk food and toy companies that will rush to advertise to children if they are told revealing details about the users.

It’s an old story: protecting against the rare and spectacular by making yourself more vulnerable to the common and pedestrian.

Posted on November 21, 2008 at 11:47 AM

Interview on Nuclear Terror

With Brian Michael Jenkins from Rand Corp. I like his distinction between “terrorism” and “terror”:

NJ: Why did you decide to delve so deeply into the psychological underpinnings of nuclear terror?

Jenkins: Well, I couldn’t write about the history of nuclear terrorism, because at least as of yet there hasn’t been any. So that would have been a very short book. Nonetheless, the U.S. government has stated that it is the No. 1 threat to the national security of the United States. In fact, according to public opinion polls, two out of five Americans consider it likely that a terrorist will detonate a nuclear bomb in an American city within the next five years. That struck me as an astonishing level of apprehension.

NJ: To what do you attribute that fear?

Jenkins: I concluded that there is a difference between nuclear terrorism and nuclear terror. Nuclear terrorism is about the possibility that terrorists will acquire and detonate a nuclear weapon. Nuclear terror, on the other hand, concerns our anticipation of such an attack. It’s about our imagination. And while there is no history of nuclear terrorism, there is a rich history of nuclear terror. It’s deeply embedded in our popular culture and in policy-making circles.

This is also good:

NJ: How do you break this chain reaction of fear?

Jenkins: The first thing we have to do is truly understand the threat. Nuclear terrorism is a frightening possibility but it is not inevitable or imminent, and there is no logical progression from truck bombs to nuclear bombs. Some of the steps necessary to a sustainable strategy we’ve already begun. We do need better intelligence-sharing internationally and enhanced homeland security and civil defense, and we need to secure stockpiles of nuclear materials around the world.

Nations that might consider abetting terrorists in acquiring nuclear weapons should also be made aware that we will hold them fully responsible in the event of an attack. We need to finish the job of eliminating Al Qaeda, not only to prevent another attack but also to send the message to others that if you go down this path, we will hunt you down relentlessly and destroy you.

NJ: What should political leaders tell the American people?

Jenkins: Rather than telling Americans constantly to be very afraid, we should stress that even an event of nuclear terrorism will not bring this Republic to its knees. Some will argue that fear is useful in galvanizing people and concentrating their minds on this threat, but fear is not free. It creates its own orthodoxy and demands obedience to it. A frightened population is intolerant. It trumpets a kind of “lapel pin” patriotism rather than the real thing. A frightened population is also prone both to paralysis—we’re doomed!—and to dangerous overreaction.

I believe that fear gets in the way of addressing the issue of nuclear terrorism in a sustained and sensible way. Instead of spreading fear, our leaders should speak to the American traditions of courage, self-reliance, and resiliency. Heaven forbid that an act of nuclear terrorism ever actually occurs, but if it does, we’ll get through it.

Posted on November 11, 2008 at 6:26 AM

ANSI Cyberrisk Calculation Guide

Interesting:

In a nutshell, the guide advocates that organizations calculate cyber security risks and costs by asking questions of every organizational discipline that might be affected: legal, compliance, business operations, IT, external communications, crisis management, and risk management/insurance. The idea is to involve everyone who might be affected by a security breach and collect data on the potential risks and costs.

Once all of the involved parties have weighed in, the guide offers a mathematical formula for calculating financial risk: Essentially, it is a product of the frequency of an event multiplied by its severity, multiplied by the likelihood of its occurrence. If risk can be transferred to other organizations, that part of the risk can be subtracted from the net financial risk.
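Read literally, the calculation described above is simple expected-loss arithmetic. Below is a minimal Python sketch of that formula under hypothetical inputs; the function name and figures are illustrative, not the ANSI guide's own terminology or worksheets.

    # Hypothetical sketch of the expected-loss arithmetic described above;
    # the ANSI guide defines its own terms and worksheets.

    def net_financial_risk(frequency_per_year, severity_dollars,
                           likelihood, transferred_dollars=0.0):
        """Frequency x severity x likelihood, minus any risk transferred
        to another party (for example, through insurance)."""
        gross_risk = frequency_per_year * severity_dollars * likelihood
        return gross_risk - transferred_dollars

    # Example: an incident expected twice a year, costing $250,000 when it
    # occurs, judged 30% likely at that severity, with $50,000 of the
    # exposure transferred to an insurer.
    print(net_financial_risk(2, 250_000, 0.30, transferred_dollars=50_000))
    # 100000.0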

Guide is here.

Posted on October 24, 2008 at 7:04 AM

Does Risk Management Make Sense?

We engage in risk management all the time, but it only makes sense if we do it right.

“Risk management” is just a fancy term for the cost-benefit tradeoff associated with any security decision. It’s what we do when we react to fear, or try to make ourselves feel secure. It’s the fight-or-flight reflex that evolved in primitive fish and remains in all vertebrates. It’s instinctual, intuitive and fundamental to life, and one of the brain’s primary functions.

Some have hypothesized that humans have a “risk thermostat” that tries to maintain some optimal risk level. It explains why we drive our motorcycles faster when we wear a helmet, or are more likely to take up smoking during wartime. It’s our natural risk management in action.

The problem is our brains are intuitively suited to the sorts of risk management decisions endemic to living in small family groups in the East African highlands in 100,000 BC, and not to living in the New York City of 2008. We make systematic risk management mistakes—miscalculating the probability of rare events, reacting more to stories than data, responding to the feeling of security rather than reality, and making decisions based on irrelevant context. And that risk thermostat of ours? It’s not nearly as finely tuned as we might like it to be.

Like a rabbit that responds to an oncoming car with its default predator avoidance behavior—dart left, dart right, dart left, and at the last moment jump—instead of just getting out of the way, our Stone Age intuition doesn’t serve us well in a modern technological society. So when we in the security industry use the term “risk management,” we don’t want you to do it by trusting your gut. We want you to do risk management consciously and intelligently, to analyze the tradeoff and make the best decision.

This means balancing the costs and benefits of any security decision—buying and installing a new technology, implementing a new procedure or forgoing a common precaution. It means allocating a security budget to mitigate different risks by different amounts. It means buying insurance to transfer some risks to others. It’s what businesses do, all the time, about everything. IT security has its own risk management decisions, based on the threats and the technologies.
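As a concrete illustration of that balancing act, here is a minimal Python sketch comparing the expected annual loss a security control prevents against what the control costs; the names and figures are hypothetical, not taken from the essay.

    # Hypothetical cost-benefit sketch: does a security control pay for itself?
    # All figures are illustrative.

    def expected_annual_loss(incidents_per_year, loss_per_incident):
        """Expected yearly loss from a given risk."""
        return incidents_per_year * loss_per_incident

    def control_is_worthwhile(baseline_rate, loss_per_incident,
                              reduced_rate, annual_control_cost):
        """A control is worth buying if the loss it prevents exceeds its cost."""
        prevented = (expected_annual_loss(baseline_rate, loss_per_incident)
                     - expected_annual_loss(reduced_rate, loss_per_incident))
        return prevented > annual_control_cost

    # Example: breaches drop from 0.5/year to 0.1/year, each costing $200,000,
    # for a control that costs $60,000 a year. Prevented loss: $80,000.
    print(control_is_worthwhile(0.5, 200_000, 0.1, 60_000))  # True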

There’s never just one risk, of course, and decisions that look like bad risk management often reflect a tradeoff against a different, less visible risk. Terrorism policy in the U.S. is based more on politics than actual security risk, but the politicians who make these decisions are concerned about the risks of not being re-elected.

Many corporate security decisions are made to mitigate the risk of lawsuits rather than address the risk of any actual security breach. And individuals make risk management decisions that consider not only the risks to the corporation, but the risks to their departments’ budgets, and to their careers.

You can’t completely remove emotion from risk management decisions, but the best way to keep risk management focused on the data is to formalize the methodology. That’s what companies that manage risk for a living—insurance companies, financial trading firms and arbitrageurs—try to do. They try to replace intuition with models, and hunches with mathematics.

The problem in the security world is we often lack the data to do risk management well. Technological risks are complicated and subtle. We don’t know how well our network security will keep the bad guys out, and we don’t know the cost to the company if we don’t keep them out. And the risks change all the time, making the calculations even harder. But this doesn’t mean we shouldn’t try.

You can’t avoid risk management; it’s fundamental to business just as to life. The question is whether you’re going to try to use data or whether you’re going to just react based on emotions, hunches and anecdotes.

This essay appeared as the first half of a point-counterpoint with Marcus Ranum in Information Security magazine.

Posted on October 14, 2008 at 1:25 PM

