Second SHB Workshop Liveblogging (7)

Session Six—“Terror”—chaired by Stuart Schechter.

Bill Burns, Decision Research (suggested reading: The Diffusion of Fear: Modeling Community Response to a Terrorist Strike), studies social reaction to risk. He discussed his theoretical model of how people react to fear events, and data from the 9/11 attacks, the 7/7 bombings in the UK, and the 2008 financial collapse. Basically, we can’t remain fearful. No matter what happens, fear spikes immediately after the event and subsides within 45 or so days. He believes that the greatest mistake we made after 9/11 was labeling the event as terrorism instead of as an international crime.
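Burns’s observation—that fear spikes immediately and then subsides on a roughly 45-day timescale—can be sketched as a simple exponential-decay curve. This is an illustrative toy, not Burns’s actual model; the baseline, spike, and half-life values are assumptions chosen for the example:

```python
def fear_level(t_days, baseline=1.0, spike=9.0, half_life=45.0):
    """Toy model: fear jumps by `spike` at t=0 and decays back toward
    `baseline` with the given half-life (in days)."""
    return baseline + spike * 0.5 ** (t_days / half_life)

peak = fear_level(0)     # immediately after the event: baseline + spike = 10.0
later = fear_level(45)   # one half-life later, the excess fear has halved: 5.5
```

Whatever the exact functional form, the qualitative point survives: the elevated fear is transient, while policy responses made during the spike tend to persist.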

Chris Cocking, London Metropolitan University (suggested reading: Effects of social identity on responses to emergency mass evacuation), looks at the group behavior of people responding to emergencies. Traditionally, most emergency planning is based on the panic model: people in crowds are prone to irrational behavior and panic. There’s also a social attachment model that predicts that social norms don’t break down in groups. He prefers a self-categorization approach: disasters create a common identity, which results in orderly and altruistic behavior among strangers. The greater the threat, the greater the common identity, and spontaneous resilience can occur. He displayed a photograph of “panic” in New York on 9/11 and showed how it wasn’t panic at all. Panic seems to be more a myth than a reality. This has policy implications during an event: provide people with information, and they are more likely to underreact than overreact. If there is overreaction, it’s because people are acting as individuals rather than as groups, so those in authority should encourage a sense of collective identity. “Crowds can be part of the solution rather than part of the problem.”

Richard John, University of Southern California (suggested reading: Decision Analysis by Proxy for the Rational Terrorist), talked about the process of social amplification of risk (with respect to terrorism). Events result in relatively small losses; it’s the changes in behavior following an event that result in much greater losses. There’s a dynamic of risk perception, and it’s very contextual. He uses vignettes to study how risk perception changes over time, and discussed some of the studies he’s conducting and ideas for future studies.

Mark Stewart, University of Newcastle, Australia (suggested reading: A risk and cost-benefit assessment of United States aviation security measures; Risk and Cost-Benefit Assessment of Counter-Terrorism Protective Measures to Infrastructure), examines infrastructure security and whether the costs exceed the benefits. He talked about cost/benefit trade-off, and how to apply probabilistic terrorism risk assessment; then, he tried to apply this model to the U.S. Federal Air Marshal Service. His result: they’re not worth it. You can quibble with his data, but the real value is a transparent process. During the discussion, I said that it is important to realize that risks can’t be taken in isolation, that anyone making a security trade-off is balancing several risks: terrorism risks, political risks, the personal risks to his career, etc.
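Stewart’s transparent process boils down to comparing a measure’s annual cost against its expected annual benefit: roughly, P(attack per year) × losses averted × the fraction of risk the measure removes. A minimal sketch of that comparison follows; all the numbers are invented placeholders for illustration, not Stewart’s data:

```python
def net_benefit(annual_cost, p_attack, losses_averted, risk_reduction):
    """Expected annual benefit of a countermeasure minus its annual cost.
    benefit = P(attack per year) x losses averted x fraction of risk removed."""
    return p_attack * losses_averted * risk_reduction - annual_cost

# Hypothetical air-marshal-style inputs: a $1B/year program, a 10% annual
# attack probability, $10B in losses per attack, and a 5% risk reduction.
result = net_benefit(annual_cost=1e9, p_attack=0.10,
                     losses_averted=1e10, risk_reduction=0.05)
# Expected benefit is $50M/year against a $1B/year cost: deeply negative.
```

The value of writing it out this way is exactly the transparency Stewart argues for: anyone who disputes the conclusion has to say which input they think is wrong, and by how much.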

John Adams, University College London (suggested reading: Deus e Brasileiro?; Can Science Beat Terrorism?; Bicycle bombs: a further inquiry), applies his risk thermostat model to terrorism. He presented a series of amusing photographs of overreactions to risk, most of them not really about risk aversion but more about liability aversion. He talked about bureaucratic paranoia, as well as bureaucratic incitements to paranoia, and how this is beginning to backfire. People treat risks differently, depending on whether they are voluntary, impersonal, or imposed, and whether people have total control, diminished control, or no control.

Dan Gardner, Ottawa Citizen (suggested reading: The Science of Fear: Why We Fear the Things We Shouldn’t—and Put Ourselves in Greater Danger), talked about how the media covers risks, threats, attacks, etc. He talked about the various ways the media screws up, all of which were familiar to everyone. His thesis is not that the media gets things wrong in order to increase readership/viewership and therefore profits, but that the media gets things wrong because reporters are human. Bad news bias is not a result of the media hyping bad news, but the natural human tendency to remember the bad more than the good. The evening news is centered around stories because people—including reporters—respond to stories, and stories with novelty, emotion, and drama are better stories.

Some of the discussion was about the nature of panic: whether and where it exists, and what it looks like. Someone from the audience questioned whether panic was related to proximity to the event; someone else pointed out that people very close to the 7/7 bombings took pictures and made phone calls—and that there was no evidence of panic. Also, on 9/11 pretty much everyone below where the airplanes struck the World Trade Center got out safely, while everyone above that point couldn’t get out, and died. Angela Sasse pointed out that the previous terrorist attack against the World Trade Center, and the changes made in evacuation procedures afterwards, contributed to the lack of panic on 9/11. Bill Burns said that the purest form of panic is a drowning person. Jean Camp asked whether the recent attacks against women’s health providers should be classified as terrorism, or whether we are better off framing them as crime. There was also talk about sky marshals and their effectiveness. I said that it isn’t sky marshals that are a deterrent, but the idea of sky marshals. Terence Taylor said that increasing uncertainty on the part of the terrorists is, in itself, a security measure. There was also a discussion about how risk-averse terrorists are; they seem to want to believe they have an 80% or 90% chance of success before they will launch an attack.

Next, lunch—and two final sessions this afternoon.

Adam Shostack’s liveblogging is here. Ross Anderson’s liveblogging is in his blog post’s comments. Matt Blaze’s audio is here.

Posted on June 12, 2009 at 12:01 PM • 5 Comments


HJohn June 12, 2009 12:34 PM

@: “I said that it is important to realize that risks can’t be taken in isolation, that anyone making a security trade-off is balancing several risks: terrorism risks, political risks, the personal risks to his career, etc.”

I believe that is a core reason that even intelligent people make poor decisions. It is not so much that they think a specific event will happen; it is that any one of several rare events will cost them their jobs, or, worse, the mere question of “what are you doing about ____” may cost them their jobs. I like a previous example of countermeasures against infant abductions: the security manager, even when correct, is making a CLD if he says “nothing, the risk is too small.”

I do find it remarkable that when something is rare and spectacular, the people in awe are often the same people that expect defenses against it to be commonplace.

(CLD = career limiting decision)

HJohn June 12, 2009 12:46 PM

@: “It is not so much that they think a specific event will happen, it is that any one of several rare events will cost them their jobs, or worse, the mere question of “what are you doing about ____” may cost them their jobs.”

In other words, if I have the following prospects:
A. Accept a rare risk and get fired by a boss or the voters.
B. Waste money and resources on a rare risk and keep my job.

B becomes more attractive, and it really is a rational choice given the options, especially considering B is inevitable (if I don’t do it, they’ll fire me and hire/elect someone who will).
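The A-versus-B choice above is, in effect, an expected-personal-cost calculation from the official’s own perspective. A toy sketch of that comparison, with invented probabilities and costs purely for illustration:

```python
def preferred_option(p_rare_event, cost_if_fired, personal_cost_of_waste):
    """Compare the official's personal expected costs:
    A = accept the rare risk (fired only if the event actually happens),
    B = fund the countermeasure (a certain but survivable personal cost)."""
    cost_a = p_rare_event * cost_if_fired
    cost_b = personal_cost_of_waste
    return "B" if cost_b < cost_a else "A"

# Even a 1% chance of a career-ending event outweighs a small, diffuse
# personal cost of spending the organization's money.
choice = preferred_option(0.01, cost_if_fired=100, personal_cost_of_waste=0.5)
```

Note the asymmetry driving the result: the cost of the waste falls on the organization, while the cost of being fired falls entirely on the decision-maker.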

Bryan Feir June 12, 2009 12:47 PM

The bit about ‘the natural human tendency to remember the bad more than the good’ doesn’t strike me as right; it’s not so much that we remember the bad more than the good, it’s more that the bad tends to happen less often than the good or the neutral, and to be more spectacular when it does. Spectacular rare good stories do happen as well, and they get press, too.

As has been mentioned many times on this blog, people tend to remember big rare events more readily than common events, which messes with our sense of probability. The first time I recall reading about that was in an article by Douglas Hofstadter where he talks about what he called the ‘oddmatch’ phenomenon and its effects on people’s beliefs in psychic powers and the like.

In other words, I agree with Gardner’s basic point that the media shows a bias and that the bias isn’t entirely deliberate, but I’m nitpicking on the wording of why it happens.

Clive Robinson June 15, 2009 5:39 AM

As with Bryan Feir,

“I agree with Gardner’s basic point that the media shows a bias and that the bias isn’t entirely deliberate”

However, I suspect the reason is actually similar to the one that brought about the credit crunch: it’s a form of herd mentality.

In an information-poor, high-value, competitive marketplace, people have to take timely action (write stories, make trades, etc.) because their personal well-being depends on it.

The problem is that too little information is available in the time required to take informed action, whether due to a genuine lack of information or because the information is too complex or diffuse to analyse before the action must be started.

As the old truism (a supposition attributed to Aristotle) has it, “Nature abhors a vacuum.” And in the “information vacuum,” the people involved make assumptions based on their experience or personal beliefs.

Importantly, it does not matter a jot whether the assumptions are valid or not, as an action has to be taken, and in the press and the financial markets, the quicker the better…

Therefore the first people to take “visible action” become trendsetters and can easily set the “position” that an organisation, industry, or nation takes.

This happens simply because the “assumption” becomes “reality” in the absence of information to the contrary, with herd mentality ensuring that others follow whatever lead is apparent.

This is often either because the others figure the trendsetter knows more than they do, and therefore must be right, or because they see a trend starting and jump on early.

Like a snowball rolling down a mountain, the trend can for a while become unstoppable under its own weight as the rest blindly follow.

You see a limited form of this at social events, where nobody gets on the dance floor because they don’t want to stand out.

However, when just one or two people start dancing (provided they are average or worse at dancing), others figure “I might as well join in,” and the dance floor fills relatively quickly thereafter.

We call this “breaking the ice” simply because it “sets the lead” for others to follow.

In the computer industry there used to be another truism: “Nobody got fired for buying IBM.”

This basically means that if somebody is dominant (for whatever reason), they can set the direction of a market, and others will follow for “safety” without applying judgment.

You see a form of this in children: when a group has done something wrong and you ask one child why they did it, they say “because X was doing it.”

Therefore two “herd” instincts, “follow the leader” and “do what the dominant do,” define what tends to happen in any given market and form the basis of “safety in numbers.”

Because of this, once a trend is started it tends to develop a “life of its own,” often well beyond the point where sufficient information is available to make a rational judgment.

Why does this happen? I suspect it is down to the group dynamics of any established organisation of sufficient size to be considered a “herd.”

If you think back to the children doing what X did without judgment, you will usually find that X is more popular for one or more reasons.

Usually, as children develop into adolescents, it is because X is better at “social communication” than the others and is therefore more “popular.” Importantly, this trend carries on into adulthood.

If you look at society in general, people with good “social communication” skills rise up in organisations and are seen as successful irrespective of whether they have any other, non-social talents.

People who are not good at “social communication” are not seen as successful, or are seen as successful only on “technical issues” (the latter are often said to show some level of autism or Asperger’s syndrome).

Oddly, the best of these “successful people” in large organisations actually do nothing except “network” (play politics and socialise) and ensure that they are associated with success and not failure.

The way they often do this is to be “an ever onward dynamic leader”, or, more correctly, to be very visible at the start of a project and move on before reality sets in and hard information shows the fallacy of the assumptions they made early on.

Having moved on, if whatever they started fails it can be blamed on others for not following their good initial lead; if it succeeds, they can claim it was due to the good foundations they put in place. They are aided in this by the fact that most people do not want to advertise failure, preferring to keep it quiet.

However, as these mountain-goat “climbers” approach the top, their tactics have to change (they can only jump from place to place easily in the foothills).

Those that reach the top often have a very different strategy, one often based on “fall guys” and “plausible deniability.”

Essentially, they surround themselves with people who are either significantly inferior or who have personality weaknesses. Basically, those people know or feel that they are only there by the leader’s good graces, and will therefore “fall on their sword” to protect the leader, knowing that if they do it right, a little later they will be “rehabilitated” into another equivalent position.

Those with some ability at “social communication” tend to be “second water,” or second in command, to those that surround and protect the leader. This is often because their “social communication” skills are not sufficient for them to be leaders, but are sufficient to make them a threat to the current leader. Again, they tend not to have other, non-social talents, and often have well-developed antisocial talents; the worst examples are those that ruthlessly “lead by fear.”

Those with less developed “social communication” skills, and more importantly other, better-developed non-social skills, tend to be those that start organisations and build them into worthwhile entities. However, history shows that, unless they are careful, they will be ousted by those with better social communication skills once the organisation is large enough to support the limited skill set of a “climber.”

Unfortunately, those that are good at analysing any and all information tend to have poorer “social communication” skills, and tend to respond to just being “buttered up” with job titles and praise.

This tends to make them easy prey for those with no real talent other than “social communication”.

Even in the academic fields this situation still applies; however, there is an additional game that has to be played.

As those at the top of the organisation often lack anything other than “networking” skills, they find it difficult to judge those who have other, well-developed non-social skills. Thus they use metrics such as the number of “papers published” under “peer review” in academic circles, or the number of patents etc. in other circles.

But what of those with good, well-developed non-social specialised skills and only moderate “social communication” skills?

Well, some of them end up as “technology evangelists,” some as consultants, and some as designers, or, as the rest of society has it, “nerds.”

Hamish June 28, 2009 12:54 PM

Just came across an interesting idea – “panic panic” – the idea that officials worry about panicking the general population far more than is warranted. In trying to avoid the panic, they don’t treat their population with respect, try to falsely reassure them when the threat is serious, etc. The population smells a rat, and you can occasionally end up with a real panic.

I heard the idea on an episode of “All in the mind”, a BBC radio show [1]. The person being interviewed was Peter Sandman [2] and you can read a good article about the idea at [3].

[1] –
[2] –
[3] –
