The Effects of Near Misses on Risk Decision-Making

This is interesting research: “How Near-Miss Events Amplify or Attenuate Risky Decision Making,” by Catherine H. Tinsley, Robin L. Dillon, and Matthew A. Cronin.

In the aftermath of many natural and man-made disasters, people often wonder why those affected were underprepared, especially when the disaster was the result of known or regularly occurring hazards (e.g., hurricanes). We study one contributing factor: prior near-miss experiences. Near misses are events that have some nontrivial expectation of ending in disaster but, by chance, do not. We demonstrate that when near misses are interpreted as disasters that did not occur, people illegitimately underestimate the danger of subsequent hazardous situations and make riskier decisions (e.g., choosing not to engage in mitigation activities for the potential hazard). On the other hand, if near misses can be recognized and interpreted as disasters that almost happened, this will counter the basic “near-miss” effect and encourage more mitigation. We illustrate the robustness of this pattern across populations with varying levels of real expertise with hazards and different hazard contexts (household evacuation for a hurricane, Caribbean cruises during hurricane season, and deep-water oil drilling). We conclude with ideas to help people manage and communicate about risk.

Another paper.

Posted on June 9, 2015 at 8:15 AM

Comments

Outis June 9, 2015 8:26 AM

George Carlin: “Here’s a phrase that apparently the airlines simply made up: near miss. They say that if 2 planes almost collide, it’s a near miss. Bullsh*t, my friend. It’s a near hit! A collision is a near miss. [WHAM! CRUNCH!] ‘Look, they nearly missed!’ ‘Yes, but not quite.’”

Sorry, couldn’t help it …

anymoose June 9, 2015 8:59 AM

Nothing really surprising about this. It’s all about controlling the level of fear with wording. For example, if you’re in network security, which would you say to get more funding: “We were probed 10k times and never breached,” or “We detected 10k probes, and if they haven’t found a weakness already, they will”?

albert June 9, 2015 10:52 AM

Thanks, @Outis. Carlin always managed to succinctly sum up the silliness of modern life.
.
“…people often wonder why those affected were underprepared…”
.
So it’s the affected ones who were ‘unprepared’? These ‘academics’ are starting to get on my nerves. I’d like to drag their sorry asses down to the 9th Ward in NOLA, or Unit 1 at Fukushima.
.
The ‘affected ones’ had no control over preparations. The gov’t/corporate system controls that stuff. They don’t give a rat’s ass about spending money on things like prevention or remediation.
.
They’d rather roll the dice. Reactive instead of proactive. Just like the “War on Terror”.
.

Clive Robinson June 9, 2015 11:43 AM

Is it me or is the underlying effect better known as “familiarity breeds contempt”?

Anura June 9, 2015 11:54 AM

To be pedantic, “near miss” uses “near” in the sense of distance, not as a synonym for “almost” – that is, two aircraft have a near miss when they are near each other but don’t collide, whereas if they were on opposite sides of the planet it would be a far miss, or what is more commonly known as an unnoteworthy event.

David Hawthorne June 9, 2015 12:26 PM

Near misses obviously have the potential to introduce problems into risk-assessment number crunching, though. I’m not sure how the insurance industry handles them, but I would be surprised if they really screw up their own numbers.
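Just to illustrate the counting problem (the numbers below are entirely made up, and this is not how an actuary would actually model it): whether you file a near miss under “nothing happened” or under “almost happened” changes a naive frequency estimate quite a bit.

```python
# Toy illustration with made-up numbers: how the classification of near
# misses shifts a naive frequency estimate of disaster risk.

events = {
    "disasters": 2,     # exposures that ended in actual loss
    "near_misses": 18,  # exposures that could plausibly have ended in loss
    "routine": 180,     # uneventful exposures
}

total = sum(events.values())

# Interpretation A: a near miss is "a disaster that did not occur",
# so it counts as evidence of safety.
p_optimistic = events["disasters"] / total

# Interpretation B: a near miss is "a disaster that almost happened",
# so it counts toward the hazard rate.
p_pessimistic = (events["disasters"] + events["near_misses"]) / total

print(f"Near misses counted as safe outcomes: {p_optimistic:.3f}")
print(f"Near misses counted as hazardous:     {p_pessimistic:.3f}")
```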

For people who are not in a position to be in charge of handling disasters, however, risk assessments are always going to be screwed up. For that, the answer is simply education. Everyone should get basic schooling in the science of estimating risk and threat.

As in so many fields, the technology for training the professionals who are likely to deal with disasters (major, medium, or minor) is available, but it is poorly virtualized. As it stands, they have to fly around the country to specialized centers where they practice drill scenarios dreamed up by their instructors. This is infrequent, and many groups that should be involved are barely included, if at all.

This sort of training could be much better virtualized, not unlike gun training, or general education for that matter: leaving behind the era of artisanal, pre-assembly-line solutions for a more effective, repeatable, and thorough educational system built on “systems that work,” born from a highly competitive market and replicated across countries.

My guess is that this evolution of technology and information science will improve slowly rather than quickly. The emergence of mass-produced VR technology next year may really help bump it up, however.

Marcos El Malo June 9, 2015 1:06 PM

@Clive

‘Is it me or is the underlying effect better known as “familiarity breeds contempt”?’

That might be one mechanism. I was born and raised in “earthquake” country. Cupboard rattling quakes are not uncommon. One can get quite jaded about quakes, and the bigger ones can seem thrilling, like an amusement park ride.

I think there is another mechanism at work in some cases. Let’s say one is in the path of a hurricane and evacuates voluntarily. This will probably be quite an inconvenience and headache, and might involve considerable expense. When the hurricane takes a different path, it all seems like wasted effort, so the next time a hurricane is approaching, one might wait longer to evacuate, or not evacuate at all and take one’s chances.
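To put rough, entirely hypothetical numbers on that trade-off:

```python
# A toy expected-cost comparison of evacuating versus staying put.
# All numbers are hypothetical.

cost_evacuation = 1_500               # travel, lodging, lost wages, hassle
cost_of_staying_through_hit = 50_000  # damage and injury risk if the storm hits

expected_cost_evacuate = cost_evacuation  # paid whether or not the storm hits

# A few near misses can lower the *perceived* chance of a hit, even if the
# true probability hasn't changed -- and staying starts to look cheaper.
for perceived_p_hit in (0.15, 0.05, 0.01):
    expected_cost_stay = perceived_p_hit * cost_of_staying_through_hit
    print(f"perceived chance of a hit {perceived_p_hit:.2f}: "
          f"evacuate ${expected_cost_evacuate:,} vs. "
          f"stay ${expected_cost_stay:,.0f} in expectation")
```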

rgaff June 9, 2015 1:58 PM

@ Anura

So you’re saying that it’s like the phrase “identically alike,” where you use two words that mean the same thing, or close to it, for no obvious reason other than habit. The word “miss” already implies closeness in distance; otherwise it would be so unnoteworthy as to not be mentioned at all. Put another way, “they flew around opposite sides of the planet? dang, they missed each other!” wouldn’t make sense unless you were talking on the scale of giant comets, not relatively tiny airplanes.

@ David Hawthorne

The problem is most people’s experience with risk assessment is from people with vested interest! Either pumping up the risk to get more of something they value, or downplaying risk so as not to spend as much, both to the public’s detriment usually…

Rob June 9, 2015 2:02 PM

After getting stabbed in the cheek…

“2 inches to the right and you would have lost that eye”

“2 inches to the left and he would have missed my face entirely”

All depends on perspective.

Dr. I. Needtob Athe June 9, 2015 4:41 PM

I think it’s a simple matter of “Not again! Last time I went through all that crap for nothing!”

Mike Amling June 9, 2015 4:53 PM

IIRC, after the fact it came out that there had been some near misses before the Challenger shuttle disaster where a rubber O-ring burned through partly but not completely. Rather than interpreting this as “We’re outside the envelope of conditions that we require for safe operation,” the effective reaction (but not everyone’s reaction) was “Since it didn’t burn through completely, the mission was safe.”

moo June 9, 2015 5:36 PM

Slightly off topic, but this reminds me of a book I recently read: “Engineering a Safer World” by Nancy Leveson, from MIT Press, available as a free PDF download (just google it). It’s about applying systems-engineering principles to safety design and analysis. It contains some interesting case studies of accidents that had complex causes which were barely explored by more traditional post-accident analyses. For example, one entire chapter is devoted to a friendly-fire incident after the Gulf War, where two Air Force F-15s shot down a pair of U.S. Black Hawk helicopters, killing everyone aboard–the crews and over a dozen VIP passengers. The incident happened despite many technical and operational safeguards that were supposed to prevent exactly that from happening. But there were literally dozens of different factors that contributed to the eventual outcome, and Leveson’s framework helps explain how the overall system had slowly migrated from its initially safe design to daily operating conditions where an accident of this kind was basically waiting to happen. It also illuminates how numerous communication failures, misunderstandings, and unclear delegation of responsibilities at the higher levels of the command structure contributed to the outcome. Throughout the book, she applies her ideas in analyses of plane crashes, the Bhopal industrial-plant disaster, the Walkerton E. coli water-contamination outbreak, a failed $1.2bn satellite launch, and more.

I found it a very interesting read, and I couldn’t help thinking that some of the ideas for designing safer systems might also be applicable to designing better security into systems. I think you would do the same kind of analyses, but with security-oriented “hazards,” “constraints,” and so on. The definition of a “loss” might include theft of data, denial of service, etc., instead of harm to people, destruction of property, etc.
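As a rough sketch of what I mean (these field names and examples are my own invention, not Leveson’s actual STAMP notation):

```python
# A toy security-flavored hazard/constraint listing, loosely inspired by the
# safety analyses in the book. Field names and examples are illustrative only.

from dataclasses import dataclass, field

@dataclass
class Hazard:
    description: str   # system state that can lead to a loss
    losses: list       # which losses this hazard can produce
    constraints: list = field(default_factory=list)  # what must be enforced

losses = ["theft of customer data", "denial of service"]

hazards = [
    Hazard(
        description="Unauthenticated access path to the customer database",
        losses=["theft of customer data"],
        constraints=["All database access must be authenticated and logged"],
    ),
    Hazard(
        description="Public API accepts unbounded request rates",
        losses=["denial of service"],
        constraints=["Rate limits must be enforced at the API gateway"],
    ),
]

# For each loss, list the hazards that can cause it and the constraints that
# are supposed to prevent it -- the security analogue of a safety review.
for loss in losses:
    print(f"Loss: {loss}")
    for h in (h for h in hazards if loss in h.losses):
        print(f"  hazard: {h.description}")
        for c in h.constraints:
            print(f"    constraint: {c}")
```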

moo June 9, 2015 6:00 PM

@Mike Amling:

Feynman’s analysis of the safety conditions around the Challenger launch:
http://science.ksc.nasa.gov/shuttle/missions/51-l/docs/rogers-commission/Appendix-F.txt

There’s something compelling about that kind of short, clear accident analysis. I was reminded of that report often while reading Leveson’s book, which I mentioned in my last post. In the way they write, both Feynman and Leveson convey a clear picture of how systemic deficiencies contributed to the slow erosion of safety margins, until “normal” operations were much less safe than believed and an accident was probable or even inevitable. Leveson’s book also describes actions NASA took after the Challenger disaster to reform the safety culture there and make sure that channels, policies, and processes were created so that “technical conscience” safety concerns could be raised by anyone in the program without simply being swept under the rug.

David Hawthorne June 9, 2015 11:07 PM

@rgaff

“The problem is most people’s experience with risk assessment is from people with vested interest! Either pumping up the risk to get more of something they value, or downplaying risk so as not to spend as much, both to the public’s detriment usually…”

Lol, oh holy hell, yes. Very aware of that. You can’t read the news or browse the internet without running across fear talking back at you, posing as everyday people. 😉 People have succumbed to it; they trade their inner peace for it, and I can’t imagine that cost is any kind of good buy.

It is a virus, is what it is.

David Hawthorne June 9, 2015 11:10 PM

@Rob

Ouch, that is a severe experience to go through. I can’t help but think “thank God they did not know where any arteries were”.

Jenny Juno June 9, 2015 11:23 PM

Does this same phenomenon explain why people text and drive? Do all the times that such inattention to the road put them in a dangerous situation, which they then escaped by sheer luck, teach them that they will be lucky next time too?

Anura June 9, 2015 11:30 PM

@Jenny Juno

I knew someone who had an accident when texting while driving drunk, and they didn’t learn their lesson. Although maybe because no one was injured and they didn’t get arrested, they figured they were still pretty lucky.

Winter June 10, 2015 12:54 AM

“Slightly off topic, but this reminds me of a book I recently read. “Engineering a Safer World” by Nancy Leveson, from MIT press, available as a free pdf download (just google it).”

Google Scholar shows a lot of other (shorter) papers of hers:

A new accident model for engineering safer systems
https://esd.mit.edu/WPS/internal-symposium/esd-wp-2003-01.19.pdf

It also includes a paper from 1986 about creating safe software.
http://dl.acm.org/citation.cfm?id=7528

See the progress we have made 😉

Winter June 10, 2015 12:59 AM

“I knew someone who had an accident when texting while driving drunk, and they didn’t learn their lesson.”

It is well known that you do not learn much while drunk. However, you learn very fast when using coke. Sadly, you learn the wrong lesson from coke.

Floridian June 10, 2015 9:42 AM

An interesting effect of Hurricane Katrina was the Houston populace’s reaction to Hurricane Rita.

http://www.chron.com/news/houston-texas/houston/article/8-years-ago-seemingly-all-of-Houston-evacuated-4839142.php

For hundreds of thousands of people, the decision to evacuate actually put them at greater risk. Some people even died mid-evacuation.

I daresay Houstonians will not respond to the next hurricane in the same way–and that is not necessarily a bad thing. The key will be making sure that the people who DO need to evacuate (e.g., barrier island residents) actually do so, rather than ride out the storm for fear of getting caught up in another Rita evacuation nightmare.

albert June 10, 2015 11:43 AM

It’s not engineering, it’s money. Industry ignores safety standards, punishes whistleblowers, and gradually erodes government oversight.

“Engineering A Smarter, Less Greedy World” would be worth a read. Reducing the loss of life and limb to pages of statistics is fine for the insurance industry, but doesn’t improve the situation. As long as they can put a price on human life, the old ways will continue.
.
In general, I think systems designs are fairly solid, but the spectre of cost and corner cutting is always there. Lack of maintenance of safety systems is the most important issue we have right now. Deteriorating public and private infrastructure is the 900lb sasquatch in the room. We keep kicking the can down the road, and it will wind up at the feet of the Pied Piper, and what will we pay him with?
.
There are no engineering solutions to these problems. They are political. A good start would be to establish criminal liability for executives who can be shown to be culpable in an engineering disaster. There needs to be a gov’t agency to force compliance with, say, NTSB findings; one that’s not in bed with the industry it’s responsible for.
.

vas pup June 13, 2015 10:39 AM

@David:”Everyone should get basic schooling in the science of estimating risk and threat.”
Are you talking about estimating objective risk (based on science, math, and statistics) or subjective risk (based on knee-jerk reactions, group/crowd reactions to danger, and influence from mass media and big business out to make profits, not to save you)?
I guess schooling in the latter is even more important than in the former. Generally: think with your own head, and primarily with the frontal part of your brain rather than the limbic system, when evaluating risks.

hwk July 9, 2015 6:17 AM

Very interesting. I heard about this approach (near misses) in (I think it was) a Dutch hospital a few years ago. They introduced an anonymous reporting system for failures that almost happened. With this data, the hospital was able to take a closer look at the critical processes and most likely failures. And of course they were able to prevent future failures through awareness and process optimization.
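A minimal sketch of how such anonymous reports might be aggregated to surface the critical processes (the schema and example entries here are hypothetical):

```python
# Aggregate anonymous near-miss reports by process so the most frequent
# sources of close calls can be reviewed first. Schema and data are made up.

from collections import Counter

# Each report carries no reporter identity -- only the process and a note.
reports = [
    {"process": "medication dispensing", "note": "similar drug names confused, caught at bedside"},
    {"process": "patient handoff", "note": "allergy not mentioned at shift change"},
    {"process": "medication dispensing", "note": "wrong dose drawn up, noticed before administration"},
    {"process": "lab results", "note": "critical value reported late, patient unaffected"},
]

counts = Counter(r["process"] for r in reports)

for process, n in counts.most_common():
    print(f"{process}: {n} near-miss report(s)")
```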
