Five "Neglects" in Risk Management

Good list, summarized here:

1. Probability neglect – people sometimes don't consider the probability that an outcome will occur, focusing only on its consequences.

2. Consequence neglect – the mirror image of probability neglect: individuals sometimes neglect the magnitude of outcomes.

3. Statistical neglect – instead of subjectively assessing small probabilities and continuously updating them, people fall back on rules of thumb (if they use any heuristics at all), which can introduce systematic biases into their decisions.

4. Solution neglect – choosing an optimal solution is not possible when one fails to consider all of the solutions.

5. External risk neglect – in making decisions, individuals or groups often consider the costs and benefits of decisions only for themselves, without including externalities, sometimes leading to significant negative outcomes for others.
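As a toy illustration of the first, second, and fifth neglects (a hedged sketch; every number below is invented): expected loss is probability times consequence, and a socially complete accounting also adds the losses that fall on others.

```python
# Toy illustration with invented numbers: expected loss is
# probability x consequence; a full (social) accounting also counts
# losses that fall on third parties (externalities).

options = {
    # name: (probability of bad outcome, loss to us, loss to others)
    "scary-but-rare":    (1e-6, 50_000_000, 0),
    "boring-but-common": (1e-2, 100_000, 20_000),
}

for name, (p, own_loss, external_loss) in options.items():
    private = p * own_loss                   # probability x consequence
    social = p * (own_loss + external_loss)  # ...plus externalities
    print(f"{name}: private {private:,.0f}, social {social:,.0f}")
```

Probability neglect fixates on the $50 million figure; consequence neglect fixates on the 1-in-100 likelihood; external risk neglect drops the third column. The expected-loss arithmetic keeps all three in view.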

Posted on August 22, 2012 at 12:34 PM

Comments

Seth August 22, 2012 1:46 PM

It's quite possible to choose an optimal solution when one fails to consider all the solutions; it is only necessary that the optimal solution be in the set of considered solutions (and that the best of that set be chosen). You just can't know that you chose the optimal solution (there might be a better one you didn't consider), but that's equally true when you happen to have considered all the solutions and just can't prove there aren't others.

George William Herbert August 22, 2012 3:10 PM

Ah, very good. I have done very similar steps in my IT reliability work, but this is much more methodical.

I’ve found that IT folks typically either neglect probability or consequence (though usually not both), almost always neglect statistics, are very experientially focused on narrow solution sets, and are only so-so on externalities.

We have a long ways to go.

jim August 22, 2012 3:47 PM

? Number five is not an error at all; that's just rationality. Unless we are talking about the sort of neglect that actually does come back to bite ya…

Doug August 22, 2012 4:15 PM

I think this post answers your question about why people think the TSA is doing an OK job.

Clive Robinson August 22, 2012 5:06 PM

Well,

    Solution neglect – choosing an optimal solution is not possible when one fails to consider all of the solutions

I would say that is a normal state of experience for anyone who is not omnipotent.

That is, it's a normal state due to “imperfect knowledge”: by the time you find an optimal solution, its time has already passed, and the future holds an as-yet-unknown optimal solution.

Daniel August 22, 2012 5:25 PM

While it's easy to say not to rely on statistical rules of thumb to avoid 'Statistical Neglect', this isn't so easy in fields where there isn't a good-sized body of statistical information to work from.

In Information Security, for example:
What is the likelihood of
– a website having an XSS vulnerability and being exploited?
— when the website is only available on the intranet?

Or a more complex example:
– a PC being infected by a virus?
— when the PC is 6 months behind on patches?
— when antivirus is not operating correctly?
— when the PC is connected to the internet?

This is something that IT risk people have been struggling with for years, and in my experience most use rules of thumb or make 'educated' guesses.
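One partial answer, offered only as a sketch: treat the 'educated guess' as an explicit prior and update it as incidents are (or are not) observed. A minimal Beta-Binomial example, with an invented prior and invented observations:

```python
# Minimal sketch: start from a subjective probability estimate and
# update it as evidence arrives (Beta-Binomial model). The prior
# parameters and the incident history below are invented.

alpha, beta = 1.0, 9.0          # subjective prior: ~10% chance per month

observations = [0, 0, 1, 0, 0]  # months with (1) or without (0) an incident
for incident in observations:
    alpha += incident
    beta += 1 - incident

posterior_mean = alpha / (alpha + beta)
print(f"updated estimate: {posterior_mean:.3f}")  # ~0.133 after one incident
```

This doesn't create data out of nothing; it just makes the guess explicit and forces it to move when reality disagrees with it.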

Wael August 22, 2012 5:27 PM

@ Clive Robinson

I would say that is a normal state of experience for anyone who is not omnipotent.

Did you mean Omniscient? That relates to “knowledge”; Omnipotence relates to “Control”.

Wael August 22, 2012 5:43 PM

Regarding #4 — Solution neglect:

In academia, looking at all possible solutions and finding the optimal choice is acceptable. In industry, one cannot afford that approach; one relies on rules of thumb, principles, and “experience”, which is loosely defined. It is worth mentioning, in my opinion, that those rules of thumb and principles probably originated in the academic environment in the first place. This “industry” approach (ignoring some solutions) serves as a good first model for thinking. If it needs to be refined, then we can adopt some “academic style” thinking – or leverage it.

Take a good commercial chess program, for example: it does not evaluate all possible “solutions”. It may do so for a few moves, but as the possibilities increase and “exponential explosion” kicks in, the program (algorithm) at some point has to stop considering some lines, because it cannot examine every possibility. This is pruning.
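Alpha-beta pruning is the textbook form of what Wael describes. A minimal sketch over a toy game tree (the tree shape and leaf values are invented):

```python
# Minimal alpha-beta pruning sketch over a toy game tree.
# Inner nodes are lists of children; leaves are numeric evaluations.
# Lines that cannot affect the final choice are never examined.

def alphabeta(node, alpha=float("-inf"), beta=float("inf"), maximizing=True):
    if not isinstance(node, list):         # leaf: static evaluation
        return node
    best = float("-inf") if maximizing else float("inf")
    for child in node:
        value = alphabeta(child, alpha, beta, not maximizing)
        if maximizing:
            best = max(best, value)
            alpha = max(alpha, best)
        else:
            best = min(best, value)
            beta = min(beta, best)
        if beta <= alpha:                  # prune: this line can't matter
            break
    return best

tree = [[3, 5], [6, [9, 8]], [1, 2]]       # hypothetical position tree
print(alphabeta(tree))                      # 6; the 8 and the 2 are never read
```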

Clive Robinson August 23, 2012 1:03 AM

@ Wael,

Did you mean Omniscient? That relates to “knowledge”; Omnipotence relates to “Control”.

Yes, you've caught me being half asleep (it was getting late in Europe and I've not slept well since coming out of hospital) and using the predictive text / auto-complete spell checker; I hit the wrong word.

Mind you, thinking about it in dawn's early light (yet another night with disturbed sleep 🙁), for an individual to have god-like “control” over something they have to have god-like “knowledge” of what they are controlling. So omnipotent implies omniscient 😉

anony August 23, 2012 3:19 AM

If you’re interested in cognitive errors, heuristics, and fallacies (such as the 5 listed in the post), a good review of the past 50 years of development in behavioural cognitive psychology and behavioural economics can be found in Daniel Kahneman’s excellent “Thinking, Fast and Slow.”

Christopher August 23, 2012 10:44 AM

Has Bruce commented on Thinking, Fast and Slow? Some (most?) of it clearly overlaps his areas of interest. I've read it, and his take would be interesting.

Roger August 23, 2012 4:29 PM

Unlike the summary page on the blog, the original PDF makes it clearer that they are talking about really large-scale projects such as national disaster preparedness. As other commenters have already observed, this “fully rational” approach is simply not practical for many problems. By the time you had finished collecting data, the event would already have happened!

Having said that, this can lead directly to a pernicious new form of mismanagement that I have noticed a lot lately. Someone does a “Risk Management” study, using a fairly standardised formal method that many readers will have seen. The author of the study is in a hurry and the data isn’t available, so he chooses the “likelihood” and “consequence” values for each option based on “gut feel”. This is supposedly OK because each column covers a range of an order of magnitude, so a rough guess should do.

The result is to provide a veneer of formality and rationality to a process that is actually just one guy’s “gut instinct.” And should you object to the chosen solution, well, they have a “Risk Management Analysis” to back it up!

Worse, this common practice seems to have created widespread acceptance for just guessing likelihoods. Since “guessing” is forensically indistinguishable from “fabricating”, it is very easy for the Risk Analyst to work backward from a pre-selected solution by choosing values that put his solution in the green zone and the others in the amber zone!

At that point, Risk Management becomes a very elaborate mask for bias, nepotism, or outright corruption.

At fault is the idea of grouping likelihood bands into orders of magnitude. The thinking is that we may not know the true probability, but surely we can guess it within that large margin of error? In fact, we cannot; psychological risk studies show that for rare but catastrophic events the typical estimation error is multiple orders of magnitude. (And it’s nearly always an underestimate.)
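Roger's quantitative point can be sketched with invented numbers: if likelihood columns each span an order of magnitude, an estimate that is off by two orders of magnitude (well within the error he cites) moves an option across several columns and into a different zone.

```python
# Sketch of Roger's point, with invented bands and thresholds:
# a likelihood guess that is two orders of magnitude off shifts a
# risk across several matrix bands and changes its zone entirely.

import math

def likelihood_band(p):
    """Map an annual probability onto order-of-magnitude bands 1..5."""
    band = 6 + math.floor(math.log10(p))    # 1e-1 -> 5, 1e-5 -> 1
    return max(1, min(5, band))

consequence_band = 4                        # "major", taken as given

for p in (1e-4, 1e-2):                      # guessed vs. actual likelihood
    score = likelihood_band(p) * consequence_band
    zone = "green" if score <= 8 else "amber" if score <= 14 else "red"
    print(f"p={p:g}: band {likelihood_band(p)}, score {score}, zone {zone}")
```

So the coarse banding does not absorb the estimation error; it just hides it.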

Roger August 23, 2012 4:50 PM

I said:
“And it’s nearly always an underestimate.”

Sorry, that should have been “And it's nearly always an underestimate for negative outcomes.” It's usually an overestimate for positive ones.

If anyone would like to see a few cites to back this up, Google “optimism bias.”

Jon August 24, 2012 7:05 PM

One beautiful example in the PDF is how the cleanup cost for a specific mine, in this case $175 million, likely exceeds the total value of the minerals extracted.

However, the value of the minerals extracted went into private pockets 120 years ago. The cost of cleanup is borne by us all today.

Talk about an externality, both in person and in time.

J.

Danny Moules August 28, 2012 4:31 AM

@Jim So glad I don’t work with you. I don’t work with you, right? There’s a book you should read: http://www.schneier.com/book-lo.html

@Wael “Tries to assert things without any supporting evidence… then abuses the word leverage. All credibility is thus lost.”

Wael August 28, 2012 12:18 PM

@ Danny Moules

Tries to assert things without any supporting evidence… then abuses the word leverage. All credibility is thus lost

What are you talking about?
All credibility is lost? Hmm! Hard to lose something I don’t have 😉

Bruce Wilder September 8, 2012 4:18 PM

    4. Solution neglect – choosing an optimal solution is not possible when one fails to consider all of the solutions.

I think I would formulate 4 quite differently. This phrasing makes it sound like one has to exhaustively search out possible solutions, and I don’t think that’s the essence of this problem.

“Solution” and “optimal” both seem to suggest, to many people, utopian or idealized outcomes, where “the problem” ceases to exist or benefits are achieved without ongoing costs.

To an economist, “optimal” suggests a balance, where marginal cost is exactly equal to marginal benefit, and no incremental improvement is possible. That’s a better conception than utopian thinking, because it inherently acknowledges that absolute risk mitigation is not feasible or desirable, but its static frame is also misleading. Simple cost-benefit analysis is not that helpful.
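The static notion can be made concrete with a toy sketch (both curves below are invented): the economist's “optimal” mitigation level is where marginal cost crosses marginal benefit.

```python
# Toy sketch with invented curves: the static economic optimum is the
# mitigation level where marginal cost equals marginal benefit.

def marginal_cost(x):      # rising: each extra unit of mitigation costs more
    return 2 * x

def marginal_benefit(x):   # falling: diminishing risk reduction per unit
    return 10 - x

# Scan candidate mitigation levels; keep the one where the curves cross.
best = min((x * 0.1 for x in range(101)),
           key=lambda x: abs(marginal_cost(x) - marginal_benefit(x)))
print(f"optimal mitigation level: {best:.1f}")  # 2x = 10 - x  =>  x = 10/3
```

As the next paragraph argues, though, the curves themselves keep moving.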

Risk management is a control problem and, as such, presents a dynamic, not a static, field of opportunities and paths. The absolute best, ideal-type “solution”, conceived of as a destination, might not be a practical guide when you have to choose among locally accessible paths. You still want to consider in what direction a choosable path seems to lead, but you are choosing a path more than a destination. You can only choose paths in your general vicinity (not from an ideal universal set), and you won't be choosing a final solution; you are choosing where you will choose solutions from, next year or a decade from now.

So, I would consider revising the phrasing to something more like “investment neglect”. The risk manager may be managing operations within the existing apparatus, to keep risk on the feasible frontier, but should be considering commitments or investments that will allow moving the frontier dynamically forward.

Pretty abstract phrasing, I know; I hope it makes some sense.

Rob Elamb July 2, 2014 9:41 PM

I would add another one:
Risk Executive Neglect – some organizations don't have a person or office in charge of managing risk. It's a big problem, because then the company has no direction on what level of risk it can accept. They just wing it.
