Book Review: How Risky Is It, Really?

David Ropeik is a writer and consultant who specializes in risk perception and communication. His book, How Risky Is It, Really?: Why Our Fears Don’t Always Match the Facts, is a solid introduction to the biology, psychology, and sociology of risk. If you’re well-read on the topic already, you won’t find much you didn’t already know. But if this is a new topic for you, or if you want a well-organized guide to the current research on risk perception all in one place, this is pretty close to the perfect book.

Ropeik builds his model of human risk perception from the inside out. Chapter 1 is about fear, our largely subconscious reaction to risk. Chapter 2 discusses bounded rationality, the cognitive shortcuts that allow us to efficiently make risk trade-offs. Chapter 3 discusses some of the common cognitive biases we have that cause us to either overestimate or underestimate risk: trust, control, choice, natural vs. man-made, fairness, etc.—thirteen in all. Finally, Chapter 4 discusses the sociological aspects of risk perception: how our estimation of risk depends on that of the people around us.

The book is primarily about how we humans get risk wrong: how our perception of risk differs from the reality of risk. But Ropeik is careful not to use the word “wrong,” and repeatedly warns us against doing so. Risk perception is not right or wrong, he says; it simply is. I don’t agree with this. There is both a feeling and a reality of risk and security, and when they differ, we make bad security trade-offs. If you think your risk of dying in a terrorist attack, or of your children being kidnapped, is higher than it really is, you’re going to make bad security trade-offs. Yes, security theater has its place, but we should try to make that place as small as we can.

In Chapter 5, Ropeik tries his hand at solutions to this problem: “closing the perception gap” is how he puts it; reducing the difference between the feeling of security and the reality is how I like to explain it. This is his weakest chapter, but it’s also a very hard problem. My writings along this line are similarly weak. Still, his ideas are worth reading and thinking about.

I don’t have any other complaints with the book. Ropeik nicely balances readability with scientific rigor, his examples are interesting and illustrative, and he is comprehensive without being boring. Extensive footnotes allow the reader to explore the actual research behind the generalities. Even though I didn’t learn much from reading it, I enjoyed the ride.

How Risky Is It, Really? is available in hardcover and for the Kindle. Presumably a paperback will come out in a year or so. Ropeik has a blog, although he doesn’t update it much.

Posted on August 2, 2010 at 6:38 AM • 12 Comments


Brandioch Conner August 2, 2010 10:04 AM

I think that the discussion has progressed to the point where different terms are needed for different concepts.

It gets too confusing to use “risk” to refer to real risk and the perception of risk.

The same with “security” to refer to the perception of security and the reality of security.

Any suggestions?

spaceman spiff August 2, 2010 10:22 AM

@ brandioch conner

Well put! What about adding a “p” prefix? I.e., p-risk, p-security?

Rich Wilson August 2, 2010 10:38 AM

I was just thinking about this kind of thing while watching people argue about the danger of putting family profile stickers, with kids’ names, on cars. Because a predator might follow you home and use your kids’ names to lure them. As unlikely as that may be, the cost of NOT putting your kids’ names on your car, or on their backpacks, is also negligible, so people go for it.

But take something with a greater cost, and even if the reward is more tangible, it’s out of the question. Like taking your kid in the car. Or using your kids’ names in public.

The problem is, we spend a lot of overhead analyzing and worrying about the small stuff. And as I’ve already promised myself to stop stressing about it, I’ll just sign off with a link back to that silly discussion.

mcb August 2, 2010 12:03 PM

@ Brandioch Conner

“It gets too confusing to use ‘risk’ to refer to real risk and the perception of risk.

The same with ‘security’ to refer to the perception of security and the reality of security.”

You think that’s bad? Try hanging out with the insurance people until you grok the difference between risk, hazard, peril, and loss…aack thpt!

In the minds of executives and employees (or clients) perception is their reality. Of course funding committed to address (mis)perceptions is diverted from the pool available to mitigate “real” risk, to provide “real” security. Our task is to make sense of challenges and their solutions, then communicate them to decision-makers in such a manner that their perceptions become a useful (actionable and funded) version of ours. Gratefully, failure to manage perceived risk is primarily a political problem, more dangerous to our careers than it is to those whose personnel, assets, and interests we protect.

Ropeik’s book sounds like an interesting read.

dm August 3, 2010 1:01 AM

“If you’re well-read on the topic already”.

Translation: read Beyond Fear, and the relevant blog posts on the psychology of fear by Bruce Schneier?

HJohn August 3, 2010 12:50 PM

I ask questions like “what’s the real risk?” and “what’s the worst that could reasonably happen?” all the time. I’ve been an auditor for 12 years, and one of the most frustrating things in my profession is just how unreasonable some of my counterparts are. Accurate assessment of risk requires knowledge, judgment, perspective, and common sense, among other things.

One of the more depressing aspects of flawed risk assessment is how many in my profession no longer ask “what’s the worst that could happen?” and instead start asking “what if ______ says this or asks that?” The blank could be an external auditor or assessor, management, the press, a customer, anyone. That’s one of the more frustrating aspects of the environment: how many people just want an answer that sounds good to a dumb question, regardless of the value of the actual answer.

Here’s one real-life example. A colleague of mine is a developer, and his team finished implementing a costly project. Not complex, just costly. They opted not to do a post-implementation review, stating (correctly) that it would provide no value. The Chief Audit Executive strong-armed them into doing one anyway, and his justification was “I’m afraid the quality assurance team will ask why one wasn’t done.” To me, that was not a good reason.

In any case, that’s just a little from my personal universe. It happens in most careers, and I think it is too often lost on people how great the opportunity cost is when fear overrides reality.

tab August 4, 2010 7:42 AM

Real risk vs. perceived risk has followed us throughout the ages. When the TSA (US) insisted on 3 oz containers for liquids (and didn’t allow water bottles), I was reminded of when I was travelling extensively for work (’80s and ’90s).
When you went through the checkpoint, the security guards simply requested that you take a drink on the basis that if it was toxic (in some way) you would succumb well before boarding and/or takeoff.
However, in a lot of circles perception is reality. Show someone a graph that shows computer downtime. Any spike will be cause for explanation. Show them a graph that shows computer uptime (using the same numbers as a basis) and they will hardly notice.

HJohn August 4, 2010 9:46 AM

@tab: “Show someone a graph that shows computer downtime. Any spike will be cause for explanation. Show them a graph that shows computer uptime (using the same numbers as a basis) and they will hardly notice”

I think that’s similar to the framing effect. Prove to an auditor or manager that your uptime in a 24/7 environment the past year was 99.95%, and they’ll conclude that the environment is stable. However, tell them that you were down for 4 hours last year (which works out to the same 99.95% uptime) and they’re less likely to conclude you have adequate stability.
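The arithmetic behind that framing is simple enough to sketch. This is a minimal illustrative snippet (not from the original comment) showing that 4 hours of downtime in a year and roughly 99.95% uptime are the same figure, just framed differently:

```python
# Illustrative sketch: the same availability number, framed as
# uptime vs. downtime. Assumes a non-leap year (8760 hours).

HOURS_PER_YEAR = 24 * 365  # 8760

def uptime_percent(downtime_hours: float) -> float:
    """Uptime as a percentage, given total downtime hours in a year."""
    return 100.0 * (HOURS_PER_YEAR - downtime_hours) / HOURS_PER_YEAR

down = 4.0
print(f"{down} hours down = {uptime_percent(down):.2f}% uptime")
# "4 hours down" sounds alarming; "99.95% uptime" sounds stable,
# yet both describe the same year.
```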
