Risk Intuition

People have a natural intuition about risk, and in many ways it’s very good. It fails at times due to a variety of cognitive biases, but for normal risks that people regularly encounter, it works surprisingly well: often better than we give it credit for.

This struck me as I listened to yet another conference presenter complaining about security awareness training. He was talking about the difficulty of getting employees at his company to actually follow his security policies: encrypting data on memory sticks, not sharing passwords, not logging in from untrusted wireless networks. “We have to make people understand the risks,” he said.

It seems to me that his co-workers understand the risks better than he does. They know what the real risks are at work, and that they all revolve around not getting the job done. Those risks are real and tangible, and employees feel them all the time. The risks of not following security procedures are much less real. Maybe the employee will get caught, but probably not. And even if he does get caught, the penalties aren’t serious.

Given this accurate risk analysis, any rational employee will regularly circumvent security to get his or her job done. That’s what the company rewards, and that’s what the company actually wants.
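The trade-off can be written as a one-line expected-value comparison. Here is a minimal sketch in Python; every number in it is invented purely for illustration, but it shows why a small detection probability times a mild penalty loses to the certain penalty for missed work.

    # Hypothetical numbers, purely illustrative: an employee weighing the
    # certain cost of a missed deadline against the expected cost of a
    # security violation that is rarely caught and lightly punished.
    cost_missed_deadline = 1.0   # certain: the boss notices immediately
    p_caught = 0.01              # chance a policy violation is detected
    penalty_if_caught = 5.0      # a reprimand, scaled against the deadline

    expected_cost_of_violating = p_caught * penalty_if_caught  # 0.05
    expected_cost_of_complying = cost_missed_deadline          # 1.0

    if expected_cost_of_violating < expected_cost_of_complying:
        print("Rational choice: circumvent security and ship the work")

Raise either the detection probability or the penalty enough, as suggested below, and the inequality flips.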

“Fire someone who breaks security procedure, quickly and publicly,” I suggested to the presenter. “That’ll increase security awareness faster than any of your posters or lectures or newsletters.” If the risks are real, people will get it.

You see the same sort of risk intuition on motorways. People are less careful about posted speed limits than they are about the actual speeds police issue tickets for. It’s also true on the streets: people respond to real crime rates, not public officials proclaiming that a neighbourhood is safe.

The warning stickers on ladders might make you think the things are considerably riskier than they are, but people have a good intuition about ladders and ignore most of the warnings. (This isn’t to say that some people don’t do stupid things around ladders, but for the most part they’re safe. The warnings are more about the risk of lawsuits to ladder manufacturers than risks to people who climb ladders.)

As a species, we are naturally tuned in to the risks inherent in our environment. Throughout our evolution, our survival depended on making reasonably accurate risk management decisions intuitively, and we’re so good at it, we don’t even realise we’re doing it.

Parents know this. Children have surprisingly perceptive risk intuition. They know when parents are serious about a threat and when their threats are empty. And they respond to the real risks of parental punishment, not the inflated risks based on parental rhetoric. Again, awareness training lectures don’t work; there have to be real consequences.

It gets even weirder. The University College London professor John Adams popularised the metaphor of a mental risk thermostat. We tend to seek some natural level of risk, and if something becomes less risky, we tend to make it more risky. Motorcycle riders who wear helmets drive faster than riders who don’t.

Our risk thermostats aren’t perfect (that newly helmeted motorcycle rider will still decrease his overall risk) and will tend to remain within the same domain (he might drive faster, but he won’t increase his risk by taking up smoking), but in general, people demonstrate an innate and finely tuned ability to understand and respond to risks.
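Adams's thermostat is easy to caricature in code. The following is a toy model with invented numbers, not anything from Adams's work: perceived risk is assumed proportional to speed, the helmet is assumed to scale that perception down, and each period the rider nudges his speed toward his personal set point of felt risk.

    # Toy risk-thermostat: the rider adjusts speed until perceived risk
    # matches a personal set point. All parameters are invented.
    set_point = 0.5        # level of risk the rider is comfortable with
    helmet_factor = 0.6    # helmet scales down perceived risk per unit speed
    speed = 60.0

    def perceived_risk(speed, factor):
        return factor * speed / 100.0

    for _ in range(20):
        gap = set_point - perceived_risk(speed, helmet_factor)
        speed += 50.0 * gap   # feels too safe -> speed up; too risky -> slow down

    print(f"equilibrium speed: {speed:.0f}")
    # With helmet_factor = 1.0 (no helmet) the loop settles at 50;
    # with 0.6 it settles near 83: the same felt risk, a higher actual speed.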

Of course, our risk intuition fails spectacularly and often with regard to rare risks, unknown risks, voluntary risks, and so on. But when it comes to the common risks we face every day—the kinds of risks our evolutionary survival depended on—we’re pretty good.

So whenever you see someone in a situation and think they don’t understand the risks, stop first and make sure you understand the risks yourself. You might be surprised.

This essay previously appeared in The Guardian.

EDITED TO ADD (8/12): Commentary on risk thermostat.

Posted on August 6, 2009 at 5:08 AM

Comments

BF Skinner August 6, 2009 5:22 AM

In mixed gov’t/contractor operations I’ve consulted at, I made/make the recommendation routinely that if a sysadmin/op/whatever clears an audit log without written authorization, they should be immediately terminated.

I’ve come to expect no great agreement from contractors but surprisingly the gov’t types aren’t that up for it either.

The soft option is ridicule, which can be effective in making people remember the story and the putz involved.

The only terminations we ever had for “security” reasons were users surfing pr0n. And HR is reluctant to publicize the reasons for dismissal.

I liked the British navy (Nelson’s Navy) way of doing things. Call the crew to quarters, read out the charges and the punishment, then the lash. Brutal, but everyone sees the effect.

Nowadays what happens at Captain’s Mast is a confidential matter. Although the Navy and Coast Guard do post good order and discipline reports, they are too anonymous. There’s no way for someone to say “that could be me” and change their ways.

first?

BF Skinner August 6, 2009 5:24 AM

Yes! (air punch)

And either your server is in the Rockies or the clock is off by 2 hours.

cakmpls August 6, 2009 6:09 AM

“It’s also true on the streets: people respond to real crime rates, not public officials proclaiming that a neighbourhood is safe.”

That might be so in an individual, defined area, but people’s overall response to real crime rates isn’t so accurate. Studies have shown, have they not, that while crime rates in the U.S. were falling, people’s fear of crime and estimates of the incidence of crime were increasing?

Bob August 6, 2009 6:19 AM

“Studies have shown, have they not, that while crime rates in the U.S. were falling, people’s fear of crime and estimates of the incidence of crime were increasing?” Yes, they have. The “crime rates” are those gathered, classified and published by those responsible for reducing crime. They are not epistles from the heavens, and like public school publicity, often differ from observation. The old saw applies. “Are you going to believe me or your lying eyes?”

Bruce Schneier August 6, 2009 6:23 AM

“That might be so in an individual, defined area, but people’s overall response to real crime rates isn’t so accurate. Studies have shown, have they not, that while crime rates in the U.S. were falling, people’s fear of crime and estimates of the incidence of crime were increasing?”

Exactly. Those studies are for crime rates in the country or some such. People get their intuition about their local crime rate from experience; they get their intuition about regional or national crime rates from television.

bob August 6, 2009 6:24 AM

Many years ago I stopped at a traffic light next to a small car, like a Honda Prelude, with the window rolled down and a forward-facing child seat in the passenger seat with a ~4yo boy in it. There were ~8 golf-ball-sized rocks lying on the dashboard directly in front of the child’s face. I felt obligated (since over the decades it has become obvious that I paid a lot more attention in science [math, civics, history…] class than the population at large) to point out to the driver that if she was rear-ended, those stones would stay in place while the child’s face accelerated into them, and it would hurt a lot, possibly causing disfigurement, and suggested she should probably move them to the floor. She gave me a smartass remark which essentially said “mind your own business @#$!” and left them there (which is about what I expected, but I had done my duty).

I stipulate the risk of it occurring was low, but it was not zero (I’ve been rear-ended 3 times badly enough to require body work; once it bent the back of the driver’s seat, and rocks on the dash in that event would have been memorable), and the cost/benefit ratio of leaving rocks in the child’s path was being seriously miscalculated.

@BF Skinner: Rewards as well. I was given a performance bonus a couple of years ago. They made confidentiality a requirement of it. How is it supposed to motivate others to take initiative and excel if no one knows about it?

Trichinosis USA August 6, 2009 6:34 AM

What if the security policy is stupid?

As in, the individuals who run the ecommerce front ends (the most important, at-risk systems in the company) are not allowed to put security patches on the desktops that they use because the desktops are under the responsibility of another department which consistently refuses to patch them?

Yes, I am taking this from a real-life situation.

Robert August 6, 2009 6:51 AM

For some time now the “No Parking in street” ordinance in my housing development has been unenforced. Recently new signs were erected and notices passed out that enforcement would begin. People ignored this and continued as before because they’d heard the same line over and over and nothing ever happened. The tow truck showed up the other night and started hooking up the first car it came to, and like a flash there wasn’t a car in sight parked on the street and hasn’t been since. We’ll see if it takes more than one example before the lesson sinks in.

dude August 6, 2009 7:30 AM

What politician ever declared an area safe? Where’s the fear vote in that? 25% of the electorate vote their fears all the time; the rest vote their fears some of the time. You can’t get many votes telling the truth; you have to make the world seem like it’s going to hell. Michele Bachmann sees this.

Alex August 6, 2009 7:33 AM

@Trichy:

In an organization, policy tends to be an attempt to codify the “risk thermostat”. So if it’s “stupid”, it’s a bad attempt. Lots of exceptions are indicative that the thermostat is off. Lots of incidents are an indicator in the opposite direction.

@TheBruce

In a real sense, we (the risk geeks in the industry) feel that the risk decision is very Bayesian. We’re intuitively using “priors” to create a “distribution” when we make every decision that involves negative impact. To speak to your example, the child is simply operating with a much smaller amount of prior information. On the other hand, ladder warnings aren’t ignored; rather, inference is made about impact based on the relative frequency with which our minds have digested actual “loss events”.
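A minimal sketch of the Bayesian picture Alex describes, with invented counts: an adult’s long history of observed “loss events” makes the estimate stable, while a child’s near-empty prior swings on every new observation.

    # Beta-Binomial sketch: a prior over "how often does this go wrong",
    # updated by observation. All counts are invented for illustration.
    from fractions import Fraction

    adult_prior = (2, 998)   # 2 mishaps seen in ~1000 ladder uses
    child_prior = (1, 1)     # uniform: no experience to draw on

    def posterior_mean(prior, mishaps, uses):
        a, b = prior
        return Fraction(a + mishaps, a + b + uses)

    # Both then watch the same 10 uneventful ladder climbs:
    print(float(posterior_mean(adult_prior, 0, 10)))  # ~0.002, barely moves
    print(float(posterior_mean(child_prior, 0, 10)))  # ~0.083, swings a lot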

Much of this sort of thought has already been covered by a few folks in our industry who really think about it regularly. Their journey is pretty well documented in the blogosphere.

David August 6, 2009 8:11 AM

The child in the back seat is also at risk. There’s less possible interaction between child and parent. If the child needs help, for some reason, the child is less likely to get it from the back seat. Probably more important, a child in the back is almost certainly much more likely to be forgotten than one in the front seat, and that’s killed children.

I don’t know what the comparative risks are, and I don’t know the long-term effects of less parent-child interaction, except that they’re likely to be negative. I don’t know how many children would start out as safer drivers if they were used to sitting in front and better able to watch the road, the driver, and the controls.

In short, I consider the whole idea of keeping the kids in the back seat to be incompletely thought out, and I dislike the idea of basing safety measures on the assumption that they belong in the back.

Uthor August 6, 2009 8:30 AM

I’ve wondered about that motorcycle statistic. Does it look at motorcycle riders that have gone from not wearing a helmet to putting one on, or does it look at a sample of riders?

Motorcycle stereotypes are rampant. Generally, a sport bike rider will have a helmet (though often strapped to the bike instead of the head) and will ride fast while a cruiser rider will go without a helmet and ride slow. Not that this is always true (I ride a cruiser with a helmet and go faster than I should, so I guess I fit THAT profile at least).

Mark J. August 6, 2009 8:46 AM

As a former helmet-wearing motorcyclist and current roadster owner, I believe the increased speed seen by new helmet wearers is due to the helmet-induced lack of wind noise at the higher speeds. Driving my roadster at 80MPH with the top down “feels” a whole lot faster than the same speed with the top up. Same deal with helmets. That wind rushing past your ears at 80MPH sends a big risk signal to your brain. Less wind noise has the opposite effect.

casey August 6, 2009 8:56 AM

There was a sniper in the DC area and I saw a woman in tears running to a Target store holding a child by each hand. Clearly, she thought that the sniper might be around, but also thought she needed whatever Target was selling. Her risk intuition was not working and also not unique. Soccer games were cancelled, but practice was not. New risk is unsettling.

The ‘normal’ state is when you have made and accepted your risk decisions. This happens over a long period and does not disturb the person very much. When a new risk is introduced (real or not), the process is rapid and uncomfortable.

paul August 6, 2009 9:50 AM

Is it possible that we need a certain level of (perceived or otherwise) risk to keep our internal calculations from hunting? There are a lot of analog error-correction circuits that require a certain minimum error level to function properly; if you try to hit exactly zero and stay there you get DC drift and other weirdness that eventually makes the system more unstable than it otherwise would be. (A little analogous, perhaps, to all the cognitive situations where a continuous low-level stimulus vanishes from perception.)

When you’re in the center of a big safe zone, you have no feedback to give you an idea of how close to the edge of your operating envelope you may be. So it may be cognitively more comfortable to move until you start getting feedback.

Alex August 6, 2009 10:00 AM

@Paul:

“Is it possible that we need a certain level of (perceived or otherwise) risk to keep our internal calculations from hunting?”

That’s what I think. If Bayes is an adequate representation of our decision making, then our minds can’t really operate with a huge amount of uncertainty or non-informative prior information. If/when that happens, we have no output, or output the subconscious acknowledges as really bad. That’s unsettling.

RvnPhnx August 6, 2009 11:20 AM

@Mark J.
Makes sense to me. 20MPH feels a lot faster on a regular bicycle when wearing a headset (single side, thank you, so I can still hear cars) because it whistles.

@Uthor
I’ve found from experience that people often abuse equipment designed to increase the safety of whatever they are doing. For instance, the most dangerous people on the slopes are the recreational skiers and snowboarders wearing helmets (they weigh more than the little tykes that bomb everything with no control whatsoever). I’ve noticed the same sort of paradox with bicyclists who are forced to wear helmets (by parents, law or otherwise) in situations that would not normally warrant them (small country roads, parks without motorized traffic). Cyclists who only tend to wear helmets in situations where the helmet actually has increased efficacy (single-track, high-speed dirt roads, trials riders, nutty folk who throw loops) seem to respect them much more.

@paul, Alex
Habituation is required by neurological systems to establish proper baseline output. So far as I’m concerned that’s pretty definitive. There are thankfully “safe” ways to prime much of the system, but things such as “Abstinence-Only” sex ed and HR ergonomic seminars apparently don’t make the list.

Rich Wilson August 6, 2009 4:02 PM

There’s an example of this in sport right now. NBA player Rashard Lewis gets a 10 game ban for failing a drug test. A cyclist in the same situation would face 2 years. And life for the 2nd offense.

Obviously the NBA and UCI have a different assessment of the risk of a drug scandal.

Anonymous August 6, 2009 4:26 PM

“I’ve noticed the same sort of paradox with bicyclists who are forced to wear helmets (by parents, law or otherwise) in situations that would not normally warrant them (small country roads, parks without motorized traffic).”

I don’t recall the name of this phenomenon, but when mitigating risk (e.g. bike helmet) people tend to increase their risk appetite proportionally more than the risk mitigation warrants.

I don’t know if abuse is the proper word. I don’t believe it is intentional…

snow white wonder August 6, 2009 4:51 PM

re: cakmpls

This may have to do with calibration. Start with zero fear. Show locals negative consequences that are irrecoverable to those involved: the rate of pigs eaten by wolves declines, but the remaining pigs keep increasing their fear until... what? Until the pigs stop getting eaten! This is calibration against error. What is error? Getting eaten!

Harry Johnston August 6, 2009 4:57 PM

I’m not sure the speeding-ticket scenario is a good example of people accurately assessing risks. As I understand the statistics, traffic accidents are one of the largest risks we face day to day, and speed is a significant factor. That is, the inherent risk from speeding is objectively greater than the risk of being ticketed, and only the cognitive biases involved make people worry more about the latter.

moo August 6, 2009 6:21 PM

@Harry Johnston: Exactly. What’s the risk in getting a speeding ticket for going 20 over the limit? Wasting twenty minutes of your time and paying $95? It sucks, but it’s hardly the end of the world.

On the other hand, in a collision that extra 20 might be the difference between a shocking collision and a fatal one.

If you want to really understand the risks of excessive speed, get to know your friendly neighborhood EMT and ask him what responding to the highway crash scenes is like. You know, the ones with 20-foot bloody streaks across the pavement, where the passengers came to rest… If you still know anyone who doesn’t wear seatbelts in this day and age, I suggest forcing them to read this:
http://nielsenhayden.com/makinglight/archives/008845.html

Bill Wildprett August 6, 2009 7:10 PM

I agree with Bruce that employees perceive the risk of not getting their work done as more important than security. Everyone knows that the ‘Boss’ will come down hard on them if they don’t perform, because that’s the business logic and model we’ve been using for a century.

I’ve argued that compliance penalties should affect executive bonuses.

http://suspiciousminds.wordpress.com/2009/07/03/insecurely-aware/

Until the ‘message’ comes down from the higher-ups, it won’t be realized by the troops. We need a new school of thought on how to evangelize security awareness to the rank-and-file employees.

Me, I’m still worried about tearing the mattress tag off… 😉

Robert August 6, 2009 7:16 PM

@moo

“What’s the risk in getting a speeding ticket for going 20 over the limit? Wasting twenty minutes of your time and paying $95 ? It sucks but its hardly the end of the world.”

Here in NC, it’s lose your license for at least a month or more. You can imagine the domino effect from that.

bethan August 6, 2009 9:22 PM

in preventing violent crime, instincts are the most important factor. unfortunately, many people’s instincts are corrupted by prior victimization. i’ve talked to a lot of parents who over-shelter their kids to prevent a repeat of their own experience, and others who, not wanting to over-shelter their kids, expose them to entirely unnecessary risk.

it can be challenging to train people to find a few neutral moments in each decision process so they can ::listen:: to their instincts before they act.

Iain August 7, 2009 1:43 AM

I’ve got a theory incubating in my head that says human brains haven’t evolved to understand very large or very small numbers, and very often get products of large numbers (consequences) and small numbers (risks) wrong. We “understand” 10 things and 100 things, but I don’t think people instinctively understand the difference between a billion and a trillion (except in a purely intellectual sense). I suspect we often think a 1:10^9 chance is “the same” as a 1:10^12 or 1:10^15 chance, and don’t “feel” that one is a million times more likely to happen.
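Iain’s hunch is easy to check numerically. A rough sketch with an invented exposure count: over a lifetime of repeated exposures, the two risks produce lifetime chances that differ by exactly the factor of a thousand our intuition flattens away.

    # Lifetime chance of at least one hit, for a 1e-9 vs a 1e-12 risk.
    # The exposure count is invented for illustration.
    exposures = 1_000 * 30_000   # ~1,000 events a day for ~82 years

    def chance_of_at_least_one(p, n):
        return 1 - (1 - p) ** n

    print(chance_of_at_least_one(1e-9, exposures))    # ~0.03, i.e. ~3%
    print(chance_of_at_least_one(1e-12, exposures))   # ~0.00003, i.e. ~0.003%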

Ichinin August 7, 2009 4:11 AM

Of course people understand risks; it’s just that sometimes people do not care, and they are not qualified to judge what is a risk and what is not.

Try getting an unemployed person who participates in a jobseeker program to care about having a complex password on his/her account.

Try convincing a bunch of factory floor workers who all know each other to have separate accounts and not to write a macro in the terminal software for their AS400 login procedure.

As for that last example, I had to explain to them that the information they used daily for manufacturing robots was worth many millions. Then they got the point and started to care.

For a policy to be effective, it should explain why it is necessary and why employees should care: most employees do not give a shit if the company loses a million, but if job security is in the equation, they will care.

Pete Austin August 7, 2009 4:47 AM

Re: Motorcycle riders who wear helmets drive faster than riders who don’t.

Let me rephrase that for you: Motorcyclists sometimes don’t bother to wear their helmets when they’re only driving slowly.

See? Entirely rational and no need for a “Risk Thermostat”.

Fred August 7, 2009 5:36 AM

It’s well known that the risk thermostat is not reliable. Examples of wrong risk perception, like driving your car versus taking a flight, are nothing new. Training users to understand risks is very important.

Dingbat August 7, 2009 7:59 AM

“As a species, we are naturally tuned in to the risks inherent in our environment. Throughout our evolution, our survival depended on making reasonably accurate risk management decisions intuitively, and we’re so good at it, we don’t even realise we’re doing it.” How do you know that?

Ding

Keeping you honest August 7, 2009 11:52 AM

Well, if you’re going to plug John Adams, I think I’ll return the favor and mention Gerald Wilde’s book Target Risk. Perhaps “target” is not so good a term as “thermostat”, but in the book Wilde usually says “homeostasis.” And many of Wilde’s examples are especially interesting, particularly a controlled and blinded experiment involving a taxi company and anti-lock brakes. As with many informative experiments, one wonders how it got past the ethics committee.

David Donahue August 7, 2009 12:42 PM

@bob & @Parrot: It’s probably something more like performance bonuses being handed out based partially on divisional budgetary availability rather than exclusively on individual performance.

The confidentiality clause is an HR risk mitigation to prevent jealousy-based morale problems between employees comparing salaries and bonuses and feeling that one or the other was unfair.

You still get the bonus, so your performance incentive is enhanced, and you’re more likely to speak of it in general (vs. specific) terms, which helps overall company morale.

Davi Ottenheimer August 7, 2009 8:11 PM

$10 says Bruce doesn’t ride motorcycles.

But seriously, when it comes to calculating risk the question has to include whether other drivers will behave differently depending on whether a rider has a helmet or not.

Taking into account only the rider’s behavior/intuition is a typical flaw in risk analysis. I expected better in this article.

I’ll post some data on this in the next day or so. The results are not only surprising but also very frustrating if you happen to ride.

Fernando Pereira August 7, 2009 11:07 PM

@ RvnPhnx: “For instance, the most dangerous people on the slopes are the recreational skiers and snowboarders wearing helmets” Can you cite any reputable scientific evidence for this claim? The last time I looked, just over a year ago, I spent quite a while with Pubmed and Google and I could not find any study supporting such a claim. I did find several studies, mainly from Canada and from Norway, showing significantly lower levels of serious head trauma for skiers/snowboarders who were wearing a helmet when they had an accident than for those who were not. As Mark J. noted, a helmet might reduce the feeling of speed, but the statistics still favour wearing a helmet.

As for risk adaptation/risk homeostasis, there’s a lot of hype and confusion in those claims. Many of the discussions are ideological (Wilde is a libertarian who uses the homeostasis argument against government safety regulations), not scientific. In particular, supporters of those hypotheses often extrapolate from cases where a plausible mechanism can at least be proposed to situations where there is no plausible mechanism. To cut a long story short, homeostasis requires feedback, which requires a risk signal from the environment. The antilock brakes example has such a feedback mechanism in the different rates of close calls with and without the safety measure at a given speed in given conditions. But many other cases have no such mechanism, in that the safety measure does not mechanically change the system (for a given skiing speed, wearing a helmet does not reduce the close-call rate). In snowsports, other technical improvements affect close-call rates, in particular recent skis, which are much more stable and manoeuvrable at speed, and very powerful snowmobiles that make it easier for riders to manage dangerous snow conditions.

Clive Robinson August 8, 2009 6:30 AM

@ Fernando Pereira,

As you note,

“As for risk adaptation/risk homeostasis, there’s a lot of hype and confusion in those claims.”

One problem is the incorrect application of control theory to biomechanical semi-autonomous systems.

As you say,

“To cut a long story short, homeostasis requires feedback which requires a risk signal from the environment.”

There is an implicit assumption by most people that “feedback” is always good. Unfortunately, in all cases it is both good and bad, and it is other parts of the system that keep the bad aspects of feedback in check.

For those who are interested, very briefly: it started with servo theory. What happens is you change an input to a servo, and at some future point the output of the servo starts to follow the change at the input with a “delay in the response”. Therefore the servo has an input delay, which is where the trouble starts. The usual purpose of a servo is to provide an advantage mechanism (gain) of some form; most often it is in terms of “work”, that is, a small easily generated signal is amplified in some manner to drive a load of some kind.

Due to the lack of “perfection” in all such systems, the input-to-output gain or amplification is not entirely linear, and this gives rise to it having a “frequency and phase response” and “distortion” characteristic which can be very complex in nature.

It is to overcome the effects of distortion and frequency response that “feedback” is used.

Unfortunately the generic term “feedback” is only half the story and can be applied in two ways (positive and negative). The other half is “feedforward”, and it too can be applied in a positive or negative manner.

The most frequently used method is “negative feedback”; the least used is “positive feedforward”. The reason for this is “stability”: it is generally increased with negative feedback and decreased with positive feedforward.

In the analog world there is a truism that “nothing is ever the same twice”, with the digital equivalent of “almost the same but not quite” (due to quantisation etc).

What is often not realised is that, like the “servo” you are trying to control with feedback, the feedback mechanism also has its own characteristics of gain, delay and frequency/phase response.

Unless the feedback characteristics are correctly selected with respect to the servo, then even with negative feedback stability cannot be unconditionally guaranteed.

Some of the odd behaviour seen in bio systems can be explained by stability issues due to incorrect feedback characteristics and the interaction of multiple feedback paths (hand shake when trying to thread a needle, or use tweezers and a magnifying glass).

One characteristic that most complex biological systems have is a “learning process” that adapts the feedback response characteristics to reduce unwanted responses (think learning to ride a bike, or drawing with a pen/brush).

There is a good reason for this: systems with too much stability via feedback are often called “over damped”; they respond slowly, which is in itself harmful (to balance etc). They can also be prone to issues with changing inputs that can lead to errors being incorrectly added, not subtracted, and thus make runaway problems an issue (behaviour similar in appearance to being drunk or seasick).

Due to a lack of understanding of control theory, or an inability to apply it to a complex multiple-adaptive-feedback system, analysis of bio systems can and does produce a lot of false results. Which, unfortunately, due to lack of correct investigation, gives rise to false understanding (Old Nun’s posture etc) that many academic papers in the field are riddled with.

This is not a new issue: natural philosophy (physics) had the same problem with the way things fall (think feathers and hammers), until the independent effects of gravity and wind resistance etc could be correctly compensated for.
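For the curious, the gain/delay point is easy to demonstrate with a toy discrete-time loop; this is a sketch with invented parameters, not a model of any real servo. The same negative feedback that settles nicely with low gain and a short delay oscillates and runs away when both grow.

    # Negative feedback with a delayed error signal. Low gain/short delay
    # converges on the target; high gain/longer delay diverges.
    def run(gain, delay, steps=40):
        target, output = 1.0, 0.0
        history = [0.0] * (delay + 1)    # feedback path sees an old output
        for _ in range(steps):
            error = target - history[0]  # delayed error signal
            output += gain * error       # actuator nudges the output
            history = history[1:] + [output]
        return output

    print(run(gain=0.3, delay=1))   # settles close to the target, 1.0
    print(run(gain=1.5, delay=2))   # overshoots, oscillates, diverges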

Tom August 15, 2009 6:30 PM

The issue with helmeted motorcycle riders driving faster is quite simple.

When wearing a helmet, your side vision is greatly impeded. This lack of input causes us mere humans to think we are going slower. It would be natural to speed up somewhat to match pre-helmet speed.

Try it next time you’re a passenger in a car. Put your hands to each side of your head (like a blinker on a horse). The net effect is that the speed seems to have dropped.

Peter Martin August 15, 2009 7:53 PM

A nitpick perhaps, but one relevant to several issues raised.

You suggested:

“Fire someone who breaks security procedure, quickly and publicly…. ”

and added:

“If the risks are real, people will get it.”

This could be taken to imply that penalties are of high priority — a questionable assumption.

Surely the most relevant factor here is a procedure that gives a public indication of the risk of getting caught breaking protocols.

Some workplaces handle the security breach detection risk (not brilliantly) by signalling security inspections in advance. At least it tends to mean workplaces are cleaned up a bit more frequently than might be the case.

More importantly, publicising the fact that automated checks are somehow carried out on a random basis might be more effective.

Note that here in Australia, the introduction of random breath tests for drink driving had a huge effect on road behaviour and accident figures. Penalties were less relevant than the assessment of likely breach detection.

This is implied to some extent in your remarks about motorway speeds: the risks of arrest at a given speed are more relevant than the actual speed limit.

Prisons are full of people who claim they were just doing what other people do: they were just “unlucky” and were caught! So increasing the perception of the likelihood of being “unlucky” seems to be a key.

And of course, many criminologists can cite case after case where the level of penalties can be shown to have had relatively minor effects, where offenders just don’t think they’re going to be caught.

Stephen Wilson August 15, 2009 9:12 PM

Hear, hear, Bruce!

IT Security people too often presume to know more about risk than their clients. There has been enormous commercial pressure on security professionals and sales people to incorporate “risk” talk into their pitches. I’ve worked inside multiple security product firms and consultancies and ever since c. 1999 the common internal sales strategy for all these outfits has been to avoid promoting technology and instead to promote “risk management”.

There’s a grain of truth in this orientation of course, but unless handled with care, “risk management” is a facile slogan.

I’ve been in sales presentations with banking executives where the security specialists will say with gravitas (and more than a touch of condescension) that “we’re not here to tell you about security; we’re here to talk about risk management”. And then proceed to try and sell an IPS or firewall.

In truth, there is almost nothing new anyone can tell a senior banker about risk. In particular, bankers know exactly what security-related losses they are suffering, and how to manage them. Financially.

Cheers,

Steve Wilson, Lockstep.

Clive Robinson August 16, 2009 12:18 AM

@ Tom,

“Try it next time you’re a passenger in a car. Put your hands to each side of your head (like a blinker on a horse). The net effect is that the speed seems to have dropped.”

Only works if you don’t have tunnel vision 😉

And the underlying reason is the rate of angular change in your field of view.

For those of us who are a little slow on a Sunday morning (aren’t we all, if we can be 8)

If something is coming directly towards you, its rate of angular change (at its centre) is zero. Therefore your brain recognises the movement only by the relative change in size.

However, when an object moves on what is effectively a parallel path to you and is abreast of you (ie “normal”, or at 90 degrees to your path), the opposite is true. The rate of angular change is maximum and the rate of change in size is zero.

If like me you have a very wide field of view (nearly 180 deg) you might well find putting on a full-face helmet is like suffering sensory deprivation (as is wearing a wide-brimmed hat or putting up the hood of a coat) and causes a sense of “unreality” after quite a short while.

There has been little research done on it, but this wide-field-of-view problem has been shown to be a stronger indicator of “highway hypnosis” (http://en.wikipedia.org/wiki/Highway_hypnosis) or “white line fever” than suggestibility is.

One reason why the research may be minimal is that if it can be shown that it is the case, then some of the accepted “mind models” are wrong…
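The geometry can be written down directly: for an object on a parallel track at lateral distance d and speed v, the bearing is arctan(x/d), so the angular rate works out to v·d/(d² + x²): near zero when the object is far ahead (effectively head-on) and maximal, v/d, at the instant it draws abreast. A small sketch with invented numbers:

    # Angular rate of an object passing on a parallel track.
    v, d = 30.0, 3.0    # speed (m/s) and lateral distance (m), illustrative

    def angular_rate(x):
        """Bearing change in rad/s when the object is x metres ahead."""
        return v * d / (d**2 + x**2)

    for x in (100.0, 30.0, 10.0, 0.0):
        print(f"x={x:5.0f} m  rate={angular_rate(x):6.3f} rad/s")
    # x=100 -> ~0.009 rad/s (crawls across the view)
    # x=  0 -> 10.000 rad/s (streaks past): why peripheral vision
    #          dominates the sensation of speed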

Gamer Troll August 19, 2009 1:43 AM

Some of us are better at reading risk because we practice more. In war games like Risk, the final arbiter of the success of your strategy is a set of dice, but the game has a rulebook so you can mathematically determine the probability of the success of an attack, because you have all the information you need to do so.

On my bike, I choose to stay on sidewalks, alleys and back streets. I don’t wear a helmet, for the greater visibility, and subsequent informational advantage about my surroundings. I don’t go much above walking pace unless I can see an empty sidewalk. The choice I feel I’m making is taking responsibility for not hitting, or even rushing, the more legitimate users of the sidewalk.

Many car users seem to feel frustrated and impatient, and have no problem with rushing other road users. There’s a lot more of me to break than just my head, and I don’t have insurance on my bike. If I’m involved in a road accident while on my bike, I’m likely to get injured. It might be somebody else’s fault, but that only gives me somebody to sue. It cannot, for example, give me the use of my legs back.

On the sidewalk, I know every collision is almost certain to be my fault. I manage that risk by riding cautiously.

Bankers have financial risks like DDoS extortion that they deal with financially. Risk management is impossible without win conditions, or at least “not lose” conditions. For me on my bike, “not lose” means my ass and my bike are in one piece and I haven’t hit anybody. For the bankers, “not lose” is that their website stays open to their legitimate customers.

Trying to sell them an IDS is like trying to sell me a bike helmet: a solution I don’t need, for a problem I already have a workaround for.
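The claim that the odds in Risk are computable is straightforwardly true. A brute-force sketch over all 7,776 die combinations for the standard three-attacker-versus-two-defender roll, with ties going to the defender:

    # Exhaustive odds for one Risk battle round: 3 attack dice vs 2 defence
    # dice, sorted descending, compared pairwise, ties to the defender.
    from itertools import product
    from collections import Counter

    def battle_odds(attack_dice=3, defend_dice=2):
        tally = Counter()
        for a in product(range(1, 7), repeat=attack_dice):
            for d in product(range(1, 7), repeat=defend_dice):
                pairs = zip(sorted(a, reverse=True), sorted(d, reverse=True))
                tally[sum(att <= dfn for att, dfn in pairs)] += 1
        total = sum(tally.values())
        return {losses: count / total for losses, count in sorted(tally.items())}

    print(battle_odds())
    # {0: 0.372, 1: 0.336, 2: 0.293}: the attacker loses 0, 1 or 2 armies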

Stewart McKenna September 13, 2009 5:56 PM

“It seems to me that his co-workers understand the risks better than he does. They know what the real risks are at work, and that they all revolve around not getting the job done. Those risks are real and tangible, and employees feel them all the time. The risks of not following security procedures are much less real. Maybe the employee will get caught, but probably not. And even if he does get caught, the penalties aren’t serious.”

The same principle applies, unfortunately, to ‘Quality Systems’. The peons have to follow the rules, and get fired if they don’t, but leadership/management will throw the rules out the window at the slightest pretext.

Jeff D James December 9, 2010 6:52 AM

Your statement, “Fire someone who breaks security procedure, quickly and publicly,” is certainly a solution, and it does indeed make practical sense. However, the claim of ignorance is always there as a fallback, isn’t it? Unless, of course, it is a continual occurrence.
