People Understand Risks—But Do Security Staff Understand People?

Natural human risk intuition deserves respect, even when it doesn't help the security team

This essay also appeared in The Sydney Morning Herald and The Age.

People have a natural intuition about risk, and in many ways it’s very good. It fails at times due to a variety of cognitive biases, but for normal risks that people regularly encounter, it works surprisingly well: often better than we give it credit for.

This struck me as I listened to yet another conference presenter complaining about security awareness training. He was talking about the difficulty of getting employees at his company to actually follow his security policies: encrypting data on memory sticks, not sharing passwords, not logging in from untrusted wireless networks. “We have to make people understand the risks,” he said.

It seems to me that his co-workers understand the risks better than he does. They know what the real risks are at work, and that they all revolve around not getting the job done. Those risks are real and tangible, and employees feel them all the time. The risks of not following security procedures are much less real. Maybe the employee will get caught, but probably not. And even if he does get caught, the penalties aren’t serious.

Given this accurate risk analysis, any rational employee will regularly circumvent security to get his or her job done. That’s what the company rewards, and that’s what the company actually wants.

“Fire someone who breaks security procedure, quickly and publicly,” I suggested to the presenter. “That’ll increase security awareness faster than any of your posters or lectures or newsletters.” If the risks are real, people will get it.

You see the same sort of risk intuition on motorways. People are less careful about posted speed limits than they are about the actual speeds police issue tickets for. It’s also true on the streets: people respond to real crime rates, not public officials proclaiming that a neighbourhood is safe.

The warning stickers on ladders might make you think the things are considerably riskier than they are, but people have a good intuition about ladders and ignore most of the warnings. (This isn’t to say that some people don’t do stupid things around ladders, but for the most part they’re safe. The warnings are more about the risk of lawsuits to ladder manufacturers than risks to people who climb ladders.)

As a species, we are naturally tuned in to the risks inherent in our environment. Throughout our evolution, our survival depended on making reasonably accurate risk management decisions intuitively, and we’re so good at it, we don’t even realise we’re doing it.

Parents know this. Children have surprisingly perceptive risk intuition. They know when parents are serious about a threat and when their threats are empty. And they respond to the real risks of parental punishment, not the inflated risks based on parental rhetoric. Again, awareness training lectures don’t work; there have to be real consequences.

It gets even weirder. The University College London professor John Adams popularised the metaphor of a mental risk thermostat. We each seek some natural level of risk, and if something becomes less risky, we compensate by behaving more riskily. Motorcycle riders who wear helmets ride faster than riders who don’t.

Our risk thermostats aren’t perfect (that newly helmeted motorcycle rider will still decrease his overall risk) and tend to remain within the same domain (he might ride faster, but he won’t increase his risk by taking up smoking). But in general, people demonstrate an innate and finely tuned ability to understand and respond to risks.

Of course, our risk intuition fails spectacularly and often with regard to rare risks, unknown risks, voluntary risks, and so on. But when it comes to the common risks we face every day—the kinds of risks our evolutionary survival depended on—we’re pretty good.

So whenever you see someone in a situation and think they don’t understand the risks, stop first and make sure you understand the risks yourself. You might be surprised.
