Psychology of Security
By Bruce Schneier
Communications of the ACM
The security literature is filled with risk pathologies, heuristics that we use to help us evaluate risks. I've collected them from many different sources.
| Exaggerated Risks | Downplayed Risks |
|---|---|
| Beyond one's control | More under control |
| Externally imposed | Taken willingly |
| Talked about | Not discussed |
| Intentional or man-made | Natural |
| Immediate | Long-term or diffuse |
| Sudden | Evolving slowly over time |
| Affecting them personally | Affecting others |
| New and unfamiliar | Familiar |
| Directed against their children | Directed toward themselves |
| Morally offensive | Morally desirable |
| Entirely without redeeming features | Associated with some ancillary benefit |
| Not like their current situation | Like their current situation |
When you look over the list of exaggerated and downplayed risks in the table here, the most remarkable thing is how reasonable so many of them seem. This makes sense for two reasons. One, our perceptions of risk are deeply ingrained in our brains, the result of millions of years of evolution. And two, our perceptions of risk are generally pretty good, and are what have kept us alive and reproducing during those millions of years of evolution.
This is an important point. A general intuition about risks is central to life on this planet. Imagine a rabbit sitting in a field, eating clover. Suddenly, he spies a fox. He's going to make a risk evaluation: stay or flee? The rabbits that are good at making these evaluations are going to reproduce, while the others are either going to get eaten or starve. This means that, as a successful species on the planet, humans should be really good at evaluating risks.
And yet, at the same time we seem hopelessly bad at it. We exaggerate some risks while minimizing others. We misunderstand or mischaracterize risks. Even simple security we get wrong, wrong, wrong—again and again. It's a seeming paradox.
The truth is that we are very well adapted to dealing with the security environment endemic to hominids living in small family groups on the highland plains of East Africa. However, the environment of New York in 2007 is different from Kenya circa 100,000 BC. And so our perception of risk diverges from the reality of risk, and we get things wrong.
When our risk perceptions fail today, it's because of new situations that have occurred at a faster rate than evolution: situations that exist in the world of 2007, but didn't in the world of 100,000 BC. Like a squirrel whose predator-evasion techniques fail when confronted with a car, or a passenger pigeon who finds that evolution prepared him to survive the hawk but not the shotgun, our innate capabilities to deal with risk can fail when confronted with such things as modern human society, technology, and the media. And, even worse, they can be made to fail by others—politicians, marketers, and so on—who exploit our natural failures for their gain.
This topic is explored more fully in my essay, "The Psychology of Security."
Photo of Bruce Schneier by Per Ervland.
Schneier on Security is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.