Essays in the Category “Psychology of Security”

Our Decreasing Tolerance To Risk

  • Bruce Schneier
  • Forbes
  • August 23, 2013

We're afraid of risk. It's a normal part of life, but we're increasingly unwilling to accept it at any level. So we turn to technology to protect us. The problem is that technological security measures aren't free.  

Read More →

The Boston Marathon Bombing: Keep Calm and Carry On

It is easy to feel scared and powerless in the wake of attacks like those at the Boston Marathon. But it also plays into the perpetrators' hands.

  • Bruce Schneier
  • The Atlantic
  • April 15, 2013

German translation

As the details about the bombings in Boston unfold, it'd be easy to be scared. It'd be easy to feel powerless and demand that our elected leaders do something—anything—to keep us safe. 

It'd be easy, but it'd be wrong.

Read More →

On Security Awareness Training

The focus on training obscures the failures of security design

  • Bruce Schneier
  • Dark Reading
  • March 19, 2013

Should companies spend money on security awareness training for their employees? It's a contentious topic, with respected experts on both sides of the debate. I personally believe that training users in security is generally a waste of time, and that the money can be spent better elsewhere. Moreover, I believe that our industry's focus on training serves to obscure greater failings in security design.

Read More →

Unsafe Security: A Sociologist Aptly Analyzes Our Failures in Top-Down Protection

  • Bruce Schneier
  • Reason
  • January 2013

Against Security: How We Go Wrong at Airports, Subways, and Other Sites of Ambiguous Danger, by Harvey Molotch, Princeton University Press, 278 pages, $35.

Security is both a feeling and a reality, and the two are different things. People can feel secure when they’re actually not, and they can be secure even when they believe otherwise.

This discord explains much of what passes for our national discourse on security policy.

Read More →

The Importance of Security Engineering

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2012

In May, neuroscientist and popular author Sam Harris and I debated the issue of profiling Muslims at airport security. We each wrote essays, then went back and forth on the issue. I don't recommend reading the entire discussion; we spent 14,000 words talking past each other. But what's interesting is how our debate illustrates the differences between a security engineer and an intelligent layman.

Read More →

Drawing the Wrong Lessons from Horrific Events

  • Bruce Schneier
  • CNN
  • July 31, 2012

Horrific events, such as the massacre in Aurora, can be catalysts for social and political change. Sometimes it seems that they're the only catalyst; recall how drastically our policies toward terrorism changed after 9/11 despite how moribund they were before.

The problem is that fear can cloud our reasoning, causing us to overreact and to overly focus on the specifics. And the key is to steer our desire for change in that time of fear.

Read More →

Detecting Cheaters

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2011

Our brains are specially designed to deal with cheating in social exchanges. The evolutionary psychology explanation is that we evolved brain heuristics for the social problems that our prehistoric ancestors had to deal with. Once humans became good at cheating, they then had to become good at detecting cheating -- otherwise, the social group would fall apart.

Perhaps the most vivid demonstration of this can be seen with variations on what's known as the Wason selection task, named after the psychologist who first studied it.
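For readers unfamiliar with the task, here is a minimal sketch of its logic (an illustration, not the essay's own example): a rule of the form "if P then Q" can only be falsified by the card showing P and the card showing not-Q, and people do far better at picking those cards when the rule is framed as a social contract rather than an abstract conditional.

```python
# Minimal sketch of the Wason selection task (illustrative, not from the essay).
# Rule under test: "if a card shows P on one side, it shows Q on the other."
# Only the cards that could falsify the rule need to be turned over:
# the card showing P (its back might lack Q) and the card showing not-Q
# (its back might show P).

def cards_to_turn(cards, is_p, is_not_q):
    """Return the cards that must be flipped to test 'if P then Q'."""
    return [c for c in cards if is_p(c) or is_not_q(c)]

# Abstract version: rule "if vowel, then even number." Most people get this wrong.
abstract = cards_to_turn(
    ["E", "K", "4", "7"],
    is_p=lambda c: c in "AEIOU",
    is_not_q=lambda c: c.isdigit() and int(c) % 2 == 1,
)
print(abstract)  # ['E', '7']

# Social-contract version: rule "if drinking beer, then over 21." Most people get this right.
social = cards_to_turn(
    ["drinking beer", "drinking soda", "age 25", "age 17"],
    is_p=lambda c: c == "drinking beer",
    is_not_q=lambda c: c.startswith("age") and int(c.split()[1]) < 21,
)
print(social)  # ['drinking beer', 'age 17']
```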

Read More →

Worst-Case Thinking Makes Us Nuts, Not Safe

  • Bruce Schneier
  • CNN
  • May 12, 2010

At a security conference recently, the moderator asked the panel of distinguished cybersecurity leaders what their nightmare scenario was. The answers were the predictable array of large-scale attacks: against our communications infrastructure, against the power grid, against the financial system, in combination with a physical attack.

I didn't get to give my answer until the afternoon, which was: "My nightmare scenario is that people keep talking about their nightmare scenarios."

There's a certain blindness that comes from worst-case thinking. An extension of the precautionary principle, it involves imagining the worst possible outcome and then acting as if it were a certainty.

Read More →

Nature's Fears Extend to Online Behavior

  • Bruce Schneier
  • The Japan Times
  • November 3, 2009

It's hard work being prey. Watch the birds at a feeder. They're constantly on alert, and will fly away from food -- from easy nutrition -- at the slightest movement or sound. Given that I've never, ever seen a bird plucked from a feeder by a predator, it seems like a whole lot of wasted effort against a small threat.

Read More →

People Understand Risks -- But Do Security Staff Understand People?

Natural human risk intuition deserves respect -- even when it doesn't help the security team

  • Bruce Schneier
  • The Guardian
  • August 5, 2009

This essay also appeared in The Sydney Morning Herald and The Age.

People have a natural intuition about risk, and in many ways it's very good. It fails at times due to a variety of cognitive biases, but for normal risks that people regularly encounter, it works surprisingly well: often better than we give it credit for.

This struck me as I listened to yet another conference presenter complaining about security awareness training.

Read More →

Security, Group Size, and the Human Brain

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2009

If the size of your company grows past 150 people, it's time to get name badges. It's not that larger groups are somehow less secure; it's just that 150 is the cognitive limit to the number of people a human brain can maintain a coherent social relationship with.

Primatologist Robin Dunbar derived this number by comparing neocortex -- the "thinking" part of the mammalian brain -- volume with the size of primate social groups. By analyzing data from 38 primate genera and extrapolating to the human neocortex size, he predicted a human "mean group size" of roughly 150.
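As an aside not drawn from the essay, the extrapolation works roughly like this. The regression coefficients and the human neocortex ratio below are approximate values commonly attributed to Dunbar's 1992 analysis and should be treated as illustrative:

```python
import math

# Sketch of the extrapolation described above (illustrative, not from the essay).
# Dunbar fit a regression of log group size against log neocortex ratio across
# primate genera, then plugged in the human value. Coefficients and the human
# neocortex ratio are approximate figures often cited from Dunbar (1992).

intercept, slope = 0.093, 3.389        # fitted on log10 scales (approximate)
human_neocortex_ratio = 4.1            # neocortex volume / rest of brain (approximate)

log_group_size = intercept + slope * math.log10(human_neocortex_ratio)
predicted_group_size = 10 ** log_group_size

print(round(predicted_group_size))     # ~148, i.e. the "roughly 150" in the text
```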

Read More →

How Science Fiction Writers Can Help, or Hurt, Homeland Security

  • Bruce Schneier
  • Wired
  • June 18, 2009

A couple of years ago, the Department of Homeland Security hired a bunch of science fiction writers to come in for a day and think of ways terrorists could attack America. If our inability to prevent 9/11 marked a failure of imagination, as some said at the time, then who better than science fiction writers to inject a little imagination into counterterrorism planning?

I discounted the exercise at the time, calling it "embarrassing." I never thought that 9/11 was a failure of imagination. I thought, and still think, that 9/11 was primarily a confluence of three things: the dual failure of centralized coordination and local control within the FBI, and some lucky breaks on the part of the attackers.

Read More →

The Kindness of Strangers

  • Bruce Schneier
  • The Wall Street Journal
  • March 12, 2009

When I was growing up, children were commonly taught: "Don't talk to strangers." Strangers might be bad, we were told, so it's prudent to steer clear of them.

And yet most people are honest, kind, and generous, especially when someone asks them for help. If a small child is in trouble, the smartest thing he can do is find a nice-looking stranger and talk to him.

Read More →

Does Risk Management Make Sense?

  • Bruce Schneier
  • Information Security
  • October 2008

This essay appeared as the first half of a point-counterpoint with Marcus Ranum. Marcus's half is here.

We engage in risk management all the time, but it only makes sense if we do it right.

"Risk management" is just a fancy term for the cost-benefit tradeoff associated with any security decision. It's what we do when we react to fear, or try to make ourselves feel secure.

Read More →

How the Human Brain Buys Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2008

People tend to be risk-averse when it comes to gains, and risk-seeking when it comes to losses. If you give people a choice between a $500 sure gain and a coin-flip chance of a $1,000 gain, about 75 percent will pick the sure gain. But give people a choice between a $500 sure loss and a coin-flip chance of a $1,000 loss, about 75 percent will pick the coin flip.

People don't have a standard mathematical model of risk in their heads.
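The asymmetry is striking because, in expected-value terms, the two options in each pair are identical. Here is a minimal sketch using the amounts and probabilities quoted above:

```python
# Expected-value check for the two choices described above. A decision maker
# who compared only expected values would be indifferent in both framings.

def expected_value(outcomes):
    """outcomes: list of (probability, amount) pairs."""
    return sum(p * amount for p, amount in outcomes)

sure_gain  = expected_value([(1.0, 500)])             # $500 for certain
risky_gain = expected_value([(0.5, 1000), (0.5, 0)])  # coin flip for $1,000
sure_loss  = expected_value([(1.0, -500)])
risky_loss = expected_value([(0.5, -1000), (0.5, 0)])

print(sure_gain, risky_gain)   # 500.0 500.0 -- same expected gain
print(sure_loss, risky_loss)   # -500.0 -500.0 -- same expected loss
# Yet most people take the sure $500 gain and gamble on the loss:
# the framing, not the arithmetic, drives the choice.
```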

Read More →

The Difference Between Feeling and Reality in Security

  • Bruce Schneier
  • Wired
  • April 3, 2008

Security is both a feeling and a reality, and they're different. You can feel secure even though you're not, and you can be secure even though you don't feel it. There are two different concepts mapped onto the same word -- the English language isn't working very well for us here -- and it can be hard to know which one we're talking about when we use the word.

There is considerable value in separating out the two concepts: in explaining how the two are different, and understanding when we're referring to one and when the other.

Read More →

Inside the Twisted Mind of the Security Professional

  • Bruce Schneier
  • Wired
  • March 20, 2008

Uncle Milton Industries has been selling ant farms to children since 1956. Some years ago, I remember opening one up with a friend. There were no actual ants included in the box. Instead, there was a card that you filled in with your address, and the company would mail you some ants.

Read More →

The Psychology of Security (Part 2)

  • Bruce Schneier
  • January 18, 2008

Return to Part 1

The Availability Heuristic

The "availability heuristic" is very broad, and goes a long way toward explaining how people deal with risk and trade-offs. Basically, the availability heuristic means that people "assess the frequency of a class or the probability of an event by the ease with which instances or occurrences can be brought to mind."28 In other words, in any decision-making process, easily remembered (available) data are given greater weight than hard-to-remember data.

In general, the availability heuristic is a good mental shortcut. All things being equal, common events are easier to remember than uncommon ones.

Read More →

The Psychology of Security (Part 1)

  • Bruce Schneier
  • January 18, 2008

Introduction

Security is both a feeling and a reality. And they're not the same.

The reality of security is mathematical, based on the probability of different risks and the effectiveness of different countermeasures. We can calculate how secure your home is from burglary, based on such factors as the crime rate in the neighborhood you live in and your door-locking habits.

Read More →

The Evolutionary Brain Glitch That Makes Terrorism Fail

  • Bruce Schneier
  • Wired
  • July 12, 2007

Two people are sitting in a room together: an experimenter and a subject. The experimenter gets up and closes the door, and the room becomes quieter. The subject is likely to believe that the experimenter's purpose in closing the door was to make the room quieter.

This is an example of correspondent inference theory.

Read More →

Don't Look a Leopard in the Eye, and Other Security Advice

  • Bruce Schneier
  • Wired
  • May 31, 2007

If you encounter an aggressive lion, stare him down. But not a leopard; avoid his gaze at all costs. In both cases, back away slowly; don't run. If you stumble on a pack of hyenas, run and climb a tree; hyenas can't climb trees.

Read More →

Virginia Tech Lesson: Rare Risks Breed Irrational Responses

  • Bruce Schneier
  • Wired
  • May 17, 2007

French translation

Everyone had a reaction to the horrific events of the Virginia Tech shootings. Some of those reactions were rational. Others were not.

A high school student was suspended for customizing a first-person shooter game with a map of his school.

Read More →

Psychology of Security

  • Bruce Schneier
  • Communications of the ACM
  • May 2007

The security literature is filled with risk pathologies, heuristics that we use to help us evaluate risks. I've collected them from many different sources.

Exaggerated Risks                        Downplayed Risks
Spectacular                              Pedestrian
Rare                                     Common
Personified                              Anonymous
Beyond one’s control                     More under control
Externally imposed                       Taken willingly
Talked about                             Not discussed
Intentional or man-made                  Natural
Immediate                                Long-term or diffuse
Sudden                                   Evolving slowly over time
Affecting them personally                Affecting others
New and unfamiliar                       Familiar
Uncertain                                Well understood
Directed against their children          Directed toward themselves
Morally offensive                        Morally desirable
Entirely without redeeming features      Associated with some ancillary benefit
Not like their current situation         Like their current situation

When you look over the list of exaggerated and downplayed risks in the table here, the most remarkable thing is how reasonable so many of them seem. This makes sense for two reasons.

Read More →

Why the Human Brain Is a Poor Judge of Risk

  • Bruce Schneier
  • Wired
  • March 22, 2007

The human brain is a fascinating organ, but it's an absolute mess. Because it has evolved over millions of years, there are all sorts of processes jumbled together rather than logically organized. Some of the processes are optimized for only certain kinds of situations, while others don't work as well as they could. There's some duplication of effort, and even some conflicting brain processes.

Read More →

In Praise of Security Theater

  • Bruce Schneier
  • Wired
  • January 25, 2007

Danish translation
Portuguese translation

While visiting some friends and their new baby in the hospital last week, I noticed an interesting bit of security. To prevent infant abduction, all babies had RFID tags attached to their ankles by a bracelet. There are sensors on the doors to the maternity ward, and if a baby passes through, an alarm goes off.

Infant abduction is rare, but still a risk.

Read More →
