Essays in the Category "Psychology of Security"

Worst-Case Thinking Makes Us Nuts, Not Safe

  • Bruce Schneier
  • CNN
  • May 12, 2010

At a security conference recently, the moderator asked the panel of distinguished cybersecurity leaders what their nightmare scenario was. The answers were the predictable array of large-scale attacks: against our communications infrastructure, against the power grid, against the financial system, in combination with a physical attack.

I didn’t get to give my answer until the afternoon, which was: “My nightmare scenario is that people keep talking about their nightmare scenarios.”

There’s a certain blindness that comes from worst-case thinking. An extension of the …

Nature's Fears Extend to Online Behavior

  • Bruce Schneier
  • The Japan Times
  • November 3, 2009

It’s hard work being prey. Watch the birds at a feeder. They’re constantly on alert, and will fly away from food—from easy nutrition—at the slightest movement or sound. Given that I’ve never, ever seen a bird plucked from a feeder by a predator, it seems like a whole lot of wasted effort against a small threat.

Assessing and reacting to risk is one of the most important things a living creature has to deal with. The amygdala, an ancient part of the brain that first evolved in primitive fishes, has that job. It’s what’s responsible for the fight-or-flight reflex. Adrenaline in the bloodstream, increased heart rate, increased muscle tension, sweaty palms; that’s the amygdala in action. You notice it when you fear a dark alley, have vague fears of terrorism, or worry about predators stalking your children on the Internet. And it works fast, faster than consciousness: show someone a snake and their amygdala will react before their conscious brain registers that they’re looking at a snake…

People Understand Risks—But Do Security Staff Understand People?

Natural human risk intuition deserves respect, even when it doesn't help the security team

  • Bruce Schneier
  • The Guardian
  • August 5, 2009

This essay also appeared in The Sydney Morning Herald and The Age.

People have a natural intuition about risk, and in many ways it’s very good. It fails at times due to a variety of cognitive biases, but for normal risks that people regularly encounter, it works surprisingly well: often better than we give it credit for.

This struck me as I listened to yet another conference presenter complaining about security awareness training. He was talking about the difficulty of getting employees at his company to actually follow his security policies: encrypting data on memory sticks, not sharing passwords, not logging in from untrusted wireless networks. “We have to make people understand the risks,” he said…

Security, Group Size, and the Human Brain

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2009

If the size of your company grows past 150 people, it’s time to get name badges. It’s not that larger groups are somehow less secure; it’s just that 150 is the cognitive limit to the number of people a human brain can maintain a coherent social relationship with.

Primatologist Robin Dunbar derived this number by comparing the volume of the neocortex—the “thinking” part of the mammalian brain—with the size of primate social groups. By analyzing data from 38 primate genera and extrapolating to the human neocortex size, he predicted a human “mean group size” of roughly 150…
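
As a sketch of that method (not from the essay), the extrapolation amounts to fitting a line to log group size against log neocortex ratio across species, then reading off the prediction at the human value. The species data points and the human neocortex ratio below are illustrative placeholders, not Dunbar's measurements.

```python
import math

# Minimal sketch of a Dunbar-style extrapolation: fit a least-squares line to
# log(group size) vs. log(neocortex ratio), then predict at the human ratio.
# The species data are illustrative placeholders, NOT Dunbar's actual dataset.
primates = [
    # (neocortex ratio, observed mean group size), placeholder values
    (1.2, 4), (1.7, 11), (2.1, 20), (2.6, 38), (3.0, 59), (3.4, 86),
]

xs = [math.log(ratio) for ratio, _ in primates]
ys = [math.log(group) for _, group in primates]
n = len(primates)
mean_x, mean_y = sum(xs) / n, sum(ys) / n

# Ordinary least-squares fit on the log-log scale.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

human_neocortex_ratio = 4.1  # commonly cited figure, treated here as an assumption
prediction = math.exp(intercept + slope * math.log(human_neocortex_ratio))
print(f"Predicted human mean group size: {prediction:.0f}")  # lands near 150
```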

How Science Fiction Writers Can Help, or Hurt, Homeland Security

  • Bruce Schneier
  • Wired
  • June 18, 2009

A couple of years ago, the Department of Homeland Security hired a bunch of science fiction writers to come in for a day and think of ways terrorists could attack America. If our inability to prevent 9/11 marked a failure of imagination, as some said at the time, then who better than science fiction writers to inject a little imagination into counterterrorism planning?

I discounted the exercise at the time, calling it “embarrassing.” I never thought that 9/11 was a failure of imagination. I thought, and still think, that 9/11 was primarily a confluence of three things: the dual failure of centralized coordination and local control within the FBI, and some lucky breaks on the part of the attackers. More imagination leads to more …

The Kindness of Strangers

  • Bruce Schneier
  • The Wall Street Journal
  • March 12, 2009

When I was growing up, children were commonly taught: “Don’t talk to strangers.” Strangers might be bad, we were told, so it’s prudent to steer clear of them.

And yet most people are honest, kind, and generous, especially when someone asks them for help. If a small child is in trouble, the smartest thing he can do is find a nice-looking stranger and talk to him.

These two pieces of advice may seem to contradict each other, but they don’t. The difference is that in the second instance, the child is choosing which stranger to talk to. Given that the overwhelming majority of people will help, the child is likely to get help if he chooses a random stranger. But if a stranger comes up to a child and talks to him or her, it’s not a random choice. It’s more likely, although still unlikely, that the stranger is up to no good…

Does Risk Management Make Sense?

  • Bruce Schneier
  • Information Security
  • October 2008

This essay appeared as the first half of a point-counterpoint with Marcus Ranum. Marcus’s half is here.

We engage in risk management all the time, but it only makes sense if we do it right.

“Risk management” is just a fancy term for the cost-benefit tradeoff associated with any security decision. It’s what we do when we react to fear, or try to make ourselves feel secure. It’s the fight-or-flight reflex that evolved in primitive fish and remains in all vertebrates. It’s instinctual, intuitive and fundamental to life, and one of the brain’s primary functions…
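
As a rough illustration of that tradeoff (not from the essay; every figure below is an invented placeholder), a security decision "done right" compares the expected yearly loss a countermeasure prevents with what the countermeasure costs:

```python
# Illustrative cost-benefit sketch for a single security decision.
# All figures are invented placeholders, not drawn from the essay.

def annualized_loss(single_loss_expectancy, annual_rate_of_occurrence):
    """Expected loss per year from one threat: loss per incident * incidents per year."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical threat: a stolen laptop exposing unencrypted customer data.
ale_without_control = annualized_loss(single_loss_expectancy=200_000,
                                      annual_rate_of_occurrence=0.05)
ale_with_control = annualized_loss(single_loss_expectancy=5_000,
                                   annual_rate_of_occurrence=0.05)
control_cost_per_year = 3_000  # e.g. full-disk encryption licenses and support

net_benefit = (ale_without_control - ale_with_control) - control_cost_per_year
print(f"Expected annual benefit of the control: ${net_benefit:,.0f}")
# Positive means the tradeoff favors the control; negative means the cure
# costs more than the disease, which is where risk management goes wrong.
```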

How the Human Brain Buys Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2008

People tend to be risk-averse when it comes to gains, and risk-seeking when it comes to losses. If you give people a choice between a $500 sure gain and a coin-flip chance of a $1,000 gain, about 75 percent will pick the sure gain. But give people a choice between a $500 sure loss and a coin-flip chance of a $1,000 loss, about 75 percent will pick the coin flip.

People don’t have a standard mathematical model of risk in their heads. Their trade-offs are more subtle, and result from how our brains have developed. A computer might not see the difference between the two choices—it’s simply a measure of how risk-averse you are—but humans do…
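
The two pairs of options are mathematically equivalent, which is why a pure expected-value calculator is indifferent between them. The sketch below is not from the essay; it uses an illustrative Kahneman/Tversky-style value function with assumed parameters to show how that indifference breaks once gains and losses are valued asymmetrically.

```python
# Illustrative sketch only: compares a plain expected-value calculation with a
# toy prospect-theory-style valuation. The exponent and loss-aversion factor
# are assumed parameters, not figures from the essay.

def expected_value(outcomes):
    """outcomes: list of (amount, probability) pairs."""
    return sum(amount * p for amount, p in outcomes)

def prospect_value(outcomes, alpha=0.88, loss_aversion=2.25):
    """Toy value function: concave for gains, convex and steeper for losses."""
    total = 0.0
    for amount, p in outcomes:
        if amount >= 0:
            total += p * (amount ** alpha)
        else:
            total -= p * loss_aversion * ((-amount) ** alpha)
    return total

sure_gain   = [(500, 1.0)]
gamble_gain = [(1000, 0.5), (0, 0.5)]
sure_loss   = [(-500, 1.0)]
gamble_loss = [(-1000, 0.5), (0, 0.5)]

# Expected values are identical within each pair: 500 vs. 500, -500 vs. -500.
print(expected_value(sure_gain), expected_value(gamble_gain))
print(expected_value(sure_loss), expected_value(gamble_loss))

# The asymmetric valuation prefers the sure gain but the loss gamble,
# matching the roughly 75 percent choice pattern described above.
print(prospect_value(sure_gain), prospect_value(gamble_gain))
print(prospect_value(sure_loss), prospect_value(gamble_loss))
```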

The Difference Between Feeling and Reality in Security

  • Bruce Schneier
  • Wired
  • April 3, 2008

Security is both a feeling and a reality, and they’re different. You can feel secure even though you’re not, and you can be secure even though you don’t feel it. There are two different concepts mapped onto the same word—the English language isn’t working very well for us here—and it can be hard to know which one we’re talking about when we use the word.

There is considerable value in separating out the two concepts: in explaining how the two are different, and understanding when we’re referring to one and when the other. There is value as well in recognizing when the two converge, understanding why they diverge, and knowing how they can be made to converge again…

Inside the Twisted Mind of the Security Professional

  • Bruce Schneier
  • Wired
  • March 20, 2008

Uncle Milton Industries has been selling ant farms to children since 1956. Some years ago, I remember opening one up with a friend. There were no actual ants included in the box. Instead, there was a card that you filled in with your address, and the company would mail you some ants. My friend expressed surprise that you could get ants sent to you in the mail.

I replied: “What’s really interesting is that these people will send a tube of live ants to anyone you tell them to.”

Security requires a particular mindset. Security professionals—at least the good ones—see the world differently. They can’t walk into a store without noticing how they might shoplift. They can’t use a computer without wondering about the security vulnerabilities. They can’t vote without trying to figure out how to vote twice. They just can’t help it…
