Essays in the Category "Psychology of Security"
Living in Code Yellow
In 1989, handgun expert Jeff Cooper invented something called the Color Code to describe what he called the ‘combat mind-set.’ Here is his summary:
In White you are unprepared and unready to take lethal action. If you are attacked in White you will probably die unless your adversary is totally inept.
In Yellow you bring yourself to the understanding that your life may be in danger and that you may have to do something about it.
In Orange you have determined upon a specific adversary and are prepared to take action which may result in his death, but you are not in a lethal mode…
Our Decreasing Tolerance To Risk
We’re afraid of risk. It’s a normal part of life, but we’re increasingly unwilling to accept it at any level. So we turn to technology to protect us. The problem is that technological security measures aren’t free. They cost money, of course, but they cost other things as well. They often don’t provide the security they advertise, and—paradoxically—they often increase risk somewhere else. This problem is particularly stark when the risk involves another person: crime, terrorism, and so on. While technology has made us much safer against natural risks like accidents and disease, it works less well against man-made risks…
The Boston Marathon Bombing: Keep Calm and Carry On
It is easy to feel scared and powerless in the wake of attacks like those at the Boston Marathon. But it also plays into the perpetrators' hands.
As the details about the bombings in Boston unfold, it’d be easy to be scared. It’d be easy to feel powerless and demand that our elected leaders do something—anything—to keep us safe.
It’d be easy, but it’d be wrong. We need to be angry and empathize with the victims without being scared. Our fears would play right into the perpetrators’ hands—magnifying the power of their victory in service of whatever goals the still-unidentified group behind the attack may have. We don’t have to be scared, and we’re not powerless. We actually have all the power here, and there’s one thing we can do to render terrorism ineffective: …
On Security Awareness Training
The focus on training obscures the failures of security design
Should companies spend money on security awareness training for their employees? It’s a contentious topic, with respected experts on both sides of the debate. I personally believe that training users in security is generally a waste of time, and that the money can be spent better elsewhere. Moreover, I believe that our industry’s focus on training serves to obscure greater failings in security design.
In order to understand my argument, it’s useful to look at training’s successes and failures. One area where it doesn’t work very well is health. We are forever trying to train people to have healthier lifestyles: eat better, exercise more, whatever. And people are forever ignoring the lessons. One basic reason is psychological: we just aren’t very good at trading off immediate gratification for long-term benefit. A healthier you is an abstract eventuality; sitting in front of the television all afternoon with a McDonald’s Super Monster Meal sounds really good …
Unsafe Security: A Sociologist Aptly Analyzes our Failures in Top-Down Protection
Against Security: How We Go Wrong at Airports, Subways, and Other Sites of Ambiguous Danger, by Harvey Molotch, Princeton University Press, 278 pages, $35.
Security is both a feeling and a reality, and the two are different things. People can feel secure when they’re actually not, and they can be secure even when they believe otherwise.
This discord explains much of what passes for our national discourse on security policy. Security measures often are nothing more than security theater, making people feel safer without actually increasing their protection…
The Importance of Security Engineering
In May, neuroscientist and popular author Sam Harris and I debated the issue of profiling Muslims at airport security. We each wrote essays, then went back and forth on the issue. I don’t recommend reading the entire discussion; we spent 14,000 words talking past each other. But what’s interesting is how our debate illustrates the differences between a security engineer and an intelligent layman. Harris was uninterested in the detailed analysis required to understand a security system and unwilling to accept that security engineering is a specialized discipline with a body of knowledge and relevant expertise. He trusted his intuition…
Drawing the Wrong Lessons from Horrific Events
Horrific events, such as the massacre in Aurora, can be catalysts for social and political change. Sometimes it seems that they’re the only catalyst; recall how drastically our policies toward terrorism changed after 9/11 despite how moribund they were before.
The problem is that fear can cloud our reasoning, causing us to overreact and to overly focus on the specifics. And the key is to steer our desire for change in that time of fear.
Our brains aren’t very good at probability and risk analysis. We tend to exaggerate spectacular, strange and rare events, and downplay ordinary, familiar and common ones. We think rare risks are more common than they are. We fear them more than probability indicates we should…
Empathy and Security
Several independent streams of research seem to have converged on the role of empathy in security. Understanding how empathy works and fails—and how it can be harnessed—could be important as we develop security systems that protect people over computer networks.
Mirror neurons are part of a recently discovered brain system that activates both when an individual does something and when that individual observes someone else doing the same thing. They’re what allow us to “mirror” the behaviors of others, and they seem to play a major role in language acquisition, theory of mind, and empathy…
Our brains are specially designed to deal with cheating in social exchanges. The evolutionary psychology explanation is that we evolved brain heuristics for the social problems that our prehistoric ancestors had to deal with. Once humans became good at cheating, they then had to become good at detecting cheating—otherwise, the social group would fall apart.
Perhaps the most vivid demonstration of this can be seen with variations on what’s known as the Wason selection task, named after the psychologist who first studied it. Back in the 1960s, it was a test of logical reasoning; today, it’s used more as a demonstration of evolutionary psychology. But before we get to the experiment, let’s get into the mathematical background…