Entries Tagged "psychology of security"


Top Secret America on the Post-9/11 Cycle of Fear and Funding

I’m reading Top Secret America: The Rise of the New American Security State, by Dana Priest and William M. Arkin. Both work for The Washington Post. The book talks about the rise of the security-industrial complex in post-9/11 America. This short quote is from Chapter 3:

Such dread was a large part of the post-9/11 decade. A culture of fear had created a culture of spending to control it, which, in turn, had led to a belief that the government had to be able to stop every single plot before it took place, regardless of whether it involved one network of twenty terrorists or one single deranged person. This expectation propelled more spending and even more zero-defect expectations. There were tens of thousands of unsolved murders in the United States by 2010, but few newspapers ever blared this across their front pages or even tried to investigate how their police departments had failed to solve them all over the years. But when it came to terrorism, newspaper and other media outlets amplified each mistake, which amplified the threat, which amplified the fear, which prompted more spending, and on and on and on.

It’s a really good book so far. I recommend it.

EDITED TO ADD (7/13): The project’s website has a lot of interesting information as well.

Posted on June 27, 2012 at 6:35 AM

Economic Analysis of Bank Robberies

Yes, it’s clever:

The basic problem is the average haul from a bank job: for the three-year period, it was only £20,330.50 (~$31,613). And it gets worse, as the average robbery involved 1.6 thieves. So the authors conclude, “The return on an average bank robbery is, frankly, rubbish. It is not unimaginable wealth. It is a very modest £12,706.60 per person per raid.”

“Given that the average UK wage for those in full-time employment is around £26,000, it will give him a modest life-style for no more than 6 months,” the authors note. If a robber keeps hitting banks at a rate sufficient to maintain that modest lifestyle, by a year and a half into their career, odds are better than not they’ll have been caught. “As a profitable occupation, bank robbery leaves a lot to be desired.”

Worse still, the success of a robbery was a bit like winning the lottery, as the standard deviation on the £20,330.50 was £53,510.20. That means some robbers did far better than average, but it also means that fully a third of robberies failed entirely.

(If, at this point, you’re thinking that the UK is just a poor location for the bank robbery industry, think again, as the authors use FBI figures to determine that the average heist in the States only nets $4,330.00.)

There are ways to increase your chance of getting a larger haul. “Every extra member of the gang raises the expected value of the robbery proceeds by £9,033.20, on average and other things being equal,” the authors note. Brandishing some sort of firearm adds another £10,300.50, “again on average and other things being equal.”
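The arithmetic behind the figures quoted above is straightforward, and can be sketched in a few lines. Note that the article rounds the per-person figure to £12,706.60; the raw division gives £12,706.56. The per-raid capture probability used at the end is a hypothetical round value, not a number from the article, chosen only to illustrate how the “caught within a year and a half” claim would follow from a compounding risk.

```python
# Rough arithmetic behind the bank-robbery figures quoted above.
# avg_haul and avg_gang_size are from the article; p is a
# hypothetical assumption for illustration only.

avg_haul = 20330.50   # average haul per UK robbery (GBP)
avg_gang_size = 1.6   # average number of thieves per raid

per_person = avg_haul / avg_gang_size
print(f"Per person per raid: £{per_person:,.2f}")  # ≈ £12,706.56

# Marginal effects reported by the authors (other things being equal):
extra_member = 9033.20    # expected value added per extra gang member
firearm_bonus = 10300.50  # expected value added by brandishing a firearm
print(f"Two-person raid with a firearm, expected: "
      f"£{avg_haul + extra_member + firearm_bonus:,.2f}")

# Cumulative capture risk: if each raid independently carries an
# (assumed) capture probability p, the chance of still being free
# after n raids is (1 - p) ** n.
p = 0.2  # hypothetical per-raid capture probability
n = 1
while (1 - p) ** n > 0.5:
    n += 1
print(f"With p = {p}, odds of capture exceed 50% after {n} raids")
```

With an assumed 20 percent chance of capture per raid, the odds of having been caught pass 50 percent by the fourth raid, which is the kind of compounding that makes repeat robbery such a poor career.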

We all kind of knew this—that’s why most of us aren’t bank robbers. The interesting question, at least to me, is why anyone is a bank robber. Why do people do things that, by any rational economic analysis, are irrational?

The answer is that people are terrible at figuring this sort of stuff out. They’re terrible at estimating the probability that any of their endeavors will succeed, and they’re terrible at estimating what their reward will be if they do succeed. There is a lot of research supporting this, but the most entertaining thing I’ve seen recently on the topic is this TED talk by Daniel Gilbert.

Note the bonus discussion of terrorism at the very end.

EDITED TO ADD (7/14): Bank robbery and the Dunning-Kruger effect.

Posted on June 22, 2012 at 7:20 AM

Honor System Farm Stands

Many roadside farm stands in the U.S. are unstaffed. They work on the honor system: take what you want, and pay what you owe.

And today at his farm stand, Cochran says, just as at the donut shop years ago, most customers leave more money than they owe.

That doesn’t surprise social psychologist Michael Cunningham of the University of Louisville who has used “trust games” to investigate what spurs good and bad behavior for the last 25 years. For many people, Cunningham says, trust seems to be at least as strong a motivator as guilt. He thinks he knows why.

“When you sell me something I want and trust me to pay you even when you’re not looking, you’ve made my life good in two ways,” Cunningham tells The Salt. “I get something delicious, and I also get a good feeling about myself. Both of those things make me feel good about the world, that I’m in a good place. And I also see you as a contributor to that good, as somebody I want to reward. It’s a win win.”

I like systems that leverage personal moral codes for security. But I’ll bet that the pay boxes are bolted to the tables. It’s one thing for someone to take produce without paying. It’s quite another for him to take the entire day’s receipts.

Posted on June 18, 2012 at 6:40 AM

Security and Human Behavior (SHB 2012)

I’m at the Fifth Interdisciplinary Workshop on Security and Human Behavior, SHB 2012. Google is hosting this year, at its offices in lower Manhattan.

SHB is an invitational gathering of psychologists, computer security researchers, behavioral economists, sociologists, law professors, business school professors, political scientists, anthropologists, philosophers, and others—all of whom are studying the human side of security—organized by Alessandro Acquisti, Ross Anderson, and me. It’s not just an interdisciplinary event; most of the people here are individually interdisciplinary.

This is the best and most intellectually stimulating conference I attend all year. I told that to one of the participants yesterday, and he said something like: “Of course it is. You’ve specifically invited everyone you want to listen to.” Which is basically correct. The workshop is organized into panels of 6-7 people. Each panelist gets ten minutes to talk about what he or she is working on, and then we spend the rest of the hour and a half in discussion.

Here is the list of participants. The list contains links to readings from each of them—definitely a good place to browse for more information on this topic. Ross Anderson, who has far more discipline than I, is liveblogging this event. Go to the comments of that blog post to see summaries of the individual sessions.

Here are links to my posts on the first, second, third, and fourth SHB workshops. Follow those links to find summaries, papers, and audio recordings of the workshops.

Posted on June 5, 2012 at 1:16 PM

The Unreliability of Eyewitness Testimony

Interesting article:

The reliability of witness testimony is a vastly complex subject, but legal scholars and forensic psychologists say it’s possible to extract the truth from contradictory accounts and evolving memories. According to Barbara Tversky, professor emerita of psychology at Stanford University, the bottom line is this: “All other things equal, earlier recountings are more likely to be accurate than later ones. The longer the delay, the more likely that subsequent information will get confused with the target memory.”

[…]

Memory is a reconstructive process, says Richard Wise, a forensic psychologist at the University of North Dakota. “When an eyewitness recalls a crime, he or she must reconstruct his or her memory of the crime.” This, he says, is an unconscious process. To reconstruct a memory, the eyewitness draws upon several sources of information, only one being his or her actual recollection.

“To fill in gaps in memory, the eyewitness relies upon his or her expectation, attitudes, prejudices, bias, and prior knowledge. Furthermore, information supplied to an eyewitness after a crime (i.e., post-event information) by the police, prosecutor, other eyewitnesses, media, etc., can alter an eyewitness’s memory of the crime,” Wise said in an email.

That external input is what makes eyewitness testimony so unreliable. Eyewitnesses are generally unaware that their memory has been altered by post-event information, and feel convinced they’re recalling only the incident itself. “Once an eyewitness’s memory of the crime has been altered by post-event information, it is difficult or impossible to restore the eyewitness’s original memory of the crime,” Wise told Life’s Little Mysteries.

Posted on June 4, 2012 at 6:36 AM

The Psychology of Immoral (and Illegal) Behavior

When I talk about Liars and Outliers to security audiences, one of the things I stress is that our traditional security focus—on technical countermeasures—is much narrower than it could be. Leveraging moral, reputational, and institutional pressures is likely to be much more effective in motivating cooperative behavior.

This story illustrates the point. It’s about the psychology of fraud, “why good people do bad things.”

There is, she says, a common misperception that at moments like this, when people face an ethical decision, they clearly understand the choice that they are making.

“We assume that they can see the ethics and are consciously choosing not to behave ethically,” Tenbrunsel says.

This, generally speaking, is the basis of our disapproval: They knew. They chose to do wrong.

But Tenbrunsel says that we are frequently blind to the ethics of a situation.

Over the past couple of decades, psychologists have documented many different ways that our minds fail to see what is directly in front of us. They’ve come up with a concept called “bounded ethicality”: That’s the notion that cognitively, our ability to behave ethically is seriously limited, because we don’t always see the ethical big picture.

One small example: the way a decision is framed. “The way that a decision is presented to me,” says Tenbrunsel, “very much changes the way in which I view that decision, and then eventually, the decision it is that I reach.”

Essentially, Tenbrunsel argues, certain cognitive frames make us blind to the fact that we are confronting an ethical problem at all.

Tenbrunsel told us about a recent experiment that illustrates the problem. She got together two groups of people and told one to think about a business decision. The other group was instructed to think about an ethical decision. Those asked to consider a business decision generated one mental checklist; those asked to think of an ethical decision generated a different mental checklist.

Tenbrunsel next had her subjects do an unrelated task to distract them. Then she presented them with an opportunity to cheat.

Those cognitively primed to think about business behaved radically differently from those who were not—no matter who they were, or what their moral upbringing had been.

“If you’re thinking about a business decision, you are significantly more likely to lie than if you were thinking from an ethical frame,” Tenbrunsel says.

According to Tenbrunsel, the business frame cognitively activates one set of goals—to be competent, to be successful; the ethics frame triggers other goals. And once you’re in, say, a business frame, you become really focused on meeting those goals, and other goals can completely fade from view.

Also:

Typically when we hear about large frauds, we assume the perpetrators were driven by financial incentives. But psychologists and economists say financial incentives don’t fully explain it. They’re interested in another possible explanation: Human beings commit fraud because human beings like each other.

We like to help each other, especially people we identify with. And when we are helping people, we really don’t see what we are doing as unethical.

The article even has some concrete security ideas:

Now if these psychologists and economists are right, if we are all capable of behaving profoundly unethically without realizing it, then our workplaces and regulations are poorly organized. They’re not designed to take into account the cognitively flawed human beings that we are. They don’t attempt to structure things around our weaknesses.

Some concrete proposals to do that are on the table. For example, we know that auditors develop relationships with clients after years of working together, and we know that those relationships can corrupt their audits without them even realizing it. So there is a proposal to force businesses to switch auditors every couple of years to address that problem.

Another suggestion: A sentence should be placed at the beginning of every business contract that explicitly says that lying on this contract is unethical and illegal, because that kind of statement would get people into the proper cognitive frame.

Along similar lines, some years ago Ross Anderson made the suggestion that the webpages of peoples’ online bank accounts should include their photographs, based on the research that it’s harder to commit fraud against someone who you identify with as a person.

Two excellent papers on this topic:

Abstract of the second paper:

Dishonesty plays a large role in the economy. Causes for (dis)honest behavior seem to be based partially on external rewards, and partially on internal rewards. Here, we investigate how such external and internal rewards work in concert to produce (dis)honesty. We propose and test a theory of self-concept maintenance that allows people to engage to some level in dishonest behavior, thereby benefiting from external benefits of dishonesty, while maintaining their positive view about themselves in terms of being honest individuals. The results show that (1) given the opportunity to engage in beneficial dishonesty, people will engage in such behaviors; (2) the amount of dishonesty is largely insensitive to either the expected external benefits or the costs associated with the deceptive acts; (3) people know about their actions but do not update their self-concepts; (4) causing people to become more aware of their internal standards for honesty decreases their tendency for deception; and (5) increasing the “degrees of freedom” that people have to interpret their actions increases their tendency for deception. We suggest that dishonesty governed by self-concept maintenance is likely to be prevalent in the economy, and understanding it has important implications for designing effective methods to curb dishonesty.

Posted on May 30, 2012 at 12:54 PM

The Problem of False Alarms

The context is tornado warnings:

The basic problem, Smith says, is that sirens are sounded too often in most places. Sometimes they sound in an entire county for a warning that covers just a sliver of it; sometimes for other thunderstorm phenomena like large hail and/or strong straight-line winds; and sometimes for false-alarm warnings: warnings for tornadoes that were incorrectly detected.

The residents of Joplin, Smith contends, were numbed by the too-frequent blaring of sirens. As a result of too many past false alarms, he writes: “The citizens of Joplin were unwittingly being trained to NOT act when the sirens sounded.”

Posted on May 30, 2012 at 6:44 AM

The Ubiquity of Cyber-Fears

A new study concludes that more people are worried about cyber threats than terrorism.

…the three highest priorities for Americans when it comes to security issues in the presidential campaign are:

  1. Protecting government computer systems against hackers and criminals (74 percent)
  2. Protecting our electric power grid, water utilities and transportation systems against computer or terrorist attacks (73 percent)
  3. Homeland security issues such as terrorism (68 percent)

Posted on May 24, 2012 at 11:31 AM

Racism as a Vestigial Remnant of a Security Mechanism

“Roots of Racism,” by Elizabeth Culotta in Science:

Our attitudes toward outgroups are part of a threat-detection system that allows us to rapidly determine friend from foe, says psychologist Steven Neuberg of ASU Tempe. The problem, he says, is that like smoke detectors, the system is designed to give many false alarms rather than miss a true threat. So outgroup faces alarm us even when there is no danger.

Lots of interesting stuff in the article. Unfortunately, it requires registration to access.

Posted on May 22, 2012 at 1:10 PM

Rules for Radicals

It was written in 1971, but this still seems like a cool book:

For an elementary illustration of tactics, take parts of your face as the point of reference; your eyes, your ears, and your nose. First the eyes: if you have organized a vast, mass-based people’s organization, you can parade it visibly before the enemy and openly show your power. Second the ears; if your organization is small in numbers, then do what Gideon did: conceal the members in the dark but raise a din and clamor that will make the listener believe that your organization numbers many more than it does. Third, the nose; if your organization is too tiny even for noise, stink up the place.

Always remember the first rule of power tactics: Power is not only what you have but what the enemy thinks you have.

The second rule is: Never go outside the experience of your people. When an action or tactic is outside the experience of the people, the result is confusion, fear, and retreat. It also means a collapse of communication, as we have noted.

The third rule is: Wherever possible go outside the experience of the enemy. Here you want to cause confusion, fear, and retreat.

The fourth rule is: Make the enemy live up to their own book of rules. You can kill them with this, for they can no more obey their own rules than the Christian church can live up to Christianity.

The fourth rule carries within it the fifth rule: Ridicule is man’s most potent weapon. It is almost impossible to counterattack ridicule. Also it infuriates the opposition, who then react to your advantage.

The sixth rule is: A good tactic is one that your people enjoy. If your people are not having a ball doing it, there is something very wrong with the tactic.

The seventh rule: A tactic that drags on too long becomes a drag.

[…]

The twelfth rule: The price of a successful attack is a constructive alternative. You cannot risk being trapped by the enemy in his sudden agreement with your demand and saying “You’re right—we don’t know what to do about this issue. Now you tell us.”

The thirteenth rule: Pick the target, freeze it, personalize it, and polarize it.

Posted on May 17, 2012 at 7:20 AM

