The Psychology of Immoral (and Illegal) Behavior
When I talk about Liars and Outliers to security audiences, one of the things I stress is that our traditional security focus, on technical countermeasures, is much narrower than it could be. Leveraging moral, reputational, and institutional pressures is likely to be much more effective in motivating cooperative behavior.
This story illustrates the point. It’s about the psychology of fraud, “why good people do bad things.”
There is, she says, a common misperception that at moments like this, when people face an ethical decision, they clearly understand the choice that they are making.
“We assume that they can see the ethics and are consciously choosing not to behave ethically,” Tenbrunsel says.
This, generally speaking, is the basis of our disapproval: They knew. They chose to do wrong.
But Tenbrunsel says that we are frequently blind to the ethics of a situation.
Over the past couple of decades, psychologists have documented many different ways that our minds fail to see what is directly in front of us. They’ve come up with a concept called “bounded ethicality”: That’s the notion that cognitively, our ability to behave ethically is seriously limited, because we don’t always see the ethical big picture.
One small example: the way a decision is framed. “The way that a decision is presented to me,” says Tenbrunsel, “very much changes the way in which I view that decision, and then eventually, the decision it is that I reach.”
Essentially, Tenbrunsel argues, certain cognitive frames make us blind to the fact that we are confronting an ethical problem at all.
Tenbrunsel told us about a recent experiment that illustrates the problem. She got together two groups of people and told one to think about a business decision. The other group was instructed to think about an ethical decision. Those asked to consider a business decision generated one mental checklist; those asked to think of an ethical decision generated a different mental checklist.
Tenbrunsel next had her subjects do an unrelated task to distract them. Then she presented them with an opportunity to cheat.
Those cognitively primed to think about business behaved radically differently from those who were not—no matter who they were, or what their moral upbringing had been.
“If you’re thinking about a business decision, you are significantly more likely to lie than if you were thinking from an ethical frame,” Tenbrunsel says.
According to Tenbrunsel, the business frame cognitively activates one set of goals—to be competent, to be successful; the ethics frame triggers other goals. And once you’re in, say, a business frame, you become really focused on meeting those goals, and other goals can completely fade from view.
Typically when we hear about large frauds, we assume the perpetrators were driven by financial incentives. But psychologists and economists say financial incentives don’t fully explain it. They’re interested in another possible explanation: Human beings commit fraud because human beings like each other.
We like to help each other, especially people we identify with. And when we are helping people, we really don’t see what we are doing as unethical.
The article even has some concrete security ideas:
Now if these psychologists and economists are right, if we are all capable of behaving profoundly unethically without realizing it, then our workplaces and regulations are poorly organized. They’re not designed to take into account the cognitively flawed human beings that we are. They don’t attempt to structure things around our weaknesses.
Some concrete proposals to do that are on the table. For example, we know that auditors develop relationships with clients after years of working together, and we know that those relationships can corrupt their audits without them even realizing it. So there is a proposal to force businesses to switch auditors every couple of years to address that problem.
Another suggestion: A sentence should be placed at the beginning of every business contract that explicitly says that lying on this contract is unethical and illegal, because that kind of statement would get people into the proper cognitive frame.
Along similar lines, some years ago Ross Anderson suggested that the webpages of people’s online bank accounts should include their photographs, based on research showing that it’s harder to commit fraud against someone you identify with as a person.
Two excellent papers on this topic:
- Nina Mazar and Dan Ariely, “Dishonesty in Everyday Life and its Policy Implications,” Journal of Public Policy and Marketing, 2006, vol. 25, No. 1: 117-126.
- Nina Mazar, On Amir, and Dan Ariely, “The Dishonesty of Honest People: A Theory of Self-Concept Maintenance,” Journal of Marketing Research, 2008, vol. 45: 633-644.
Abstract of the second paper:
Dishonesty plays a large role in the economy. Causes for (dis)honest behavior seem to be based partially on external rewards, and partially on internal rewards. Here, we investigate how such external and internal rewards work in concert to produce (dis)honesty. We propose and test a theory of self-concept maintenance that allows people to engage to some level in dishonest behavior, thereby benefiting from external benefits of dishonesty, while maintaining their positive view about themselves in terms of being honest individuals. The results show that (1) given the opportunity to engage in beneficial dishonesty, people will engage in such behaviors; (2) the amount of dishonesty is largely insensitive to either the expected external benefits or the costs associated with the deceptive acts; (3) people know about their actions but do not update their self-concepts; (4) causing people to become more aware of their internal standards for honesty decreases their tendency for deception; and (5) increasing the “degrees of freedom” that people have to interpret their actions increases their tendency for deception. We suggest that dishonesty governed by self-concept maintenance is likely to be prevalent in the economy, and understanding it has important implications for designing effective methods to curb dishonesty.