Dan Ariely on Dishonesty
Good talk, and I’ve always liked these animators.
Shelly C. McArdle, Heather Rosoff, Richard S. John (2012), “The Dynamics of Evolving Beliefs, Concerns, Emotions, and Behavioral Avoidance Following 9/11: A Longitudinal Analysis of Representative Archival Samples,” Risk Analysis v. 32, pp. 744–761.
Abstract: September 11 created a natural experiment that enables us to track the psychological effects of a large-scale terror event over time. The archival data came from 8,070 participants of 10 ABC and CBS News polls collected from September 2001 until September 2006. Six questions investigated emotional, behavioral, and cognitive responses to the events of September 11 over a five-year period. We found that heightened responses after September 11 dissipated and reached a plateau at various points in time over a five-year period. We also found that emotional, cognitive, and behavioral reactions were moderated by age, sex, political affiliation, and proximity to the attack. Both emotional and behavioral responses returned to a normal state after one year, whereas cognitively-based perceptions of risk were still diminishing as late as September 2006. These results provide insight into how individuals will perceive and respond to future similar attacks.
In May, neuroscientist and popular author Sam Harris and I debated the issue of profiling Muslims at airport security. We each wrote essays, then went back and forth on the issue. I don’t recommend reading the entire discussion; we spent 14,000 words talking past each other. But what’s interesting is how our debate illustrates the differences between a security engineer and an intelligent layman. Harris was uninterested in the detailed analysis required to understand a security system and unwilling to accept that security engineering is a specialized discipline with a body of knowledge and relevant expertise. He trusted his intuition.
Many people have researched how intuition fails us in security: Paul Slovic and Bill Burns on risk perception, Daniel Kahneman on cognitive biases in general, Rick Wash on folk computer-security models. I’ve written about the psychology of security, and Daniel Gardner has written more. Basically, our intuitions are based on things like antiquated fight-or-flight models, and these increasingly fail in our technological world.
This problem isn’t unique to computer security, or even security in general. But this misperception about security matters now more than it ever has. We’re no longer asking people to make security choices only for themselves and their businesses; we need them to make security choices as a matter of public policy. And getting it wrong has increasingly bad consequences.
Computers and the Internet have collided with public policy. The entertainment industry wants to enforce copyright. Internet companies want to continue freely spying on users. Law-enforcement wants its own laws imposed on the Internet: laws that make surveillance easier, prohibit anonymity, mandate the removal of objectionable images and texts, and require ISPs to retain data about their customers’ Internet activities. Militaries want laws regarding cyber weapons, laws enabling wholesale surveillance, and laws mandating an Internet kill switch. “Security” is now a catch-all excuse for all sorts of authoritarianism, as well as for boondoggles and corporate profiteering.
Cory Doctorow recently spoke about the coming war on general-purpose computing. I talked about it in terms of the entertainment industry and Jonathan Zittrain discussed it more generally, but Doctorow sees it as a much broader issue. Preventing people from copying digital files is only the first skirmish; just wait until the DEA wants to prevent chemical printers from making certain drugs, or the FBI wants to prevent 3D printers from making guns.
I’m not here to debate the merits of any of these policies, but instead to point out that people will debate them. Elected officials will be expected to understand security implications, both good and bad, and will make laws based on that understanding. And if they aren’t able to understand security engineering, or even accept that there is such a thing, the result will be ineffective and harmful policies.
So what do we do? We need to establish security engineering as a valid profession in the minds of the public and policy makers. This is less about certifications and (heaven forbid) licensing, and more about perception—and cultivating a security mindset. Amateurs produce amateur security, which costs more in dollars, time, liberty, and dignity while giving us less—or even no—security. We need everyone to know that.
We also need to engage with real-world security problems, and apply our expertise to the variety of technical and socio-technical systems that affect broader society. Everything involves computers, and almost everything involves the Internet. More and more, computer security is security.
Finally, and perhaps most importantly, we need to learn how to talk about security engineering to a non-technical audience. We need to convince policy makers to follow a logical approach instead of an emotional one—an approach that includes threat modeling, failure analysis, searching for unintended consequences, and everything else in an engineer’s approach to design. Powerful lobbying forces are attempting to force security policies on society, largely for non-security reasons, and sometimes in secret. We need to stand up for security.
A shorter version of this essay appeared in the September/October 2012 issue of IEEE Security & Privacy.
Interesting anecdote from World War II.
Nice post:
The screaming fear in your stomach before you give a speech to 12 kids in the fifth grade is precisely the same fear a presidential candidate feels before the final debate. The fight-or-flight reflex that speeds up your heart when you’re about to get a speeding ticket you don’t deserve isn’t very different than the chemical reaction in the brain of an accused (but innocent) murder suspect when the jury walks in.
Bigger stakes can’t lead to more fear.
And, in an interesting glitch, more fear often tricks us into thinking we’re dealing with bigger stakes.
1. Probability neglect – people sometimes don’t consider the probability of the occurrence of an outcome, but focus on the consequences only.
2. Consequence neglect – just like probability neglect, sometimes individuals neglect the magnitude of outcomes.
3. Statistical neglect – instead of subjectively assessing small probabilities and continuously updating them, people choose to use rules-of-thumb (if any heuristics), which can introduce systematic biases in their decisions.
4. Solution neglect – choosing an optimal solution is not possible when one fails to consider all of the solutions.
5. External risk neglect – in making decisions, individuals or groups often consider the cost/benefits of decisions only for themselves, without including externalities, sometimes leading to significant negative outcomes for others.
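To make the contrast between the first two neglects concrete, here is a minimal sketch in Python. The numbers are invented purely for illustration; they come from no cited study. A neutral baseline weighs probability and consequence together as expected loss, while probability neglect ranks risks by consequence alone and consequence neglect ranks them by probability alone:

```python
# Hypothetical illustration of probability neglect vs. consequence neglect.
# All figures are made up for demonstration purposes.

risks = {
    # name: (annual probability, cost if the event occurs)
    "spectacular_rare": (1e-6, 10_000_000),
    "mundane_common":   (1e-2, 5_000),
}

def expected_loss(prob, cost):
    """Neutral baseline: probability times consequence."""
    return prob * cost

# Probability neglect: rank by consequence only (the scary number).
by_consequence = max(risks, key=lambda k: risks[k][1])

# Consequence neglect: rank by probability only (the frequent event).
by_probability = max(risks, key=lambda k: risks[k][0])

# Expected-loss analysis weighs both.
by_expectation = max(risks, key=lambda k: expected_loss(*risks[k]))

print(by_consequence)   # the rare catastrophe feels like the bigger risk
print(by_expectation)   # but the mundane risk carries the larger expected loss
```

With these illustrative numbers, the spectacular risk has an expected annual loss of 10 while the mundane one has an expected loss of 50, so the two rankings disagree: exactly the kind of systematic bias the list above describes.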
In the short story “A Wayside Comedy,” published in 1888 in Under the Deodars, Kipling wrote:
You must remember, though you will not understand, that all laws weaken in a small and hidden community where there is no public opinion. When a man is absolutely alone in a Station he runs a certain risk of falling into evil ways. This risk is multiplied by every addition to the population up to twelve—the Jury number. After that, fear and consequent restraint begin, and human action becomes less grotesquely jerky.
Interesting commentary on how reputational pressure scales. If I had found this quote last year, I would have included it in my book.
In Liars and Outliers, I talk a lot about social norms and when people follow them. This research uses survival data from shipwrecks to measure it.
The authors argue that shipwrecks can actually tell us a fair bit about human behavior, since everyone stuck on a sinking ship has to do a bit of cost-benefit analysis. People will weigh their options—which will generally involve helping others at great risk to themselves—amidst a backdrop of social norms and, at least in case of the Titanic, direct orders from authority figures. “This cost-benefit logic is fundamental in economic models of human behavior,” the authors write, suggesting that a shipwreck could provide a real-world test of ideas derived from controlled experiments.
Eight ideas, to be precise. That’s how many hypotheses the authors lay out, ranging from “women have a survival advantage in shipwrecks” to “women are more likely to survive on British ships, given the UK’s strong sense of gentility.” They tested them using a database of ship sinkings that encompasses over 15,000 passengers and crew, and provides information on everything from age and sex to whether the passenger had a first-class ticket.
For the most part, the lessons provided by the Titanic simply don’t hold. Excluding the two disasters mentioned above, crew members had a survival rate of over 60 percent, far higher than any other group analyzed. (Although they didn’t consistently survive well—in about half the wrecks, there was no statistical difference between crew and passengers). Rather than going down with the ship, captains ended up coming in second, with just under half surviving. The authors offer a number of plausible reasons for crew survival, including better fitness, a thorough knowledge of the ship that’s sinking, and better training for how to handle emergencies. In any case, however, they’re not clearly or consistently sacrificing themselves to save their passengers.
At the other end of the spectrum, nearly half the children on the Titanic survived, but figures for the rest of the shipwrecks were down near 15 percent. About a quarter of women survived other sinkings, but roughly three times that made it through the Titanic alive. If you exclude the Titanic, female survival was 18 percent, or about half the rate at which males came through alive.
What about social factors? Having the captain order “women and children first” did boost female survival, but only by about 10 percentage points. Most of the other ideas didn’t pan out. For example, the speed of sinking, which might give the crew more time to get vulnerable passengers off first, made no difference whatsoever to female survival. Neither did the length of voyage, which might give passengers more time to get to know both the boat and each other. The fraction of passengers that were female didn’t seem to make a difference either.
One social factor that did play a role was price of ticket: “there is a class gradient in survival benefitting first class passengers.” Another is being on a British ship, where (except on the Titanic) women actually had lower rates of survival.
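The claim that “in about half the wrecks, there was no statistical difference between crew and passengers” rests on exactly this kind of comparison. Here is a hedged sketch of a standard two-proportion z-test, using invented counts rather than the paper’s actual database, to show how one would test whether two groups survive at different rates:

```python
import math

def two_proportion_z(s1, n1, s2, n2):
    """Two-proportion z-test: do groups 1 and 2 survive at different rates?

    s1, s2 -- number of survivors in each group
    n1, n2 -- total size of each group
    """
    p1, p2 = s1 / n1, s2 / n2
    pooled = (s1 + s2) / (n1 + n2)  # pooled survival proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Invented counts for illustration only: 120 of 200 crew survive (60%),
# 300 of 1,000 passengers survive (30%).
z = two_proportion_z(120, 200, 300, 1000)
print(z)
```

A |z| above roughly 1.96 indicates a difference significant at the 5% level; with the illustrative counts above, the gap between 60% and 30% is overwhelmingly significant, whereas two rates drawn from small, similar groups typically would not be.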
Paper here (behind a paywall):
Abstract: Since the sinking of the Titanic, there has been a widespread belief that the social norm of “women and children first” (WCF) gives women a survival advantage over men in maritime disasters, and that captains and crew members give priority to passengers. We analyze a database of 18 maritime disasters spanning three centuries, covering the fate of over 15,000 individuals of more than 30 nationalities. Our results provide a unique picture of maritime disasters. Women have a distinct survival disadvantage compared with men. Captains and crew survive at a significantly higher rate than passengers. We also find that: the captain has the power to enforce normative behavior; there seems to be no association between duration of a disaster and the impact of social norms; women fare no better when they constitute a small share of the ship’s complement; the length of the voyage before the disaster appears to have no impact on women’s relative survival rate; the sex gap in survival rates has declined since World War I; and women have a larger disadvantage in British shipwrecks. Taken together, our findings show that human behavior in life-and-death situations is best captured by the expression “every man for himself.”
Horrific events, such as the massacre in Aurora, can be catalysts for social and political change. Sometimes it seems that they’re the only catalyst; recall how drastically our policies toward terrorism changed after 9/11 despite how moribund they were before.
The problem is that fear can cloud our reasoning, causing us to overreact and to overly focus on the specifics. And the key is to steer our desire for change in that time of fear.
Our brains aren’t very good at probability and risk analysis. We tend to exaggerate spectacular, strange and rare events, and downplay ordinary, familiar and common ones. We think rare risks are more common than they are. We fear them more than probability indicates we should.
There is a lot of psychological research that tries to explain this, but one of the key findings is this: People tend to base risk analysis more on stories than on data. Stories engage us at a much more visceral level, especially stories that are vivid, exciting or personally involving.
If a friend tells you about getting mugged in a foreign country, that story is more likely to affect how safe you feel traveling to that country than reading a page of abstract crime statistics will.
Novelty plus dread plus a good story equals overreaction.
And who are the major storytellers these days? Television and the Internet. So when news programs and sites endlessly repeat the story from Aurora, with interviews with those in the theater, interviews with the families, and commentary by anyone who has a point to make, we start to think this is something to fear, rather than a rare event that almost never happens and isn’t worth worrying about. In other words, reading five stories about the same event feels somewhat like five separate events, and that skews our perceptions.
We see the effects of this all the time.
We fear being murdered, kidnapped, raped, and assaulted by strangers, when it’s far more likely that any perpetrator of such offenses is a relative or a friend. We worry about airplane crashes and rampaging shooters instead of automobile crashes and domestic violence—both of which are far more common and far, far more deadly.
Our greatest recent overreaction to a rare event was our response to the terrorist attacks of 9/11. I remember then-Attorney General John Ashcroft giving a speech in Minnesota—where I live—in 2003 in which he claimed that the fact there were no new terrorist attacks since 9/11 was proof that his policies were working. I remember thinking: “There were no terrorist attacks in the two years preceding 9/11, and you didn’t have any policies. What does that prove?”
What it proves is that terrorist attacks are very rare, and perhaps our national response wasn’t worth the enormous expense, loss of liberty, attacks on our Constitution and damage to our credibility on the world stage. Still, overreacting was the natural thing for us to do. Yes, it was security theater and not real security, but it made many of us feel safer.
The rarity of events such as the Aurora massacre doesn’t mean we should ignore any lessons it might teach us. Because people overreact to rare events, they’re useful catalysts for social introspection and policy change. The key here is to focus not on the details of the particular event but on the broader issues common to all similar events.
Installing metal detectors at movie theaters doesn’t make sense—there’s no reason to think the next crazy gunman will choose a movie theater as his venue, and how effectively would a metal detector deter a lone gunman anyway?—but understanding the reasons why the United States has so many gun deaths compared with other countries does. The particular motivations of alleged killer James Holmes aren’t relevant—the next gunman will have different motivations—but the general state of mental health care in the United States is.
Even with this, the most important lesson of the Aurora massacre is how rare these events actually are. Our brains are primed to believe that movie theaters are more dangerous than they used to be, but they’re not. The riskiest part of the evening is still the car ride to and from the movie theater, and even that’s very safe.
But wear a seat belt all the same.
This essay previously appeared on CNN.com, and is an update of this essay.
EDITED TO ADD: I almost added that Holmes wouldn’t have been stopped by a metal detector. He walked into the theater unarmed and left through a back door, which he propped open so he could return armed. And while there was talk about installing metal detectors in movie theaters, I have not heard of any theater actually doing so. But AMC movie theaters have announced a “no masks or costumes policy” as a security measure.