Entries Tagged "trust"


Kip Hawley Reviews Liars and Outliers

In his blog:

I think the most important security issues going forward center around identity and trust. Before knowing I would soon encounter Bruce again in the media, I bought and read his new book Liars & Outliers and it is a must-read book for people looking forward into our security future and thinking about where this all leads. For my colleagues inside the government working the various identity management, security clearance, and risk-based-security issues, L&O should be required reading.

[…]

L&O is fresh thinking about live-fire issues of today as well as moral issues that are ahead. Whatever your policy bent, this book will help you. Trust me on this, you don’t have to buy everything Bruce says about TSA to read this book, take it to work, put it down on the table and say, “this is brilliant stuff.”

I’m hosting Kip Hawley on FireDogLake’s Book Salon on Sunday at 5:00 – 7:00 PM EDT. Join me and we’ll ask him some tough questions about his new book.

Posted on May 18, 2012 at 6:06 AM

Fear and the Attention Economy

danah boyd is thinking about—in a draft essay, and as a recording of a presentation—fear and the attention economy. Basically, she is making the argument that the attention economy magnifies the culture of fear because fear is a good way to get attention, and that this is being made worse by the rise of social media.

A lot of this isn’t new. Fear has been used to sell products (I’ve written about that here) and policy (“Remember the Maine!” “Remember the Alamo!” “Remember 9/11!”) since forever. Newspapers have used fear to attract readers since there were readers. Long before there were child predators on the Internet, irrational panics swept society. Shark attacks in the 1970s. Marijuana in the 1950s. boyd relates a story from Glassner’s The Culture of Fear about elderly women being mugged in the 1990s.

These fears have largely been driven from the top down: from political leaders, from the news media. What’s new today—and I agree this is very interesting—is that in addition to these traditional top-down fears, we’re also seeing fears come from the bottom up. Social media are allowing all of us to sow fear and, because fear gets attention, are enticing us to do so. Rather than fostering empathy and bringing us all together, social media might be pushing us further apart.

A lot of this is related to my own writing about trust. Fear causes us to mistrust a group we’re fearful of, and to more strongly trust the group we’re a part of. It’s natural, and it can be manipulated. It can be amplified, and it can be dampened. How social media are both enabling and undermining trust is a really important thing for us to understand. As boyd says: “What we design and how we design it matters. And how our systems are used also matters, even if those uses aren’t what we intended.”

Posted on April 25, 2012 at 6:51 AM

Amazing Round of "Split or Steal"

In Liars and Outliers, I use the metaphor of the Prisoner’s Dilemma to exemplify the conflict between group interest and self-interest. There are a gazillion academic papers on the Prisoner’s Dilemma from a good dozen different academic disciplines, but the weirdest dataset on real people playing the game is from a British game show called Golden Balls.

In the final round of the game, called “Split or Steal,” two contestants play a one-shot Prisoner’s Dilemma—technically, it’s a variant—choosing to either cooperate (and split a jackpot) or defect (and try to steal it). If one steals and the other splits, the stealer gets the whole jackpot. And, of course, if both contestants steal then both end up with nothing. There are lots of videos from the show on YouTube. (There are even two papers that analyze data from the game.) The videos are interesting to watch, not just to see how players cooperate and defect, but to watch their conversation beforehand and their reactions afterwards. I wrote a few paragraphs about this game for Liars and Outliers, but I ended up deleting them.
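
The payoff structure described above can be sketched in a few lines of Python. This is my own toy model of the game, not anything from the book or the show; the jackpot amount is an arbitrary example.

```python
# A minimal sketch of the "Split or Steal" payoff structure.
# The jackpot value is illustrative; actual jackpots vary per episode.

JACKPOT = 100_000

def payoffs(choice_a: str, choice_b: str, jackpot: float = JACKPOT) -> tuple:
    """Return (payoff_a, payoff_b) for choices 'split' or 'steal'."""
    if choice_a == "split" and choice_b == "split":
        return (jackpot / 2, jackpot / 2)   # both cooperate: share the pot
    if choice_a == "steal" and choice_b == "split":
        return (jackpot, 0)                  # lone stealer takes everything
    if choice_a == "split" and choice_b == "steal":
        return (0, jackpot)
    return (0, 0)                            # both steal: nobody wins
```

The sketch makes the "variant" point concrete: if your opponent steals, your own choice doesn’t change your payoff (you get nothing either way), so stealing only weakly dominates splitting, unlike the strict dominance of the classic Prisoner’s Dilemma.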

This is the weirdest, most surreal round of “Split or Steal” I have ever seen. The more I think about the psychology of it, the more interesting it is. I’ll save my comments for the comments, because I want you to watch it before I say more. Really.

For consistency’s sake in the comments, here are their names. The man on the left is Ibrahim, and the man on the right is Nick.

EDITED TO ADD (5/14): Economic analysis of the episode.

Posted on April 24, 2012 at 6:43 AM

A Heathrow Airport Story about Trousers

Usually I don’t bother posting random stories about dumb or inconsistent airport security measures. But this one is particularly interesting:

“Sir, your trousers.”

“Pardon?”

“Sir, please take your trousers off.”

A pause.

“No.”

“No?”

The security official clearly was not expecting that response.

He begins to look like he doesn’t know what to do, bless him.

“You have no power to require me to do that. You also haven’t given any good reason. I am sure any genuine security concerns you have can be addressed in other ways. You do not need to invade my privacy in this manner.”

A pause.

“I think you probably need to get your manager, don’t you?” I am trying to be helpful.

As I said in my Economist essay, “At this point, we don’t trust America’s TSA, Britain’s Department for Transport, or airport security in general.” We don’t trust that, when they tell us to do something and claim it’s essential for security, they’re telling the truth.

Posted on April 11, 2012 at 9:57 AM

Lost Smart Phones and Human Nature

Symantec deliberately “lost” a bunch of smart phones with tracking software on them, just to see what would happen:

Some 43 percent of finders clicked on an app labeled “online banking.” And 53 percent clicked on a file named “HR salaries.” A file named “saved passwords” was opened by 57 percent of finders. Social networking tools and personal e-mail were checked by 60 percent. And a folder labeled “private photos” tempted 72 percent.

Collectively, 89 percent of finders clicked on something they probably shouldn’t have.

Meanwhile, only 50 percent of finders offered to return the gadgets, even though the owner’s name was listed clearly within the contacts file.

[…]

Some might consider the 50 percent return rate a victory for humanity, but that wasn’t really the point of Symantec’s project. The firm wanted to see if—even among what seem to be honest people—the urge to peek into someone’s personal data was just too strong to resist. It was.

EDITED TO ADD (4/13): Original study.

Posted on April 4, 2012 at 6:07 AM

Harms of Post-9/11 Airline Security

As I posted previously, I have been debating former TSA Administrator Kip Hawley on the Economist website. I didn’t bother reposting my opening statement and rebuttal, because—even though I thought I did a really good job with them—they were largely things I’ve said before. In my closing statement, I talked about specific harms post-9/11 airport security has caused. This is mostly new, so here it is, British spelling and punctuation and all.


In my previous two statements, I made two basic arguments about post-9/11 airport security. One, we are not doing the right things: the focus on airports at the expense of the broader threat is not making us safer. And two, the things we are doing are wrong: the specific security measures put in place since 9/11 do not work. Kip Hawley doesn’t argue with the specifics of my criticisms, but instead provides anecdotes and asks us to trust that airport security—and the Transportation Security Administration (TSA) in particular—knows what it’s doing.

He wants us to trust that a 400-ml bottle of liquid is dangerous, but transferring it to four 100-ml bottles magically makes it safe. He wants us to trust that the butter knives given to first-class passengers are nevertheless too dangerous to be taken through a security checkpoint. He wants us to trust the no-fly list: 21,000 people so dangerous they’re not allowed to fly, yet so innocent they can’t be arrested. He wants us to trust that the deployment of expensive full-body scanners has nothing to do with the fact that the former secretary of homeland security, Michael Chertoff, lobbies for one of the companies that makes them. He wants us to trust that there’s a reason to confiscate a cupcake (Las Vegas), a 3-inch plastic toy gun (London Gatwick), a purse with an embroidered gun on it (Norfolk, VA), a T-shirt with a picture of a gun on it (London Heathrow) and a plastic lightsaber that’s really a flashlight with a long cone on top (Dallas/Fort Worth).

At this point, we don’t trust America’s TSA, Britain’s Department for Transport, or airport security in general. We don’t believe they’re acting in the best interests of passengers. We suspect their actions are the result of politicians and government appointees making decisions based on their concerns about the security of their own careers if they don’t act tough on terror, and capitulating to public demands that “something must be done”.

In this final statement, I promised to discuss the broader societal harms of post-9/11 airport security. This loss of trust—in both airport security and counterterrorism policies in general—is the first harm. Trust is fundamental to society. There is an enormous amount written about this; high-trust societies are simply happier and more prosperous than low-trust societies. Trust is essential for both free markets and democracy. This is why open-government laws are so important; trust requires government transparency. The secret policies implemented by airport security harm society because of their very secrecy.

The humiliation, the dehumanisation and the privacy violations are also harms. That Mr Hawley dismisses these as mere “costs in convenience” demonstrates how out of touch the TSA is with the people it claims to be protecting. Additionally, there’s actual physical harm: the radiation from full-body scanners still not publicly tested for safety; and the mental harm suffered by both abuse survivors and children: the things screeners tell them as they touch their bodies are uncomfortably similar to what child molesters say.

In 2004, the average extra waiting time due to TSA procedures was 19.5 minutes per person. That’s a total economic loss—in America—of $10 billion per year, more than the TSA’s entire budget. The increase in automobile deaths due to people deciding to drive instead of fly is 500 per year. Both of these numbers are for America only, and by themselves demonstrate that post-9/11 airport security has done more harm than good.
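
As a back-of-envelope check of that $10 billion figure: the passenger count and value-of-time below are my own illustrative assumptions, not numbers from the study the essay draws on, but one plausible set of inputs lands in the right neighborhood.

```python
# Back-of-envelope check of the ~$10 billion/year economic-loss figure.
# Both inputs below are assumptions for illustration only.

enplanements_per_year = 700e6     # assumed US passenger enplanements, mid-2000s
extra_wait_hours = 19.5 / 60      # 19.5 minutes of extra waiting per passenger
value_of_time_per_hour = 45       # assumed dollars per passenger-hour

loss = enplanements_per_year * extra_wait_hours * value_of_time_per_hour
print(f"${loss / 1e9:.1f} billion per year")  # → $10.2 billion per year
```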

The current TSA measures create an even greater harm: loss of liberty. Airports are effectively rights-free zones. Security officers have enormous power over you as a passenger. You have limited rights to refuse a search. Your possessions can be confiscated. You cannot make jokes, or wear clothing, that airport security does not approve of. You cannot travel anonymously. (Remember when we would mock Soviet-style “show me your papers” societies? That we’ve become inured to the very practice is a harm.) And if you’re on a certain secret list, you cannot fly, and you enter a Kafkaesque world where you cannot face your accuser, protest your innocence, clear your name, or even get confirmation from the government that someone, somewhere, has judged you guilty. These police powers would be illegal anywhere but in an airport, and we are all harmed—individually and collectively—by their existence.

In his first statement, Mr Hawley related a quote predicting “blood running in the aisles” if small scissors and tools were allowed on planes. That was said by Corey Caldwell, an Association of Flight Attendants spokesman, in 2005. It was not the statement of someone who is thinking rationally about airport security; it was the voice of irrational fear.

Increased fear is the final harm, and its effects are both emotional and physical. By sowing mistrust, by stripping us of our privacy—and in many cases our dignity—by taking away our rights, by subjecting us to arbitrary and irrational rules, and by constantly reminding us that this is the only thing between us and death by the hands of terrorists, the TSA and its ilk are sowing fear. And by doing so, they are playing directly into the terrorists’ hands.

The goal of terrorism is not to crash planes, or even to kill people; the goal of terrorism is to cause terror. Liquid bombs, PETN, planes as missiles: these are all tactics designed to cause terror by killing innocents. But terrorists can only do so much. They cannot take away our freedoms. They cannot reduce our liberties. They cannot, by themselves, cause that much terror. It’s our reaction to terrorism that determines whether or not their actions are ultimately successful. That we allow governments to do these things to us—to effectively do the terrorists’ job for them—is the greatest harm of all.

Return airport security checkpoints to pre-9/11 levels. Get rid of everything that isn’t needed to protect against random amateur terrorists and won’t work against professional al-Qaeda plots. Take the savings thus earned and invest them in investigation, intelligence, and emergency response: security outside the airport, security that does not require us to play guessing games about plots. Recognise that 100% safety is impossible, and also that terrorism is not an “existential threat” to our way of life. Respond to terrorism not with fear but with indomitability. Refuse to be terrorized.

EDITED TO ADD (3/20): Cory Doctorow on the exchange:

All of Hawley’s best arguments sum up to “Someone somewhere did something bad, and if he’d tried it on us, we would have caught him.” His closing clincher? They heard a bad guy was getting on a plane somewhere. They figured out which plane, stopped it from taking off and “resolved” the situation. Seeing as there were no recent reports of foiled terrorist plots, I’m guessing the “resolution” was “it turned out we made a mistake.” But Hawley’s takeaway is: “look at how fast our mistake was!”

EDITED TO ADD (4/19): German translation of the closing statement.

Posted on March 29, 2012 at 6:53 AM

Liars and Outliers: The Big Idea

My big idea is a big question. Every cooperative system contains parasites. How do we ensure that society’s parasites don’t destroy society’s systems?

It’s all about trust, really. Not the intimate trust we have in our close friends and relatives, but the more impersonal trust we have in the various people and systems we interact with in society. I trust airline pilots, hotel clerks, ATMs, restaurant kitchens, and the company that built the computer I’m writing this short essay on. I trust that they have acted and will act in the ways I expect them to. This type of trust is more a matter of consistency or predictability than of intimacy.

Of course, all of these systems contain parasites. Most people are naturally trustworthy, but some are not. There are hotel clerks who will steal your credit card information. There are ATMs that have been hacked by criminals. Some restaurant kitchens serve tainted food. There was even an airline pilot who deliberately crashed his Boeing 767 into the Atlantic Ocean in 1999.

My central metaphor is the Prisoner’s Dilemma, which nicely exposes the tension between group interest and self-interest. And the dilemma even gives us a terminology to use: cooperators act in the group interest, and defectors act in their own selfish interest, to the detriment of the group. Too many defectors, and everyone suffers—often catastrophically.
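
The "too many defectors, and everyone suffers" dynamic can be made concrete with a toy public-goods model. This is my own illustrative sketch, not anything from the book: cooperators pay a cost that produces a benefit shared by everyone, and defectors free-ride.

```python
# Toy public-goods model (illustrative assumptions throughout):
# a fraction of the population cooperates, paying `cost` to produce
# `benefit` that is shared by all. Defectors pay nothing but still
# collect the shared benefit.

def average_payoff(defector_fraction: float,
                   benefit: float = 3.0, cost: float = 1.0) -> float:
    coop = 1.0 - defector_fraction
    public_good = coop * benefit      # shared benefit scales with cooperation
    return public_good - coop * cost  # average: cooperators also bear the cost

# Individually, defecting always pays `cost` more than cooperating,
# yet the group average falls as the defector share grows:
assert average_payoff(0.0) > average_payoff(0.5) > average_payoff(1.0)
```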

The Prisoner’s Dilemma is not only useful in describing the problem, but also serves as a way to organize solutions. We humans have developed four basic mechanisms for limiting defectors: what I call societal pressure. We use morals, reputation, laws, and security systems. It’s all coercion, really, although we don’t call it that. I’ll spare you the details; it would require a book to explain. And it did.

This book marks another chapter in my career’s endless series of generalizations. From mathematical security—cryptography—to computer and network security; from there to security technology in general; then to the economics of security and the psychology of security; and now to—I suppose—the sociology of security. The more I try to understand how security works, the more of the world I need to encompass within my model.

When I started out writing this book, I thought I’d be talking a lot about the global financial crisis of 2008. It’s an excellent example of group interest vs. self-interest, and how a small minority of parasites almost destroyed the planet’s financial system. I even had a great quote by former Federal Reserve Chairman Alan Greenspan, where he admitted a “flaw” in his worldview. The exchange, which took place when he was being questioned by Congressman Henry Waxman at a 2008 Congressional hearing, was once the opening paragraphs of my book. I called the defectors “the dishonest minority,” which was my original title.

That unifying example eventually faded into the background, to be replaced by a lot of separate examples. I talk about overfishing, childhood immunizations, paying taxes, voting, stealing, airplane security, gay marriage, and a whole lot of other things. I dumped the phrase “dishonest minority” entirely, partly because I didn’t need it and partly because a vocal few early readers were reading it not as “the small percentage of us that are dishonest” but as “the minority group that is dishonest”—not at all the meaning I was trying to convey.

I didn’t even realize I was talking about trust until most of the way through. It was a couple of early readers who—coincidentally, on the same day—told me my book wasn’t about security, it was about trust. More specifically, it was about how different societal pressures, security included, induce trust. This interplay between cooperators and defectors, trust and security, compliance and coercion, affects everything having to do with people.

In the book, I wander through a dizzying array of academic disciplines: experimental psychology, evolutionary psychology, sociology, economics, behavioral economics, evolutionary biology, neuroscience, game theory, systems dynamics, anthropology, archeology, history, political science, law, philosophy, theology, cognitive science, and computer security. It sometimes felt as if I were blundering through a university, kicking down doors and demanding answers. “You anthropologists: what can you tell me about early human transgressions and punishments?” “Okay neuroscientists, what’s the brain chemistry of cooperation? And you evolutionary psychologists, how can you explain that?” “Hey philosophers, what have you got?” I downloaded thousands—literally—of academic papers. In pre-Internet days I would have had to move into an academic library.

What’s really interesting to me is what this all means for the future. We’ve never been able to eliminate defections. No matter how much societal pressure we bring to bear, we can’t bring the murder rate in society to zero. We’ll never see the end of bad corporate behavior, or embezzlement, or rude people who make cell phone calls in movie theaters. That’s fine, but it starts getting interesting when technology makes each individual defection more dangerous. That is, fishermen will survive even if a few of them defect and overfish—until defectors can deploy driftnets and single-handedly collapse the fishing stock. The occasional terrorist with a machine gun isn’t a problem for society in the overall scheme of things; but a terrorist with a nuclear weapon could be.

Also—and this is the final kicker—not all defectors are bad. If you think about the notions of cooperating and defecting, they’re defined in terms of the societal norm. Cooperators are people who follow the formal or informal rules of society. Defectors are people who, for whatever reason, break the rules. That definition says nothing about the absolute morality of the society or its rules. When society is in the wrong, it’s defectors who are in the vanguard of change. So it was defectors who helped escaped slaves in the antebellum American South. It’s defectors who are agitating to overthrow repressive regimes in the Middle East. And it’s defectors who are fueling the Occupy Wall Street movement. Without defectors, society stagnates.

We simultaneously need more societal pressure to deal with the effects of technology, and less societal pressure to ensure an open, free, and evolving society. This is our big challenge for the coming decade.

This essay originally appeared on John Scalzi’s blog, Whatever.

Posted on March 2, 2012 at 1:21 PM

Status Report: Liars and Outliers

Last weekend, I completely reframed the book. I realized that the book isn’t about security. It’s about trust. I’m writing about how society induces people to behave in the group interest instead of some competing personal interest. It’s obvious that society needs to do this; otherwise, it can never solve collective action problems. And as a social species, we have developed both moral systems and reputational systems that encourage people to behave in the group interest. I called these systems “societal security,” along with more recent developments: institutional (read “legal”) systems and technological systems.

That phrasing strained the definition of “security.” Everything, from the Bible to your friends treating you better if you were nice to them, was a security system. In my reframing, those are all trust pressures. It’s a language that’s more intuitive. We already know about moral pressure, peer pressure, and legal pressure. Reputational pressure, institutional pressure, and security pressure are much less of a stretch. And it puts security back in a more sensible place. Security is a mechanism; trust is the goal.

This reframing lets me more easily talk directly about the central issues of the book: how these various pressures scale to larger societies, and how security technologies are necessary for them to scale. Trust changes focus as society scales, too. In smaller societies (a family, for example), trust is more about intention and less about actions. In larger societies, trust is all about actions. It’s more like compliance. And as things scale even further, trust becomes less about people and more about systems. I don’t need to trust any particular banker, as long as I trust the banking system. And as we scale up, security becomes more important.

Possibly the book’s thesis statement: “Security is a set of constructed systems that extend the naturally occurring systems that humans have always used to induce trust and enable society. This extension became necessary when society began to operate at a scale and complexity where the naturally occurring mechanisms started to break down, and is more necessary as society continues to grow in scale.”

So the phrase “societal security” is completely gone from the book. (Like the phrase “dishonest minority,” it only exists in old blog posts.) There’s more talk about the role of trust in society. There’s more talk about how security, real security this time, enables trust. It felt like a major change when I embarked on it, but the fact that I did it in three days shows that this framing was always there under the surface. And the fact that the book reads a lot more cleanly now shows that this framing is the right one.

The title remains the same: Liars and Outliers. The cover remains the same. The table of contents is the same, although some chapters have different names. The subtitle has to change, though. Candidates include:

  1. How Trust Holds Society Together—my publisher probably won’t allow me to write a book without the word “security” somewhere in the title.
  2. Security, Trust, and Society—not punchy enough.
  3. How Security Enables the Trust that Holds Society Together—probably too long.
  4. How Trust and Security Hold Society Together—maybe.

Any other ideas?

The manuscript is still due to the publisher at the end of the month, and publication is still set for mid-February. I am enjoying writing it, but I am also looking forward to it being done.

Posted on October 5, 2011 at 7:38 PM

Selling a Good Reputation on eBay

Here’s someone who is selling positive feedback on eBay:

Hello, for sale is a picture of a tree. This tree is an original and was taken by me. I have gotten nothing but 100% feedback from people from this picture. Great Picture! Once payment is made I will send you picture via email. Once payment is made and I send picture through email 100% feedback will be given to the buyer!!!! Once you pay for the item send me a ebay message with your email and I will email you the picture!

Posted on June 24, 2011 at 1:59 PM

