You Can’t Trust Airport Security
But don’t worry: It has always been easy to cheat, steal or kill, and few people do
When the plumber knocks at your door, why do you let him in? He’s probably bigger and stronger than you. And he has a wrench. He could easily kill you and steal your money and your stuff, which would certainly be a better deal for him than receiving a moderate payment and having to fix your toilet.
But you trust that he won’t; and trust, that mysterious and invaluable substance, is the subject of Bruce Schneier’s ambitious “Liars and Outliers: Enabling the Trust That Society Needs to Survive,” which starts with the homely parable of the plumber and builds into a treatise on every aspect of trust, from marital fidelity to transnational terrorism.
Mr. Schneier is best known as an expert on computer security, with a long record of needling the powerful about their porous defenses. (He is now something of an insider himself, serving as chief security technology officer of BT, the former British Telecom.) Mr. Schneier’s popular blog is required reading for anyone suffering from excess confidence in electronic voting machines, airport security or hotel-room entry cards. He understands better than almost anyone the many channels by which evildoers can undermine the systems we rely on. Which leads immediately to the question at the center of his book: If it’s so easy to cheat, steal and kill, why do so few people do it?
“Liars and Outliers” is a kind of playbook of the tactics society has developed to make trust possible and cooperation widespread. Why, for instance, do most people agree to chip in and pay taxes? Lots of reasons. Your internal moral compass tells you that cheating on your taxes is wrong. If you don’t have a moral objection, you still might not want to be known to friends and family as a tax cheat. And if you don’t have friends, or if your friends are all tax cheats, then there’s still the small detail that tax evasion carries stiff legal penalties. Over and above all these pressures are systems like withholding by employers. These measures don’t just make cheating less attractive; for most wage earners, they make it nearly impossible.
A running theme of Mr. Schneier’s book is that different tactics work at different scales. Moral rules operate at the level of the individual, and concerns about reputation at the level of small personal-acquaintance groups. Only the institutional pressures, like laws and regulations, have a reach that extends to the millions. As society grows bigger, we have to rely on them more heavily. Formal institutions serve as a kind of social prosthetic, using technology to vastly extend the reach of the small-group strategies we have employed for most of human history. Instead of knowing which people in the village can be trusted with a loan, we have credit scores. Instead of restaurant recommendations from friends, we have Yelp.
That’s where the irony kicks in. We can’t do without these large-scale systems—an economy where we only lent money to people we knew would be limited to village size. But institutional systems—characterized by formal rules and regulations, automation and standardization—are inherently brittle and vulnerable to attack. You can’t steal the identity of someone in your own family, and the sort of “flash crash” made possible by high-frequency trading can’t happen at the village market.
It gets worse. Institutional pressures have a way of crowding out the small-scale phenomena they are meant to mimic; “cooperation” starts to mean compliance with institutional demands, and that alone. People ask themselves what’s illegal instead of what’s immoral. Airport security agents put procedure-following and box-ticking ahead of human judgment about safety. Students lose interest in material that won’t appear on the test. (The definitive account of this process is political scientist James C. Scott’s magisterial 1998 book, “Seeing Like a State.”)