Schneier on Security
A blog covering security and security technology.
July 27, 2012
Liars and Outliers Summed Up in Two Comic Strips
I don't know the context, but these strips sum up my latest book nicely.
Posted on July 27, 2012 at 2:17 PM
• 15 Comments
The first comic is just like my thinking whenever I'm in public. Overall, it really is kind of amazing it sticks together... somewhat. You're close, Bruce, with the "trust" idea. But I think it's not necessarily that we "trust" each other; rather, we have no choice: we have to go to work, eat, run errands, etc. We're usually either too tired, too scared, or too ethical to act out of place.
Funny how the last instance talks about a random "muay thai kick to the head", as I'm sure a Mr. Clive Robinson won't find that very amusing. :/
When I read that second strip last week, your book crossed my mind as well.
I'm not quite sure if society really needs "the rotten ones", though. I agree it needs some amount of defectors (to drive and cope with change), but not all defection is equal. Say, scoundrels, perhaps, but not villains.
On the other hand 'rotten' is in the eye of the beholder. And in either case, it's easier to defect 'for the better' if you can get others to join you, for whatever reasons they themselves may have (though history tells that does not always end well).
Hmm, I usually stand well back from the curb and in a solid stance.
I prefer to avoid sitting with my back to the door at work.
Crazed lane jumping drivers? I got nothing.
Y'know, perhaps one function of those who disturb our ability to trust others -- which is a key prerequisite for civilization -- is to provide the reality check we need to keep from being too credulous (despite political efforts to maximize credulity in the electorate).
Kind of like a vaccine... or pathogens in general.
Also... those who reject regimentation are the ones who have made the future... and, in some ways, dreamers are often the "rotten" ones.
James Burke in the first of the original "Connections" series did point out that, in a civilization, we all have to trust others to do their jobs.
The most frustrating thing I've found in driving recently crystallized into a realization: when someone (especially someone driving in front of us) doesn't make the same choices *we* would, the frustration level jumps.
The same kind of message showed up in the first episode of Burke's "Day the Universe Changed", too... though, when multiple cultures collide, we have to lower our expectations that others will react as we would.
It's funny, I find my thinking wandering in these directions frequently.
Really, it's just one rung up from "If I were securing this place, where would I put the security cameras" when waiting in line at the bank. It pretty much automatically leads to thinking "So, that would mean that the blind spots are here, and here…"
The problem is, as soon as you suggest to someone that they're not secure, by explaining your observations, they equate that with an attack. You are the one who made them insecure in that way. Shoot the messenger. Ignorance is bliss.
So, there is enormous social pressure to not learn from people who know better until it's too late. That's your ultimate social security-hole, right there.
@The Imp: "So, there is enormous social pressure to not learn from people who know better until it's too late. That's your ultimate social security-hole, right there."
No one likes to hear that they made a mistake... and, really, how many people WANT to listen to divergent observations or opinions? Most folks "running things" prefer having sycophants who are always in agreement with their ideas and tend to dislike dissent, but, hey, who ever learned something new from someone who agrees with them on everything?
Having worn the hat of Systems Analyst, I have learned that the most valuable thing to have is QUESTIONS. Good questions can suggest multiple solutions while an answer might be a solution to the wrong question even if it sounds right. (Hmm, sounds like the myopia of DHS & TSA, don't it?)
Thinking about security and ways to break it is just like what sports coaches do analyzing their sport.
You analyze the other side's defense and change your offense to get through it. Then you might change your defense to minimize holes.
If you talk to a sports fan, this analysis is natural. But when it's security, most people are not "fans" like most of us here.
That's an interesting observation, Tom. The difference I would suggest is the perceived stakes. Not to be flippant, but look at the cheating scandal in the NFL. Yes, there were penalties, but overall a collective yawn, because it's "just a game" and, honestly, a monopoly too. On the other hand, the stakes are perceived to be a lot higher when it comes to a bank or an airplane.
Notice I said perceived. I think there is a cogent argument in there that cheating in sport is more damaging than robbing a bank, but the reality is that whatever damage is done by cheating in sports is mostly an externality to the sport.
I'll repeat what I've said before. Fundamentally humans are bad at evaluating risk and systemic risk worst of all.
@Daniel: "I'll repeat what I've said before. Fundamentally humans are bad at evaluating risk and systemic risk worst of all."
I wonder about the evolutionary path that made the poor risk evaluation something widespread in the population. While it may appear to be a memetic issue, there is likely a genetic component in routing the neurons.
Yet feedback control systems - hands-down more effective and reliable than alternatives - are error-driven...
"I wonder about the evolutionary path that made the poor risk evaluation something widespread in the population"
Perhaps your metric for what's poor and what's good is wrong. Perhaps "poor" risk evaluation is the best survival strategy given the evolutionary context and our physical and cognitive limitations.
At the very least we can say it was good enough, because our ancestors managed to survive to produce us. I also suspect "good" risk evaluation is expensive, and not worth the investment at the time (and I'm not sure it is now).
Security at all costs == bankruptcy.
@John Campbell: "I wonder about the evolutionary path that made the poor risk evaluation something widespread in the population"
It has little to do with Evolution in general.
Evolution is mostly about changes in individuals or groups that favour a measurable change in offspring for them relative to other individuals or groups in any given environment.
The important thing is the environment needs to,
1, apply equally to all,
2, for a reasonable period of time,
3, for a measurable change,
to make it through a series of population cycles.
In most cases, risk is about the probability of a short-term, rare, and very localised event happening, and not as such about a widespread, continuous environmental pressure.
That does not preclude rare but high-impact events; we do see some evolutionary response to the "results" of events such as droughts, famines, forest fires, volcanoes, and even impacts from large space objects.
Now I'll argue for a moment that all environmental changes, even very localised ones, have an effect on the evolution of a group if they change or reduce breeding potential for an individual or group. But how do you measure it? And how do you unambiguously show its effects when compared to other changes and their effects?
Now there is another aspect to this, which is most easily seen in building codes and other safety regulation/legislation. It is that rare risks are not worth mitigating against directly, but only indirectly. For instance, fire in buildings in the first world is not as such a significant risk (when compared to other non-natural deaths), and even less so are earthquakes, bomb threats, etc. However, a general-purpose response, the "evacuation drill", mitigates all of them. Evolution also favours this sort of generalised response to threats.
Evolution likewise favours non-specialism, and arguably this results in inefficiency. The reality is that it favours robustness; thus it works against groups of threats rather than individual instances of threat, because it is more efficient overall to do so.
Also, evolution is sometimes counter-intuitive. We know it is dangerous to live near active volcanoes, in flood plains, and in areas where fire, flood, avalanche, etc. happen. However, these areas usually offer rich soil and so promote rapid and sustainable growth over a reasonable number of breeding cycles. We also know that if you are small and weak in a predator-rich environment, camouflage is seen as the way to go. Or is it? For some creatures, being brightly coloured and easily seen is a better strategy, because it acts as a warning that they are (potentially) lethal to eat.
So whichever way you look at it, an evolutionary response to cheating is not going to be easy to find, let alone measure, because it is very likely a signal below the noise floor, or one that does not change the area under the graph measurably.
I went camping at Timpanogos Cave in Utah over the weekend, and on Friday, hiked up for a cave tour.
The ranger who gave the safety lecture at the bottom of the mountain was reading Liars and Outliers.
And each person's survival on the trail up and down depended on no stranger being the type to "accidentally" bump another person off the side of the non-railed path and down the sheer cliffs.
@The_Imp: That's pretty much one of the stories Feynman told from his time at Los Alamos. He found a way to crack the safes with the secret papers, alerted the responsible guys from the military, and security procedure got changed. The new procedure was: "Don't let Feynman get near the safes."
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.