Schneier on Security
A blog covering security and security technology.
April 17, 2012
Outliers in Intelligence Analysis
From the CIA journal Studies in Intelligence: "Capturing the Potential of Outlier Ideas in the Intelligence Community."
In war you will generally find that the enemy has at any time three courses of action open to him. Of those three, he will invariably choose the fourth.
—Helmuth Von Moltke
With that quip, Von Moltke may have launched a spirited debate within his intelligence staff. The modern version of the debate can be said to exist in the cottage industry that has been built on the examination and explanation of intelligence failures, surprises, omissions, and shortcomings. The contributions of notable scholars to the discussion span multiple analytic generations, and each expresses points with equal measures of regret, fervor, and hope. Their diagnoses and their prescriptions are sadly similar, however, suggesting that the lessons of the past are lost on each succeeding generation of analysts and managers or that the processes and culture of intelligence analysis are incapable of evolution. It is with the same regret, fervor, and hope that we offer our own observations on avoiding intelligence omissions and surprise. Our intent is to explore the ingrained bias against outliers, the potential utility of outliers, and strategies for deliberately considering them.
Posted on April 17, 2012 at 6:15 AM
As it's about "Outliers", which you have recently done so much to bring to the attention of "the masses", I have to wonder about:
The contributions of notable scholars to the discussion span multiple analytic generations
As you are also now a "notable scholar", is this a way of saying you are several generations old?
Just kidding; you've a way to go to catch up to me ;)
I like that paper. A few comments:
(1) It's too bad that they conflate "weird evidence or data" with "weird hypotheses" under the same term, "outliers". The two should be kept logically and terminologically distinct, to avoid risk of obscuring the important distinction between evidence and conclusions.
(2) I really liked this quote about "foxes", who "...thought very differently... they had no template. Instead, they drew information and ideas from multiple sources and sought to synthesize it. They were self-critical, always questioning whether what they believed to be true really was. And when they were shown they had made mistakes, they didn't try to minimize, hedge, or evade. They simply acknowledged they were wrong and adjusted their thinking accordingly. Most of all, these experts were comfortable seeing the world as complex and uncertain -- so comfortable that they tended to doubt the ability of anyone to predict the future. That resulted in a paradox: the experts who were more accurate than others tended to be much less confident they were right."
This is a perfect description of the mindset of a successful scientist. It is interesting that the same mindset is the ideal of what an intelligence analyst should aspire to.
(3) The authors discuss the "groupthink" phenomenon that has been rehashed ad nauseam in connection with the 2002-2003 analyses of Iraqi WMD. I really dislike and deplore this formulation, because it puts the onus of the failure entirely on the intelligence community, and disregards the very clear effect of the policy principals -- from Cheney and Rumsfeld on down -- who already knew the answers they expected the IC to provide, and were prepared to discredit any analysis --- and punish any analyst --- contradicting those answers. To the extent that an atmosphere could have existed in the IC that considered outlier opinion, it was systematically poisoned by the political supervision of that community. The responsibility for this poisoning is omitted, perhaps deliberately, in discussions of the analysis failure that focus on community "groupthink".
@Carlo Graziani: (2) is also a very good example of the Dunning-Kruger effect. Unfortunately it is usually those sure of themselves and their views that raise to poser. The Dunning-Kruger effect basically ensures that they are incompetent.
That should read "power", of course.
This might go some distance in explaining the CIA's 60 years of bungling and gold-medal incompetence.
Read Tim Weiner for more:
@Gweihir: I would hope that whatever the intellectual failures of intelligence analysts may be, they don't fall into the category of Dunning-Kruger-style incompetence. The failures of their political masters may be another story, though.
One reason I'm interested in the analogy to the scientific mindset is that outlier data and outlier hypotheses play an important, and even famous, sociological role in science: Thomas Kuhn's paradigm shifts. These are cataclysmic changes in scientific models driven sometimes by data that doesn't fit existing models, sometimes by models that groan under the burden of internal inconsistencies.
Typically, Kuhn argued, it takes a while for the community to perceive the implications of the outlier data, or to realize the necessity of substituting an outlier model for the existing defective one (or even to realize the existence and importance of the defects). Eventually, however, a new set of ideas sweeps away or subsumes the old ones, and the world suddenly looks totally different to its students, despite the fact that the data themselves are unchanged --- the "paradigm shift".
There seems to me to be a useful potential connection between the sociology of scientific revolutions and that of the assimilation or rejection of outlier data and hypotheses in intelligence analysis. In both cases, there is consensus on what counts as the principal data, and on which principal hypotheses are most likely to explain that data. In both cases a potentially revolutionary role is played by the intrusion of other, apparently inconsistent data, and of inconsistencies in the stories told to explain the data. And in both cases, it is necessary for the intellectuals at work to consider rejecting part or all of cherished, firmly-entrenched world-views, in order to bring about an intelligible new world-view that is more coherent and better capable of assimilating the available data.
That sounds like another paper, though.
@Gweihir at April 17, 2012 9:09 AM
No, I think "poser" is exactly right. Change "raise to" to "become a", however.
I think Kuhn's paradigm shifts are a bit incorrect. Any new scientific theory must agree with all the data, old and new. Einstein's relativity was "revolutionary" but did not discard Newtonian mechanics; it merely extended it. Relativity reduces to Newtonian mechanics as velocity becomes much smaller than c. The world looks mostly the same, except in some rather extreme cases.
Any new theory must make the same correct predictions as the old theory. If it doesn't, the new theory has failed to account for existing data.
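To make that correspondence-principle point concrete, here is a small illustrative sketch (my own example, not from the thread): relativistic kinetic energy, (gamma - 1)mc^2, is numerically indistinguishable from the Newtonian (1/2)mv^2 at low speeds, and diverges sharply only as v approaches c.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def gamma(v):
    """Lorentz factor 1/sqrt(1 - v^2/c^2); tends to 1 as v/c -> 0."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def relativistic_ke(m, v):
    """Relativistic kinetic energy: (gamma - 1) * m * c^2."""
    return (gamma(v) - 1.0) * m * C ** 2

def newtonian_ke(m, v):
    """Classical kinetic energy: (1/2) * m * v^2."""
    return 0.5 * m * v ** 2

m = 1.0  # kg
slow = 0.01 * C  # 1% of light speed: the two theories agree to ~0.01%
fast = 0.90 * C  # 90% of light speed: they diverge badly

print(relativistic_ke(m, slow) / newtonian_ke(m, slow))  # ratio very close to 1
print(relativistic_ke(m, fast) / newtonian_ke(m, fast))  # ratio roughly 3.2
```

The old theory survives as the limiting case of the new one, which is exactly why the world "looks mostly the same" after the shift.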
Helmuth von Moltke has always been, in my mind, one of the most brilliant strategists. However, in pop culture he's overshadowed by his mentor Carl von Clausewitz. Ironically, most of his quotes are nowadays misattributed to Clausewitz, for example the famous quote: "No battle plan survives contact with the enemy" (regarding both offensive and defensive strategies).
Another paradigm that holds true in this field.
@Clive Robinson from the first post.
There is something fishy about your post, clearly too short. An outlier?
I remember a scene from Black Hawk Down where the base commander suddenly says "we just lost the initiative".
I surmise that whatever conditions anyone sees fit as preferable for acting upon at any given time is what constitutes that party "having the initiative".
This reminds me of the common story about various warfighting exercises, where any red team commander who employs seriously unorthodox tactics finds his victory rolled back and the exercise restarted to play out along conventional lines.
(And of course even von Moltke's maxim can be misused by someone who is convinced that a particular fourth alternative is the one that should be planned for.)
I have one simple but enormous problem with this paper. Before I explain it, I need to define some terminology.
Situational: Pertaining to the physical or social context. Situational psychology is the study of how that context shapes personal behavior. This is the discipline that brought you such celebrated classics as the Milgram obedience studies and the Stanford Prison Experiment.
Dispositional: Attributed to inherent features of personality rather than the situation. Humans have a bias toward seeing actions by strangers as the result of dispositional rather than situational factors.
So here's my criticism: this study relies on too dispositional a model of analyst behavior. It assumes that there are people who will always produce unusual but correct analyses, and those who will always stick with the crowd. It does so even after noting a change in behavior: "When the hedgehog went back and read the original reports, the analyst started to demonstrate fox-like qualities."
Groupthink has been well-studied and it is mostly down to social factors. The same person can be encouraged to keep their head down or to take risks by the social environment they are placed in.
Rather than trying to detect a special magic sub-population, someone trying to encourage braver analysis would do better to train analysts in the process that results in accurate outlier analysis, and to address the social factors that work to suppress it.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.