Estimating the Probability of Another 9/11

This statistical research estimates it at roughly once per decade:

Abstract: Quantities with right-skewed distributions are ubiquitous in complex social systems, including political conflict, economics and social networks, and these systems sometimes produce extremely large events. For instance, the 9/11 terrorist events produced nearly 3000 fatalities, nearly six times more than the next largest event. But, was this enormous loss of life statistically unlikely given modern terrorism’s historical record? Accurately estimating the probability of such an event is complicated by the large fluctuations in the empirical distribution’s upper tail. We present a generic statistical algorithm for making such estimates, which combines semi-parametric models of tail behavior and a non-parametric bootstrap. Applied to a global database of terrorist events, we estimate the worldwide historical probability of observing at least one 9/11-sized or larger event since 1968 to be 11-35%. These results are robust to conditioning on global variations in economic development, domestic versus international events, the type of weapon used and a truncated history that stops at 1998. We then use this procedure to make a data-driven statistical forecast of at least one similar event over the next decade.

Article about the research.
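The approach described in the abstract — fit a semi-parametric model to the distribution's upper tail, then use a non-parametric bootstrap to quantify uncertainty — can be sketched in a few lines. This is a toy illustration on synthetic Pareto-distributed "event sizes," not the paper's actual algorithm or data; the parameters (`alpha_true`, `xmin`, the 3,000-fatality threshold) are stand-ins chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an event-severity catalog: heavy-tailed
# (Pareto) fatality counts, NOT the real terrorism database.
alpha_true, xmin, n_events = 2.4, 10.0, 1000
data = xmin * (1 - rng.random(n_events)) ** (-1 / (alpha_true - 1))

def prob_at_least_one(sample, threshold, xmin, n_events):
    """Fit a power-law tail by maximum likelihood (Hill estimator),
    then compute P(at least one of n_events reaches the threshold)."""
    tail = sample[sample >= xmin]
    alpha_hat = 1 + len(tail) / np.sum(np.log(tail / xmin))
    p_tail = len(tail) / len(sample)                 # P(event lands in the tail)
    p_exceed = p_tail * (threshold / xmin) ** (1 - alpha_hat)  # P(X >= threshold)
    return 1 - (1 - p_exceed) ** n_events

# Non-parametric bootstrap: resample the catalog with replacement,
# refit the tail each time, and collect the resulting estimates.
boot = [prob_at_least_one(rng.choice(data, size=len(data)), 3000, xmin, len(data))
        for _ in range(500)]
lo, hi = np.percentile(boot, [5, 95])
print(f"P(at least one 3000+ event): {lo:.0%} - {hi:.0%}")
```

The bootstrap spread, rather than a single point estimate, is what lets the authors report an interval like 11-35% despite the large fluctuations in the empirical tail.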

Posted on September 13, 2012 at 1:20 PM • 21 Comments

Comments

Stephen September 13, 2012 3:25 PM

Hard to say but once the new target in NYC is up and running, I bet it will be high on the list. I hope nothing ever happens to it.

Someone September 13, 2012 3:54 PM

Wait, they estimate the probability of at least one event between 1968 and 2012 to be only 11-35% (worldwide), but they’re expecting another one within the next decade?

nobodyspecial September 13, 2012 4:06 PM

Aren’t scaling laws wonderful?

I took the St Bartholomew’s Day massacre and the Holocaust to calculate if there will be mass murder at the church picnic this Sunday.

BrianD September 13, 2012 4:20 PM

This study is completely wrong. As everyone knows, the world ends on 12/21/2012 which is this year. 9/11 has already passed this year, so there can’t be another 9/11 as the world will have ended before then. Duh.

David Leppik September 13, 2012 4:57 PM

So they’re using common terrorist events to predict the probability of extremely rare terrorist events? This is the same sort of blind number crunching that brought us mortgage-backed securities and credit default swaps.

In the old days, computer experts coined the term GIGO, for “Garbage In = Garbage Out” because some people believed that a computer was so high-tech that it could produce the right calculation even when given the wrong data. The only thing that’s changed is we’ve found better ways to obfuscate the input.

For example, after 9/11, smaller terrorist events are both more likely to be counted in a database, and more things are classified as terrorist events. Without controlling for these factors (if that’s even possible), the source database is garbage.

What’s more, it assumes that a large terrorist event is even the same kind of beast as a smaller one. Maybe it is, maybe it isn’t. Number crunching won’t tell you, even if you find a distribution that seems to fit your data.

Hari Seldon September 13, 2012 5:33 PM

This is an intriguing science. I wonder if it can be developed into some sort of “psychohistory” to examine probable futures on a societal scale.

Jon September 13, 2012 7:02 PM

I wonder – do events like Srebrenica (1995) and Rwanda (1994) count as mega-mass-casualty terrorist events? If they do, then potentially once-per-decade might be understating things, at least at a global scale.

Jon September 13, 2012 7:14 PM

Addendum: neither Srebrenica (1995) nor Rwanda (1994) seems to appear in the GTD, so I guess they aren’t considered to be ‘terrorism’.

JP September 13, 2012 7:31 PM

High-impact, low-frequency (HILF) events are incredibly difficult to forecast. How to get around it? Analyze a broader superset of events, then narrow it down.

The authors could have treated terrorism as an attack resulting from war or other conflict. What is the probability the U.S. would be involved in a conflict? What is the probability of a “successful” attack in any war resulting in 2500+ deaths? What is the probability an attack would be publicly/politically considered a “terrorist attack”?

Still a low probability. But now we are estimating three quantities we can reasonably measure, rather than trying to infer the probability of a high-impact, low-frequency event directly.
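The decomposition JP describes is just a chain of conditional probabilities multiplied together. The numbers below are made-up placeholders purely to show the arithmetic, not estimates from the paper or from any real data:

```python
# Hypothetical illustrative inputs -- not real estimates.
p_conflict = 0.5        # P(US involved in a conflict in a given decade)
p_big_attack = 0.1      # P(a "successful" 2500+ death attack | conflict)
p_labeled_terror = 0.5  # P(the attack is publicly labeled "terrorism" | attack)

# Chain rule: multiply the conditional probabilities.
p_event = p_conflict * p_big_attack * p_labeled_terror
print(f"P(9/11-scale 'terrorist' event per decade): {p_event:.1%}")  # 2.5%
```

Each factor is something one could plausibly estimate from broader historical records, which is the point of the superset-then-narrow approach.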

Steve September 16, 2012 10:37 PM

Once per decade... or, perhaps more often if we keep killing people by remote-controlled drone or cruise missile attack.

Revenge begets revenge.

Mark September 25, 2012 1:36 PM

Bruce, why are you so reluctant to debate the science of 9/11? I’m most surprised you cannot even offer an answer and chose instead to erase my comment. You have gone down in my estimation.

Moderator September 25, 2012 9:09 PM

Actually, Mark/Tony, I removed your comments because they were the most pitifully obvious example of sockpuppetry I’ve seen in quite some time.

If you really care about the “science of 9/11,” the best thing you could do is quit trying to promote it in blog comments. You’re not a very skilled troll, and it’s really just embarrassing.
