More Behavioral Profiling

I’ve seen several articles based on this press release:

Computer and behavioral scientists at the University at Buffalo are developing automated systems that track faces, voices, bodies and other biometrics against scientifically tested behavioral indicators to provide a numerical score of the likelihood that an individual may be about to commit a terrorist act.

I am generally in favor of funding all sorts of research, no matter how outlandish—you never know when you’ll discover something really good—and I am generally in favor of this sort of behavioral assessment profiling.

But I wish reporters would approach these topics with something resembling skepticism. The false-positive rate matters far more than the false-negative rate, and I doubt something like this will be ready for fielding any time soon.
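To see why, here is a minimal back-of-the-envelope sketch in Python (the base rate and accuracy figures are illustrative assumptions, not numbers from this project):

```python
# Back-of-the-envelope Bayes calculation: why the false-positive rate
# dominates when what you are screening for is rare. All numbers are
# illustrative assumptions, not claims about any real system.

base_rate = 1e-6   # assume 1 in a million screened people is a terrorist
tpr = 0.99         # assume the system catches 99% of actual terrorists
fpr = 0.01         # assume it wrongly flags 1% of innocent people

# P(terrorist | flagged), by Bayes' theorem
p_flagged = tpr * base_rate + fpr * (1 - base_rate)
posterior = tpr * base_rate / p_flagged

print(f"P(flagged person is a terrorist) = {posterior:.6f}")
# ~0.0001: roughly 10,000 false alarms for every real positive, even
# with these optimistic accuracy figures.
```

Even a system with a one-percent false-positive rate would swamp investigators with false alarms long before it found a single attacker.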

EDITED TO ADD (10/13): Another comment.

Posted on October 15, 2007 at 6:16 AM • 27 Comments

Comments

Joshua October 15, 2007 6:45 AM

If reporters would approach any topic with even a little skepticism, I’d probably fall on my knees and give my life to Jesus, because only direct intervention from on high could possibly bring about such a miracle.

J. October 15, 2007 7:25 AM

If you know nothing about a topic, leave it to a bunch of statistical filters, a neural network, or the like. There is no indication that they are doing anything other than that, wrapping it up in marketing speak and fishing for funding.

What’s more, how do you gather a bunch of liars and terrorists who will actually, willingly help test the device? Well, you can’t.

This is polygraph nonsense again.

Arrie October 15, 2007 8:41 AM

There is a very similar software setup in public pools in the UK to detect drowning individuals: http://blog.softtechvc.com/2005/09/when_technology.html . Although I would think detecting a lifeless body on the bottom of a pool would be a lot easier than detecting someone who is acting like a terrorist.
I’m not sure how you would go about profiling a terrorist. There are too many factors and motivations behind why people do what they do to build an accurate profile, I would have thought.
Detecting whether someone is lying or hiding the truth, as the article claims, might well be possible, but saying they are therefore a terrorist just doesn’t make sense.

Dave Page October 15, 2007 8:41 AM

“I have to admit that I was initially pretty skeptical of U. Buffalo’s proposed automatic terrorist threat assessment tool but now that Cory Doctorow – my lodestar to the reflexive geek position – has rubbished it (he compares it to phrenology), I figured I’d take another look.”

Glad I’m not the only one who scans BoingBoing’s headlines and inverts the reaction!

Dale October 15, 2007 9:09 AM

The research might be valuable, but it looks like someone wanted to pump up their grant proposal. This makes me question their ethics (or at least their judgment). You may have the fantasy your research will Change The World As We Know It, but please, leave it out of the grant proposals.

Along with the criminal predictor scan, I’m also waiting for my flying car.

RC October 15, 2007 9:21 AM

Concerning the false positives, how do I defend myself against a claim of criminal terrorist intent based on a computer analysis of behavioral indicators? It’s like being accused of a thought-crime; there is no way to exonerate victims of a false positive.

Roy October 15, 2007 9:31 AM

Would this technology be able to detect government agents intent on flying to foreign countries to kidnap people there and sneak them out of the country to secret foreign prisons for torture?

No? Then no, this would be useless for finding terrorists.

Rick Auricchio October 15, 2007 10:49 AM

A numeric result is totally useless.

A red/amber/green indicator, however, would be perfect.

Airlines could perhaps be convinced to offer lower fares on flights with one or more “amber” passengers.

“Ladies and gentlemen, we have good news for passengers of flight 403. We’ve just been advised that we have TWO amber passengers aboard, so we’ll be giving all our green passengers a $40 rebate coupon!”

Carlo Graziani October 15, 2007 10:52 AM

“The false-positive rate matters far more than the false-negative rate”

Technically, this is not accurate. Both factors enter the analysis. “Sensitivity curves” — the true-positive rate (which is the complement of the false-negative rate) as a function of the false-positive rate — are the crucial tool for investigating the effectiveness of this kind of screening device.

The point is that all such systems (polygraphs are the reference standard) have adjustable sensitivity settings. You can dial up or down the probability that the system will correctly identify whatever it’s looking for (a terrorist, a spy, etc). The higher the sensitivity, the higher the false-positive rate, and vice-versa. The burden of proof on these systems (which none of them meet, so far as I am aware) is to show that there exists a “sweet spot” — a sensitivity setting with a non-useless true-positive rate corresponding to an acceptable false-positive rate.
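To make that trade-off concrete, here is a small sketch with made-up score distributions (pure invention for illustration, not data from any real detector) that sweeps the alarm threshold and prints the resulting true-positive/false-positive pairs, i.e., points on the sensitivity curve:

```python
import random

random.seed(0)

# Toy threat-score distributions (invented for illustration): the system
# scores true positives higher on average, but the two populations
# overlap, which is what forces the trade-off.
innocent = [random.gauss(0.0, 1.0) for _ in range(100_000)]
guilty = [random.gauss(1.5, 1.0) for _ in range(1_000)]

# Each threshold setting yields one (FPR, TPR) point on the curve.
for threshold in [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]:
    tpr = sum(s >= threshold for s in guilty) / len(guilty)
    fpr = sum(s >= threshold for s in innocent) / len(innocent)
    print(f"threshold={threshold:.1f}  TPR={tpr:.3f}  FPR={fpr:.4f}")

# Raising the threshold lowers the false-positive rate only by giving up
# true positives; a "sweet spot" exists only if the two score
# distributions barely overlap.
```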

I hate to sound like a broken record on this, but the NAS study on polygraphing is really an excellent and comprehensive treatment that supplies the intellectual framework for understanding how to judge all these person-screening technologies — not just polygraphs. It should be required reading for any government official considering procurement of such systems, and for those who offer comment on them.

paul October 15, 2007 10:58 AM

Let’s imagine for the moment that this stuff works. And that the reason we’re not all being called downtown for an “interview” to see if we’re terrorists is that there aren’t enough security investigators. So how would the logistics proceed? Every year (or maybe every month, just to be on the safe side) would everyone in the country be called down to an “interview” to determine whether they had developed terrorist leanings? Would the interview just be added to important security checkpoints like airports, border crossings, bus stations and laundromats?

Even if something like this worked (which seems extraordinarily unlikely, both in general terms and because people passing through security have so much else to be deceptive about) you would need such a huge infrastructure of surveillance and isolation to get real results that you would be better off economically just letting the terrorists through…

Brian S October 15, 2007 11:59 AM

“Concerning the false positives, how do I defend myself against a claim of criminal terrorist intent based on a computer analysis of behavioral indicators? It’s like being accused of a thought-crime; there is no way to exonerate victims of a false positive.”

You could of course request a re-test while thinking “happy thoughts”.

You’d either be cleared or be able to fly off to Neverland, I forget which …

jmr October 15, 2007 12:59 PM

Bruce,

So, what, you’re asking for responsible reporting?

Sorry, I haven’t seen any lately.

jmr

Avery October 15, 2007 1:19 PM

Where did they get their data that they’re using to make their predictions?

In the medical field you can screen thousands of people and then follow up on the ones who, say, had a heart attack within the next year to look for warning signs. How many millions of people would you have to screen to get even one person who is “about to commit a terrorist act”?

Anonymous October 15, 2007 1:45 PM

If these automated systems really worked, or even had a hope of working, wouldn’t we have already seen the technology being used in Israel? After all, they’ve had more experience with terrorism than the US ever has, they have a solid track record in high tech, and they are usually ahead of the rest of the world in anti-terrorist measures.

FP October 15, 2007 1:58 PM

My problem with behavioral profiling is that such systems generally flag anything “out of the ordinary.”

Over time, every individual who behaves unusually will tire of being inconvenienced as a false positive and will change their behavior.

Who gets to decide what “ordinary” is?

jammit October 15, 2007 3:41 PM

Trying to create a program that can catch “hinky” actions better than a human has me very skeptical. Although it probably wouldn’t hurt to put all “positives” on the same airplane.

Stefan Wagner October 15, 2007 5:08 PM

My first question, too, was: where did they get the patterns of terrorist behaviour?

On the other hand: why not evolve a system for general behavioural patterns?
Imagine queuing up at the airport, and a voice from the loudspeaker yelling: “Mr. Smith! The toilet is on floor 3, room 301a.” “Mrs. Johnson, calm down, please! You’ll get breakfast on board.”

Steve October 15, 2007 10:08 PM

The worst thing about all of these automated screening systems is their susceptibility to “Garbage in, Gospel out” syndrome. People, especially the sort of GED-flunkouts who wind up in bottom-level security jobs, tend to trust anything that came out of a computer, even if it looks like nonsense. The people who program these systems are aware of their limitations, but any suggestion that The Computer might be wrong will convince the two goons who just stepped up behind you that you must be a terrorist.

TheDoctor October 16, 2007 2:34 AM

Those guys just put the right sticker on their project.
These days, any research even remotely in the security sector can attract much more money if it promises to catch TERRORISTS.
Besides that, [paul] got to the center of the problem: what do we do if such a thing works?

Colossal Squid October 16, 2007 9:12 AM

“Besides that, [paul] got to the center of the problem: what do we do if such a thing works?”

And if a frog had wings it wouldn’t bump its arse hopping.

Chris S October 16, 2007 9:49 AM

More people need to read the whole article.

The entire premise of the system is that you are being interviewed in the first place. And – that the interview will last long enough to build the profile.

The biggest problem I see in deploying this for the U.S. is the lack of sufficient interview rooms and interviewers. I tried to dig up some stats, and found that in 2006, U.S. carriers enplaned 85,000,000 international passengers. Think about the logistics of handling all those interviews (rough arithmetic below). That, and the interviewers would likely need considerably more training than the searchers and x-ray watchers.
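As a rough sanity check on those logistics, here is a minimal back-of-the-envelope sketch (the interview length and the interviewer work-year are assumed values for illustration, not figures from the article):

```python
# Rough staffing estimate for interviewing every international passenger.
# The 85,000,000 passenger figure is from the comment above; the interview
# length and interviewer work-year are assumptions for illustration only.

passengers_per_year = 85_000_000
minutes_per_interview = 10        # assumed average interview length
work_hours_per_year = 2_000       # assumed full-time interviewer work-year

interview_hours = passengers_per_year * minutes_per_interview / 60
interviewers_needed = interview_hours / work_hours_per_year

print(f"{interview_hours:,.0f} interviewer-hours per year")
print(f"~{interviewers_needed:,.0f} full-time interviewers")
# ~14,200,000 hours per year, or roughly 7,100 trained interviewers,
# before counting interview rooms, supervision, or peak-hour staffing.
```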

The media isn’t doing a good job, because they’re burying that key point – and vastly underemphasizing it. I suspect that Cory either didn’t get that far before deciding it was worth blogging, or missed the underwhelming reference.

I agree that Cory is way over-reacting — but this still strikes me as something that won’t work for us, because we won’t pay the price of deploying it. If we were willing to pay that price, we could already have detailed interviews with well-trained interviewers.

Jersey October 16, 2007 10:04 AM

“…automated systems that track faces, voices, bodies and other biometrics against scientifically tested behavioral indicators to provide a numerical score of the likelihood that an individual may be about to commit a terrorist act.”

Given the rarity of actual terrorist acts, I’d be very curious to see exactly what the “scientifically tested behavioral indicators” are. Oh, and the false positive and negative rates.

TheDoctor October 17, 2007 5:52 AM

@Colossal Squid: Currently you’re right.

But I have seen too many things come true (or become computable by brute force).
So it’s at least useful to think about the implications.
