Schneier on Security
A blog covering security and security technology.
July 25, 2007
MRI Lie Detectors
Long and interesting article on fMRI lie detectors.
I was particularly struck by this paragraph, about why people are bad at detecting lies:
Maureen O'Sullivan, a deception researcher at the University of San Francisco, studies why humans are so bad at recognizing lies. Many people, she says, base assessments of truthfulness on irrelevant factors, such as personality or appearance. "Baby-faced, non-weird, and extroverted people are more likely to be judged truthful," she says. (Maybe this explains my trust in Steve Glass.) People are also blinkered by the "truthfulness bias": the vast majority of questions we ask of other people -- the time, the price of the breakfast special -- are answered honestly, and truth is therefore our default expectation. Then, there's the "learning-curve problem." We don't have a refined idea of what a successful lie looks and sounds like, since we almost never receive feedback on the fibs that we've been told; the co-worker who, at the corporate retreat, assured you that she loved your presentation doesn't usually reveal later that she hated it. As O'Sullivan puts it, "By definition, the most convincing lies go undetected."
EDITED TO ADD (8/28): The New York Times has an article on the topic.
Posted on July 25, 2007 at 6:26 AM
• 28 Comments
a majority do not want to hear the truth nor could they deal with it.
>a majority do not want to hear the truth nor could they deal with it.
What's this based on? I'm sure it's true for isolated incidents, but I think people are more afraid of telling others the truth than hearing it themselves.
I go even further and affirm that fMRI lie detectors are absolutely useless against "spin doctors" like Communications Directors, Press Agents, Press Secretaries and the like...
I'm just speculating, but I very much doubt that an fMRI would be much more of a foolproof lie detector than a polygraph.
It might be able to distinguish between a person recalling memories and making things up on the spot, but not be able to distinguish between recalling a true description of events or a pre-planned fictional description.
But while it would probably be fairly inaccurate in determining the truthfulness of verbal responses, it might be much more accurate in detecting a person's responses to statements, images etc. presented by examiners.
Most people want the truth about most things, I think: it benefits me not at all to be handed a dishonest weather forecast (or, worse, current report), to be told my plane has left when it hasn't, or to be deceived about the amount of money in my bank account.
The assertion "most people do not want the truth" usually means "I know how the world works, and the people who disagree with me are wilfully deluded," rather than being mistaken but willing to learn otherwise, and of course disregarding the possibility that the speaker is the one who is deluded. It may be the case that most people who say most people do not want the truth do not, themselves, want the truth--they want their beliefs and prejudices reinforced, and assume others are like them in that regard.
>>a majority do not want to hear the truth nor could they deal with it.
> What's this based on?
"A Few Good Men"
There's also the problem (not addressed in the article) of brain pathologies. All standard lie-detection methods fail on sociopaths, pathological liars, and those with a variety of brain disorders.
it's tough to get our minds around the distinction between objective truth and subjective belief. that's where the polygraph breaks down, it measures emotion but not what causes it. a liar's fear of detection isn't clearly distinguished from an innocent's fear of false conviction.
it seems the science behind the MRI lie detector isn't well enough developed to elucidate the mechanisms by which it purports to distinguish truth from lies. Until that's done it's a matter of belief or faith that it works, not demonstrable truth.
incidentally, statements like this one from the article Bruce referenced are striking illustrations of the general confusion about truth vs belief: "...traditional modes of ascertaining the truth—such as the jury system..." The jury doesn't determine truth, it votes on which conflicting version of the truth is to be believed, and how to reconcile all the different conflicting versions of the truth with provisions of our legal system regarding guilt or innocence. Truth is not something mutable to be shaped by a jury, or a machine!
One step closer to the science fictional mind probe device which people will use instead of torture and interrogation to obtain the information they want. Soon followed by the mind wipe, quite probably. Hopefully, we are 100 years away from such systems.
"...a majority do not want to hear the truth nor could they deal with it."
So they lie to cover their ass.
fMRI will show you the extra "work" put into fabricating a lie that fits with a web of lies. I expect it would be very accurate against most people (think test subjects). I'd be interested in how you'd be able to get good test subjects since your best subjects would no doubt lie on their data sheets...
the greatest trick the devil ever played on the world was convincing them that he doesn't exist
>>a majority do not want to hear the truth nor could they deal with it.
>What's this based on? I'm sure it's true for isolated incidents, but I think people are more afraid of telling others the truth, than hearing it themselves.
The popularity of religion is proof of it.
So as far as we know, Bruce Schneier - which may not be his name - is a conservative Libertarian working for the NSA?
Now I'm left with only being able to believe God, my wife, and Chuck Norris.
there's a funny bit about beating a polygraph in the movie Ocean's 13
1. Brain-scan lie detection is predicated on the idea that lying requires more cognitive effort, and therefore more oxygenated blood, than truth-telling.
This really screws people for whom the truth really matters. We have to work hard to make sure we get it right and not make mistakes. Truth-telling does not come easily. We can be careless when telling a joke because mistakes are unimportant, but if people are counting on us to be right our statements have to be crafted carefully.
If we are asked what we sense is a tricky question, it requires abnormal effort to answer truthfully, to our own standards of truth, which the gadget will no doubt declare to be a sign of deception.
Look around at pointy-haired bosses to see how easily lying comes to those people. Often they lie all the time because they don't know what the truth is about anything.
2. "Guilt carries fear always about with it ..."
This is another childish idea, that guilt is a burden and the guilty worry about being found out. Spend a little time looking at our national politicians and thereby disabuse yourself of this silly notion.
3. Larson equated "Do you dance?" with "Did you steal the money?" even though any idiot would know the one is a question of fact and the other is an accusation. Accuse somebody much bigger than yourself of a horrible act and see if they stay dead calm or not, but take off your glasses first.
4. "We hope we can get better."
There's a kernel of truth there: behind all this is a lot of wishful thinking.
What is glaringly obvious is that none of the workers in this field have anything better than a childish idea of what lying is. How many of them would understand there are lies of commission and lies of omission?
Do I ever lie? Of course I do. I'm lying right now. And so is everyone else, whether they are speaking, writing, or staying silent. If we are all always lying, how can a machine detect 'truth'?
Like the polygraph, this gadget may turn out to be nothing more than a dowsing rod with a power supply.
As the article points out, "People who are afraid of being disbelieved, even when they are telling the truth, may well look more nervous than people who are lying."
So, Bruce, what does this say about guards who identify people to search more closely, simply because there is something "hinky" about them?
I find this blog entry fascinating.
An interesting corollary to the "truthfulness bias" - apparently police officers have a higher rate of successfully detecting lies, because they become used to assuming people are lying to them. However, their rate of successfully detecting truth is lower by an exactly corresponding amount.
That is, the average person has false-positive and false-negative rates for detecting lies that come out barely better than random chance. Trained police officers do no better, they're just shifted more toward false-positives and away from false-negatives.
In other words, whether you have a "truthfulness bias" or "crookedness bias" doesn't make you any better or worse at determining whether someone is lying to you - it just influences what sort of mistakes you'll tend to make. And police training in questioning techniques doesn't change that. (Not that the training is useless, just that it doesn't help with discerning a subject's truthfulness.)
I wish I could recall where I read this, but I can't cite a source...
@dmc: "So, Bruce, what does this say about guards who identify people to search more closely, simply because there is something "hinky" about them?"
That, surprisingly enough, there may be false positives.
> > > a majority do not want to hear the truth nor could they deal with it.
> > What's this based on? I'm sure it's true for isolated incidents, but I think people are more afraid of telling others the truth, than hearing it themselves.
> The popularity of religion is proof of it.
Unless one of them is true.
dragonfrog: The book "Lies, Lies, Lies" by Charles Ford has a chapter on lie detection capabilities of several groups. Apparently the secret service guys are excellent at it.
The author's explanation is that the secret service personnel continuously look out for suspicious behavior in many subjects but don't find anything wrong normally.
There is a problem with testing lie detectors: you want to test them in a real situation, where the subject has potentially really done something wrong and which has real consequences for them if caught. However, to evaluate the results you also need to know who was and wasn't lying.
So here's an idea: use them when looking for drug-cheats in sport. Store blood and urine samples. Over the following years, the actual drug cheats will be revealed either through the normal current testing procedures, or by reanalysis of the stored samples with new detection technology.
"Baby-faced, non-weird, and extroverted people are more likely to be judged truthful," she says.
Speaking as an introvert, all I can say is, "Sigh." It would be interesting to know if this is culturally mediated. The U.S., a nation of salesmen, is extrovert-oriented. Would the introvert-oriented Japanese be more likely to see extroverts as liars? I know as an introvert, I tend to doubt the veracity of extroverts, suspiciously seeing them as either salesmen (who lie for a living) or overly-optimistic (i.e., they tend not to observe negative information).
In the current fall-out from the housing bubble, for instance, were the real estate industry spokesmen who kept assuring the world that everything was fine, and then when the cracks started to appear, that things weren't all that bad, either (a) lying through their teeth (they knew things were much worse than they were saying), or (b) buying their own bullshit (they'd sold themselves on the idea that the bubble was the natural state of things, and couldn't stop believing in it).
Which raises the question, how do you test for a lie that the liar believes in?
From the article:
> At the National Academy of Sciences committee meeting,
> [Cephos' C.E.O.] said, "I can say we're not at
> ninety-per-cent accuracy. And I have said, if we were not
> going to get to ninety per cent, we're not going to sell
> this product." (Nobody involved in fMRI lie detection
> seems troubled by a ten-per-cent error rate.)
I wonder how the acceptance would be if a disinterested
3rd party (want to moonlight, Bruce?) generated a random
number from [0, 1), and if the result fell in the
ten-per-cent error range:
o For job tests, the HR type asking for the scan
was fired and forced to flip burgers for a year,
o in criminal cases, the prosecutor was
automatically convicted of the crime.
Hey, fair's fair!
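For what it's worth, the quoted ten-per-cent error rate looks even worse once base rates are considered. A back-of-the-envelope sketch with made-up numbers (1,000 screenings, 2% of subjects actually lying -- assumptions for illustration, not data from the article):

```python
# What "90% accurate" means when liars are rare.
# Assumed: 1,000 subjects, 2% lying; the detector catches 90% of
# liars and wrongly flags 10% of truth-tellers (the quoted rate).

subjects = 1000
liars = int(subjects * 0.02)            # 20
truthful = subjects - liars             # 980

true_positives = int(liars * 0.9)       # 18 liars flagged
false_positives = int(truthful * 0.1)   # 98 honest subjects flagged

flagged = true_positives + false_positives
share_honest = false_positives / flagged
print(f"{flagged} flagged, {share_honest:.0%} of them honest")
# -> 116 flagged, 84% of them honest
```

Under these assumptions, five out of six people the machine accuses are telling the truth.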
> So, Bruce, what does this say about guards who identify
> people to search more closely, simply because there is
> something "hinky" about them?
> That, surprisingly enough, there may be false positives.
It says some honest people will get searched, and sent on
their way when their non-smuggling is affirmed. _Very_ different
from convicting someone for being "hinky" with no way to know
the cause thereof.
A false positive isn't _too_ bad if its falsity can be
detected. (If you've ever seen the state of a car after the
DEA/INS has searched it _thoroughly_, you know it can still
be bad.) The trouble with lie-detector tests is that, by the
very nature of their use, false positives cannot be shown to
be false.
One additional fundamental problem is that if you really care about what "truth" is, you find there is no workable definition. First, all "truth" is relative. Second, it is subjective. Third, it depends on time. And fourth, it depends on the abstraction used, which is itself multi-dimensional.
Bottom line: lie detectors can only work on simplistic people. All others understand that the concept of "truth" is pretty meaningless. You can even define a "lie" as a low-validation-quality true statement, which is perfectly workable. Also, for non-simplistic people an "I don't know" is always a true statement, as was already known to Socrates.
Schneier.com is a personal website. Opinions expressed are not necessarily those of BT.