MRI Lie Detectors

Long and interesting article on fMRI lie detectors.

I was particularly struck by this paragraph, about why people are bad at detecting lies:

Maureen O’Sullivan, a deception researcher at the University of San Francisco, studies why humans are so bad at recognizing lies. Many people, she says, base assessments of truthfulness on irrelevant factors, such as personality or appearance. “Baby-faced, non-weird, and extroverted people are more likely to be judged truthful,” she says. (Maybe this explains my trust in Steve Glass.) People are also blinkered by the “truthfulness bias”: the vast majority of questions we ask of other people—the time, the price of the breakfast special—are answered honestly, and truth is therefore our default expectation. Then, there’s the “learning-curve problem.” We don’t have a refined idea of what a successful lie looks and sounds like, since we almost never receive feedback on the fibs that we’ve been told; the co-worker who, at the corporate retreat, assured you that she loved your presentation doesn’t usually reveal later that she hated it. As O’Sullivan puts it, “By definition, the most convincing lies go undetected.”

EDITED TO ADD (8/28): The New York Times has an article on the topic.

Posted on July 25, 2007 at 6:26 AM • 29 Comments


Val July 25, 2007 7:49 AM

a majority do not want to hear the truth nor could they deal with it.

What’s this based on? I’m sure it’s true for isolated incidents, but I think people are more afraid of telling others the truth, than hearing it themselves.

jcb July 25, 2007 8:17 AM

I’d go even further and affirm that fMRI lie detectors are absolutely useless against “spin doctors” like Communications Directors, Press Agents, Press Secretaries, and the like…

Anonymous July 25, 2007 8:29 AM

I’m just speculating, but I very much doubt that an fMRI would be much more of a foolproof lie detector than a polygraph.

It might be able to distinguish between a person recalling memories and making things up on the spot, but not be able to distinguish between recalling a true description of events or a pre-planned fictional description.

But while it would probably be fairly inaccurate in determining the truthfulness of verbal responses, it might be much more accurate in detecting a person’s responses to statements, images etc. presented by examiners.

Vicki July 25, 2007 8:33 AM

Most people want the truth about most things, I think: it benefits me not at all to be handed a dishonest weather forecast (or, worse, current report), to be told my plane has left when it hasn’t, or to be deceived about the amount of money in my bank account.

The assertion “most people do not want the truth” usually means “I know how the world works, and the people who disagree with me are wilfully deluded,” rather than being mistaken but willing to learn otherwise, and of course disregarding the possibility that the speaker is the one who is deluded. It may be the case that most people who say most people do not want the truth do not, themselves, want the truth–they want their beliefs and prejudices reinforced, and assume others are like them in that regard.

Joe July 25, 2007 8:39 AM

a majority do not want to hear the truth nor could they deal with it.
What’s this based on?

“A few good men”

aikimark July 25, 2007 8:59 AM

There’s also the problem (not addressed in the article) of brain pathologies. All standard lie-detection methods fail on sociopaths, pathological liars, and those with a variety of brain disorders.

guvn'r July 25, 2007 9:04 AM

it’s tough to get our minds around the distinction between objective truth and subjective belief. that’s where the polygraph breaks down, it measures emotion but not what causes it. a liar’s fear of detection isn’t clearly distinguished from an innocent’s fear of false conviction.

it seems the science behind the MRI lie detector isn’t well enough developed to elucidate the mechanisms by which it purports to distinguish truth from lies. Until that’s done it’s a matter of belief or faith that it works, not demonstrable truth.

incidentally, statements like this one from the article Bruce referenced are striking illustrations of the general confusion about truth vs belief: “…traditional modes of ascertaining the truth—such as the jury system…” The jury doesn’t determine truth, it votes on which conflicting version of the truth is to be believed, and how to reconcile all the different conflicting versions of the truth with provisions of our legal system regarding guilt or innocence. Truth is not something mutable to be shaped by a jury, or a machine!

John Moore July 25, 2007 9:11 AM

One step closer to the science fictional mind probe device which people will use instead of torture and interrogation to obtain the information they want. Soon followed by the mind wipe, quite probably. Hopefully, we are 100 years away from such systems.

Tom Grant July 25, 2007 9:26 AM

“…a majority do not want to hear the truth nor could they deal with it.”

So they lie to cover their ass.

-ac- July 25, 2007 9:26 AM

fMRI will show you the extra “work” put into fabricating a lie that fits with a web of lies. I expect it would be very accurate against most people (think test subjects). I’d be interested in how you’d be able to get good test subjects since your best subjects would no doubt lie on their data sheets…

usual suspects July 25, 2007 9:28 AM

the greatest trick the devil ever played on the world was convincing them that he doesn’t exist

billswift July 25, 2007 9:33 AM

a majority do not want to hear the truth nor could they deal with it.

What’s this based on? I’m sure it’s true for isolated incidents, but I think people are more afraid of telling others the truth, than hearing it themselves.

The popularity of religion is proof of it.

desufnoc July 25, 2007 9:39 AM

So as far as we know, Bruce Schneier – which may not be his name – is a conservative Libertarian working for the NSA?

Now I’m left with only being able to believe God, my wife, and Chuck Norris.

FooDooHackedYou July 25, 2007 9:53 AM

there’s a funny bit about beating a polygraph in the movie Ocean’s 13

Roy July 25, 2007 10:01 AM

  1. Brain-scan lie detection is predicated on the idea that lying requires more cognitive effort, and therefore more oxygenated blood, than truthtelling.

This really screws people for whom the truth really matters. We have to work hard to make sure we get it right and not make mistakes. Truth-telling does not come easily. We can be careless when telling a joke because mistakes are unimportant, but if people are counting on us to be right our statements have to be crafted carefully.

If we are asked what we sense is a tricky question, it requires abnormal effort to answer truthfully, to our own standards of truth, which the gadget will no doubt declare to be a sign of deception.

Look around at pointy-haired bosses to see how easily lying comes to those people. Often they lie all the time because they don’t know what the truth is about anything.

  2. “Guilt carries fear always about with it …”

This is another childish idea, that guilt is a burden and the guilty worry about being found out. Spend a little time looking at our national politicians and thereby disabuse yourself of this silly notion.

  3. Larson equated “Do you dance?” with “Did you steal the money?” even though any idiot would know the one is a question of fact and the other is an accusation. Accuse somebody much bigger than yourself of a horrible act and see if they stay dead calm or not, but take off your glasses first.

  4. “We hope we can get better.”

There’s a kernel of truth there: behind all this is a lot of wishful thinking.

What is glaringly obvious is that none of the workers in this field have anything better than a childish idea of what lying is. How many of them would understand there are lies of commission and lies of omission?

Do I ever lie? Of course I do. I’m lying right now. And so is everyone else, whether they are speaking, writing, or staying silent. If we are all always lying, how can a machine detect ‘truth’?

Like the polygraph, this gadget may turn out to be nothing more than a dowsing rod with a power supply.

dmc July 25, 2007 10:13 AM

As the article points out, “People who are afraid of being disbelieved, even when they are telling the truth, may well look more nervous than people who are lying.”

So, Bruce, what does this say about guards who identify people to search more closely, simply because there is something “hinky” about them?

dragonfrog July 25, 2007 12:05 PM

An interesting corollary to the “truthfulness bias” – apparently police officers have a higher rate of successfully detecting lies, because they become used to assuming people are lying to them. However, their rate of successfully detecting truth is lower by an exactly corresponding amount.

That is, the average person has false-positive and false-negative rates for detecting lies that come out barely better than random chance. Trained police officers do no better, they’re just shifted more toward false-positives and away from false-negatives.

In other words, whether you have a “truthfulness bias” or “crookedness bias” doesn’t make you any better or worse at determining whether someone is lying to you – it just influences what sort of mistakes you’ll tend to make. And police training in questioning techniques doesn’t change that. (not that it doesn’t help with anything, just not with discerning a subject’s truthfulness)

I wish I could recall where I read this, but I can’t cite a source…
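The tradeoff dragonfrog describes is a standard detection-theory point: if the underlying ability to discriminate lies from truth is fixed, a “suspicious” judge is just applying a lower decision threshold. A minimal simulation sketch (my own illustrative model and numbers, not from any study cited here) shows the error types swapping while overall accuracy barely moves:

```python
import random

# Illustrative sketch: model lie detection as thresholding a noisy
# "suspicion score". Truth-tellers and liars get overlapping score
# distributions; a judging "bias" is just a shifted threshold.
random.seed(0)
N = 100_000
truthful = [random.gauss(0.0, 1.0) for _ in range(N)]  # honest subjects
lying    = [random.gauss(1.0, 1.0) for _ in range(N)]  # lying subjects

def rates(threshold):
    fp = sum(s > threshold for s in truthful) / N  # honest judged liars
    fn = sum(s <= threshold for s in lying) / N    # liars judged honest
    accuracy = 1 - (fp + fn) / 2                   # equal base rates assumed
    return fp, fn, accuracy

neutral_fp, neutral_fn, neutral_acc = rates(0.5)           # "truthfulness bias"
suspicious_fp, suspicious_fn, suspicious_acc = rates(0.0)  # "crookedness bias"

# The suspicious judge catches more liars (fewer false negatives) but
# wrongly accuses more honest people (more false positives); overall
# accuracy is nearly unchanged because discrimination hasn't improved.
print(neutral_fp, neutral_fn, neutral_acc)
print(suspicious_fp, suspicious_fn, suspicious_acc)
```

With these made-up distributions, lowering the threshold raises the false-positive rate from roughly 31% to 50% while cutting false negatives from roughly 31% to 16%, leaving overall accuracy within a couple of points of where it started.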

Eam July 25, 2007 12:13 PM

@dmc: “So, Bruce, what does this say about guards who identify people to search more closely, simply because there is something “hinky” about them?”

That, surprisingly enough, there may be false positives.

Bob July 25, 2007 3:41 PM

a majority do not want to hear the truth nor could they deal with it.

What’s this based on? I’m sure it’s true for isolated incidents, but I think people are more afraid of telling others the truth, than hearing it themselves.

The popularity of religion is proof of it.

Unless one of them is true.

Koray July 25, 2007 3:56 PM

dragonfrog: The book “Lies, Lies, Lies” by Charles Ford has a chapter on lie detection capabilities of several groups. Apparently the secret service guys are excellent at it.

The author’s explanation is that the secret service personnel continuously look out for suspicious behavior in many subjects but don’t find anything wrong normally.

Michael Woodhams July 25, 2007 6:41 PM

There is a problem with testing lie detectors: you want to test them in a real situation, where the subject has potentially really done something wrong and which has real consequences for them if caught. However, to evaluate the results you also need to know who was and wasn’t lying.

So here’s an idea: use them when looking for drug-cheats in sport. Store blood and urine samples. Over the following years, the actual drug cheats will be revealed either through the normal current testing procedures, or by reanalysis of the stored samples with new detection technology.

will lee July 25, 2007 7:31 PM

“Baby-faced, non-weird, and extroverted people are more likely to be judged truthful,” she says.

Speaking as an introvert, all I can say is, “Sigh.” It would be interesting to know if this is culturally mediated. The U.S., a nation of salesmen, is extrovert-oriented. Would the introvert-oriented Japanese be more likely to see extroverts as liars? I know as an introvert, I tend to doubt the veracity of extroverts, suspiciously seeing them as either salesmen (who lie for a living) or overly-optimistic (i.e., they tend not to observe negative information).

In the current fall-out from the housing bubble, for instance, were the real estate industry spokesmen who kept assuring the world that everything was fine, and then when the cracks started to appear, that things weren’t all that bad, either (a) lying through their teeth (they knew things were much worse than they were saying), or (b) buying their own bullshit (they’d sold themselves on the idea that the bubble was the natural state of things, and couldn’t stop believing in it).

Which raises the question, how do you test for a lie that the liar believes in?

Terry Cloth July 29, 2007 2:09 PM

From the article:

At the National Academy of Sciences committee meeting, [Cephos’ C.E.O.] said, “I can say we’re not at ninety-per-cent accuracy. And I have said, if we were not going to get to ninety per cent, we’re not going to sell this product.” (Nobody involved in fMRI lie detection seems troubled by a ten-per-cent error rate.)

I wonder how the acceptance would be if a disinterested 3rd party (want to moonlight, Bruce?) generated a random number from [0, 1), and if the result is in the range [0, 0.1):

 o For job tests, the HR type asking for the scan was fired and forced to flip burgers for a year,

 o in criminal cases, the prosecutor was automatically convicted of the crime.

Hey, fair’s fair!
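The ten-per-cent error rate is worse than it sounds once base rates enter the picture. A quick Bayes sketch (my own illustrative numbers, not from the article) shows what a “deceptive” result actually means when most people tested are telling the truth:

```python
# Back-of-envelope check: even at the claimed 90% accuracy, the meaning
# of a "deceptive" result depends heavily on how common lying is among
# those tested (the base rate).
def p_lying_given_flagged(sensitivity, specificity, base_rate):
    # Bayes' rule: P(lying | flagged as deceptive)
    true_pos = sensitivity * base_rate
    false_pos = (1 - specificity) * (1 - base_rate)
    return true_pos / (true_pos + false_pos)

# Screening a mostly honest population (say 5% liars) with a test that
# is 90% sensitive and 90% specific:
ppv = p_lying_given_flagged(0.9, 0.9, 0.05)
print(round(ppv, 2))
```

Under those assumptions only about a third of the people flagged as deceptive are actually lying; the rest are exactly the honest-but-nervous false positives the article worries about.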

Terry Cloth July 29, 2007 2:21 PM


So, Bruce, what does this say about guards who identify people to search more closely, simply because there is something “hinky” about them?

That, surprisingly enough, there may be false positives.

It says some honest people will get searched, and sent on their way when their non-smuggling is affirmed. Very different from convicting someone being “hinky” with no way to know the cause thereof.

A false positive isn’t too bad if its falsity can be detected. (If you’ve ever seen the state of a car after the DEA/INS has searched it thoroughly, you know it can still be bad.) The trouble with lie-detector tests is that, by the very nature of their use, false positives cannot be shown to be such.

Gweihir August 10, 2011 1:43 AM

One additional fundamental problem is that if you really care about what “truth” is, you find there is no workable definition. First, all “truth” is relative. Second it is subjective. Third, it depends on time. And fourth, it depends on abstraction used, which itself is multi-dimensional.

Bottom line: Lie detectors can only work for simplistic people. All others understand that the concept of “truth” is pretty meaningless. You can even define a “lie” as a low validation quality true statement, which is perfectly workable. Also, for non-simplistic people an “I don’t know” is always a true statement, as was already known to Socrates.

Miguel Lahunken July 1, 2018 2:21 PM

Misleading articles have been published denying the ability to increase percentages of brain use. Of course 90% of the brain doesn't stay unused. Conscious brain use is traced by an MRI by showing volumes of light in the active parts of the brain. It was even suggested to be used as a lie detector. It was written that when the truth is told there would be only one region of light. When a lie is told there would be two regions of light. It was spread that by memorizing tables of analogous correspondences there would be multiple areas of light, truth or lie. That put an end to that form of lie detector.

With normal people, the total volume of these volumes of light is not over 10% brain use at any one instant. LSD blocks serotonin, the neurotransmitter of the inhibitory neurons of the brain which keep brain use down to 10% brain use. By LSD blocking serotonin the brain wakes up to more than 10% brain use, depending upon the amount of LSD ingested.

Parasympathetic nervous system stimulation also wakes up the brain, to more than 10% brain use, by overriding the inhibitory neurons. The largest nerves of the parasympathetic nervous system are the vagus nerves, which emerge from the brain by the vagus trunk through the nasopharynx, which branches down as right and left vagus nerves down into the body, not through the spine, so that they are easily stimulated. Vagal stimulation is as effective as LSD.

The psychiatric profession secretly calls more than 10% brain use "psychosis". Why secretly? This secret, today easily expressed medically, "Vagal stimulation is as effective as LSD", has been the most suppressed knowledge in history. But, it too is being spread, just like the knowledge of the futility of the MRI lie detector.
