Comments

jmdesp August 9, 2011 6:42 AM

As was said in your previous blog post, it’s important to realize that this kind of tool, even if someday perfected to a zero error rate, won’t work on people who state falsehoods without being aware they’re lying. That includes mythomaniacs, sociopaths, people with false memories, etc.

mashiara August 9, 2011 7:15 AM

@jmdesp
Just to nitpick: It’s a lie only if you know/think it’s not true.

Conversely, an objectively true statement is still a lie if the speaker believes it to be false. (Getting a bit philosophical here…)

Natanael L August 9, 2011 8:21 AM

Grant on Mythbusters managed to cheat on one of these. He kept thinking about just about everything else during the test.
I’m not sure how good they will get.

chirol August 9, 2011 8:49 AM

See the books “Daemon” and “Freedom TM” for more on MRI machines as lie detectors. Very cool stuff.

David Leppik August 9, 2011 10:02 AM

The article involves experiments with neurotypical, generally honest adults, who are consciously conspiring to lie. I hope that, before this becomes admissible as evidence in courts, they will study it on compulsive liars, for whom it takes conscious effort to tell the truth. Similarly, I wonder about “white lies” which often slip out without even realizing (or questioning) their truth, e.g. “nice to see you!”

Richard Steven Hack August 9, 2011 10:15 AM

From the article: “Her most interesting claim, though, is that her measurements can predict a lie before the liar has decided on it. Thus, she sees the first changes in the person’s EEG about 250 milliseconds after the statement appears on the computer screen, while it takes between 400 and 600 milliseconds before the pattern showing a decision appears.”

This doesn’t surprise me. This phenomenon was discovered decades ago by Grey Walter. He hooked up an EEG to a TV set and trained the subjects to think about turning the set on and off. He determined that the set actually came on or off BEFORE the subject was consciously aware of deciding to do so.

Your “conscious” mind that is aware that you made a decision is not running the show. It’s merely the monitor to the computer which is your unconscious mind which makes the actual decision.

An MRI scanner won’t be very useful unless the measurement can be made externally. As long as the right against self-incrimination is the law, no one in their right mind would submit to such a thing voluntarily. So a lie detector needs to be external and usable regardless of the subject’s cooperation.

If a lie detector is external and does not require a user to cooperate, it can be used in all cases and the results submitted as evidence in court (once the scientific testing proves it really works, unlike polygraphs). The issue of self-incrimination would no longer apply because the suspect is no longer required to cooperate. It becomes “external evidence”, not “interrogation”.

Despite what the article said about not being able to have a functioning society without lying, that is exactly what I’d like to see: a way to perfectly detect lies, making lying no longer a feasible human behavior. If we could detect lies perfectly and externally, even through media such as TV or radio as well as in person, the religions and governments and corporations of the world (not to mention most relationships) would last five minutes. This would mandate major changes in human behavior, most of which would be beneficial to society.

I remember last season’s British con show, “Hustle”, had an episode where a blow to the head rendered Ash Morgan unable to lie – a problem since he was in the middle of a con. Funny stuff.

A.C. August 9, 2011 10:44 AM

I strongly dislike this line of research. First of all, I think it’s overstepping boundaries to try to create mandatory windows into someone’s mind.

Secondly, the public gives way too much credit to images from fMRI. For example, there’s this poster showing up all over the subway to prevent teen drinking. Teen drinking, it says, can cause a decrease in the thought capacity of a developing teen. As illustrative evidence, it shows a picture of a brain slice ‘lit up’ with colorful blobs (labeled ‘non-drinking teen’) and a brain slice with just a few bright patches (labeled ‘drinking teen’). If you’re familiar with fMRI at all, you’ll know that this is a totally nonsensical way to show things. This image is of a non-drinking-teen doing what? And is the drinking-teen doing the same thing? The drinking-teen could have been told to close their eyes and try not to think about anything while the non-drinker was asked to do complex mathematics in their head. Or the thresholds for deciding whether a point showed significant activation could have been different in the two cases.
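To make that threshold point concrete, here’s a toy sketch (purely simulated numbers, nothing from any real scan; the array size and cutoffs are made up for illustration). The very same data ‘lights up’ almost everywhere or almost nowhere depending solely on the cutoff you pick:

```python
import numpy as np

# Toy illustration only: a fake 64x64 "brain slice" of activation scores.
# Both "images" below come from exactly the same data; only the cutoff
# for calling a voxel "active" differs.
rng = np.random.default_rng(0)
scores = rng.normal(loc=1.0, scale=1.0, size=(64, 64))

lenient = scores > 0.5   # lots of colorful blobs
strict = scores > 2.5    # just a few bright patches

print("voxels 'lit up' at threshold 0.5:", int(lenient.sum()))
print("voxels 'lit up' at threshold 2.5:", int(strict.sum()))
```

Render those two masks as colored overlays and one looks like the ‘healthy’ brain on the poster, the other like the ‘impaired’ one, even though nothing about the underlying measurement changed.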

Not that I’m saying teen drinking is something we want, or that posters educating parents about the risks of their children drinking are bad. But dragging in fMRI images without properly explaining the technique, labeling the data, or telling the public what the ‘task’ was seems a dirty way to push your point, no matter how good that point is.

My worry is that presenting just one ‘lie’ image to a jury will bias it, should fMRI lie-detectors be permitted in normal court proceedings. I’m not saying the public is incapable of understanding fMRI, but it has been shown before that people can be easily swayed by having any kind of ‘scientific evidence’ shown to them, regardless of its accuracy and while being unaware of its nuances. We’re talking about black-and-white images and measures of blood flow, which we then express as colored blobs after deciding that the levels in a region are higher than baseline. And the higher the level of the process you are trying to study, the more difficult the baseline becomes to identify.

In closing, I believe that this kind of technology and ‘evidence’ collected with it will be abused, and don’t think research in this direction should be conducted.

phred14 August 9, 2011 11:05 AM

What about “self-brainwashing”? The references did talk about the detectable difference between spontaneous lies and rehearsed lies. But I’m wondering about when the lie is repeated to the point where the liar actually starts to believe it. In the real world I doubt this ever happens either in one jump or as a deliberate strategy. Rather I suspect it happens as a self-defense mechanism. Start at the edges and reduce one’s guilt by “adjusting the facts” slightly. Repeat long enough and the “Yeah, that’s it!” starts to kick in, so you start to believe your own story. Lather, rinse, repeat, each time getting closer and closer to the core of the matter.

This is from observing kids both as peer and adult. Of course this would NEVER be from personal experience! One could also argue that the political “echo chamber” one hears about from both parties is another example of this.

What happens to detectability when the liar comes to actually believe the lie? Of course in the DHS context it may not matter, unless there’s a post-hypnotic suggestion or other movie-plot mechanism afoot.

Richard Steven Hack August 9, 2011 11:56 AM

Phred14’s comment about hypnosis is not out of the range of consideration. “Self-hypnosis” is quite easy to do, even in a near awake state, and I wonder if any attempts have been made to adjust for its effects.

The problem with these sorts of studies is that the observers are watching EFFECTS, not actual brain mechanisms. While these effects may be repeatable, not knowing the brain mechanisms involved can prove problematic if other causes can produce the same effects using the same or different mechanisms.

That is, if you get a specific MRI effect by lying, how do they know you can’t also get it by, say, putting a thumb tack in your shoe, a la how the polygraph gets beaten? Maybe putting a thumb tack in your shoe allows you to get the MRI effect of NOT lying.

The question is: “How do you eliminate all the possible causes and effects if you don’t know the mechanisms involved?” “Black-box” testing only goes so far and only with relatively less complex systems. The brain doesn’t qualify as a “less complex” system.

Doug Coulter August 9, 2011 12:03 PM

@RSH

Your “conscious” mind that is aware that you made a decision is not running the show. It’s merely the monitor to the computer which is your unconscious mind which makes the actual decision.

/////////

This is one of the big truths out there — right on!
I trade stocks (my own money) for a living now, and knowing this is the key to success, as it drives markets. Overcoming it personally is how you do well, actually. He who controls his emotions better than the next guy, while understanding what the other guy is feeling, wins every time. Even the computers trading the markets are programmed by humans….

BF Skinner August 9, 2011 12:54 PM

@RSH, Doug Coulter ‘Your “conscious” mind that is aware that you made a decision is not running the show. It’s merely the monitor to the computer which is your unconscious mind which makes the actual decision. ‘

If I’m not conscious of deciding to lie, then how can I really be responsible for the lie? Isn’t it instead some sub-process within the legion that is our brain?

“it’s the devil I tell you the devil which made me do it.”

Ouija Board Master August 9, 2011 1:06 PM

Lie detectors are not interesting, they are idiotic. Let’s see some research on Ouija board accuracy too.

Doug Coulter August 9, 2011 1:14 PM

@BF Skinner

Good one! My point is that with some training, man (or at least some of us) can become a rational, rather than a rationalizing, animal. And it’s worth the effort. It allows one to prevail in many situations others fail in, without them realizing why they are failing. Not as relevant to the current topic, but well worth knowing for the other applications of the concept.

Most people spend all too much of their conscious effort rationalizing decisions they’ve taken through another path to try to justify what was irrational at the base. Being awake to that process in yourself and others is a definite “edge” in life.

NobodySpecial August 9, 2011 3:00 PM

Since fMRI machines are quite large and expensive their use is going to be limited to small select groups of suspects.

There are 650 (?) MPs, so a couple of these machines should be able to process the new intake at each election.

However, one point in the article does raise a concern:

“whole prefrontal cortex; an indication that there was more thinking activity”

…which might be a problem for MPs!

Clive Robinson August 9, 2011 3:06 PM

I’ve raised two points every time we’ve discussed this in the past, so here I go again.

Firstly, as I’ve always maintained, the way to lie is to actually tell the truth, but with a twist.

Secondly, in all the fMRI studies and other brain activity studies I’ve seen, they have made it very clear that it is for “right-handed” people only.

The thing about telling a lie is that it’s a falsehood that can be disproved in other ways; telling the truth but from “your perspective” cannot be. The simple fact is that if ten people see the same surprise event, a number of things happen. Firstly, there are ten different viewpoints, none of which actually reflects the reality of the event in anything but a circumstantial way.

Then two effects start happening. The first is that the image in the person’s mind is very malleable and can be changed simply by the way a questioner asks questions. Secondly, even if not questioned but simply asked to tell what they saw, their brain fills in gaps to try and make sense of what they saw, just to be able to actually “tell it in their own words”.

Most experienced police officers are well aware of this, and it is not unknown for eyewitness evidence to be at complete variance with videotape evidence within 24 hours of the event.

The court system is well aware of this as well, which is why “minors” and those with low IQs in many places have protected status when it comes to being questioned by the police or other figures of authority.

As for the difference between left-handed and right-handed people, I’m not sure what it actually is that precludes the use of “southpaws” in brain function studies. I was once told, half humorously, by someone involved: “It’s because you lefties don’t have your brains wired right”.

tommy August 9, 2011 7:59 PM

@ Richard Steven Hack:

You’re ugly, and your mother dresses you funny.

KIDDING! — but how long would any human society last without the little “white lies” that grease the social wheels? If you told everyone exactly what you think of them, and everyone else did the same ….

@ ALL:

I’ve been asked not to cite fictional TV shows as “sources”, but this isn’t a source, it’s a question for the proponents of the MRLie-Detector™. (I just trademarked that name. 🙂) The MRI of the person showed no activity in the areas of the brain said to be used in creating lies, yet the statements turned out to be lies. Conclusion: the person was a sociopath.

This is a real-world problem with this idea (along with all the other problems mentioned): Sociopaths and psychopaths, who are usually the most dangerous, get off free. And who knows whether the “truth” reading comes because the subject is in fact telling the truth, or because the subject is a psycho?

This is a dangerous field.

Besides, if you have a metal plate in your head from the War …..

Daniel August 9, 2011 11:11 PM

“When you open up the brain’s processes in this way, you violate, in a heavy-handed way, the individual’s right to keep his or her thoughts and feelings private.”

What right is that? IMO the genie is out of the bottle and it won’t get put back in again. Your notion of a right to privacy is quaint; it amuses me.

I’d argue that the real issue here is not that the machine can detect a lie. The real issue is that person X will have such a machine and person Y will not. Does anyone really think that the military general who uses it on a “terrorist” would allow himself to be subjected to the same machine? Don’t be silly. It’s already legal for person X (the police officer) to lie with impunity, while person Y (the suspect) may not.

It’s also going to be much cheaper. No more sending the police out with a radar gun. It will be every citizen’s duty to report to the “hood” once a week and be subject to the question “Have you sped illegally?” A $1000 fine for you if the machine thinks you lied.

greg August 10, 2011 12:57 AM

The bit I don’t like about “lie detecting” in general is that some “expert” just asserts that you lied. It’s little more than pure subjectivity, like cops who know a “perp” when they see one.

How do you challenge that? You just gave someone who is not even a judge the power of a judge and a jury.

Let’s not forget that there is no data showing that any current lie detector works at all (i.e., proper blind tests, not tests on people told to fake it).

AC2 August 10, 2011 1:15 AM

@A.C.

“The drinking-teen could have been told to close their eyes and try not to think about anything while the non-drinker was asked to do complex mathematics in their head. ”

Hmmm both seemingly impossible demands, wonder what the MRI scan would look like…

A.C. August 10, 2011 9:49 AM

@AC2

Awww… you mean you don’t routinely do integrals in your head?

Fair point though… I should have said something that involved more regions of the brain like ‘staring at complex scenes with high emotional content while trying to determine the presence or absence of a target object’ for the non-drinker.

echowit August 10, 2011 11:29 AM

I’d like to see test data on really good liars (sociopaths, actors, lawyers, etc.) compared to: 1) that from “average” Joes, and 2) that from known miscreants.

@hah: I know many would want to, but I can’t put your politicians in with the ‘good liars’ group; most of them are so embarrassingly bad at it.

B. D. Johnson August 11, 2011 10:23 AM

@Natanael L: Actually, Grant beat it by basically overwhelming it. He did some homework about which thought processes occurred in which part of the brain and practiced rapidly switching between them, so they couldn’t pinpoint when he was lying.

Richard Steven Hack August 11, 2011 11:14 PM

Tommy: “how long would any human society last without the little “white lies” that grease the social wheels? If you told everyone exactly what you think of them, and everyone else did the same …”

Society would adapt…or disintegrate.

Either of those works for me. 🙂

Seriously, the point is two-fold:

1) It’s highly unlikely to occur unless you first train everyone to be highly rational rather than emotional, and if you did that, the effect of never lying would be mostly beneficial. A statement such as “You’re ugly, and your mother dresses you funny” would be received rationally, either as a matter of opinion irrelevant to one’s own self-perception, or as an attack on one’s self-esteem to be ignored in terms of its effect on one’s own perception and then used as a judgment of the attacker;

and 2) if it occurred before society had such training, it would force society to develop rationality, on pain of disintegration if it didn’t.

In short, like most things, it can’t be taken out of the surrounding context of the state of human and social development. But that said, in the context of “what would we really like a rational society to be like”, it would be entirely beneficial to never lie.

tommy August 14, 2011 12:06 AM

@ Richard Steven Hack:

What if I were to change my point to, “Without the little white lies, no guy would ever get laid?” ;-D

(End of human race?)

Moderator August 16, 2011 3:17 AM

Tommy, I’ve removed your comment. Stop trying to find loopholes. If I have to deal with this again, I will simply ban your domain from appearing on the site.
