Schneier on Security
A blog covering security and security technology.
February 15, 2007
Scanning People's Intentions
Here's an article on a brain scanning technique that reads people's intentions.
There's not a lot of detail, but my guess is that it doesn't work very well. But that's not really the point. If it doesn't work today, it will in five, ten, twenty years; it will work eventually.
What we need to do, today, is debate the legality and ethics of these sorts of interrogations:
"These techniques are emerging and we need an ethical debate about the implications, so that one day we're not surprised and overwhelmed and caught on the wrong foot by what they can do. These things are going to come to us in the next few years and we should really be prepared," Professor Haynes told the Guardian.
The use of brain scanners to judge whether people are likely to commit crimes is a contentious issue that society should tackle now, according to Prof Haynes. "We see the danger that this might become compulsory one day, but we have to be aware that if we prohibit it, we are also denying people who aren't going to commit any crime the possibility of proving their innocence."
More discussion along these lines is in the article. And I wrote about this sort of thing in 2005, in the context of Judge Roberts' confirmation hearings.
Posted on February 15, 2007 at 6:32 AM
• 52 Comments
Two of the problems with brain scanning are that we don't know the causal connections between behaviour and mind states, or between mind states and what we see in the scans. And the scans are really, really noisy. This is an active research area, but one where almost all of the research is still to be done.
My personal suspicion, as a computational neuroscientist, is that brain scanning will never be reliable enough to stand up as courtroom evidence.
It doesn't have to be effective, or even work at all, to become a law-enforcement/security screening tool.
Polygraphs are totally ineffective for screening, but that's what the government uses them for. Data mining of phone records for terrorist calling trees is apparently a source of pure investigative noise, at least according to the FBI, but that hasn't prevented it from becoming a budget growth center at NSA.
Effectiveness is neither here nor there. Some government agencies will see this stuff as an opportunity to stick a tap in the great security budget river, and roll out their own anti-terrorist brain scan programs. It's just a matter of time.
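The screening point above invites a quick base-rate sketch. Nothing here comes from the article itself; the numbers are invented purely for illustration (Python is just a convenient calculator), but they show why a scanner used to screen for very rare bad actors drowns in false positives even under wildly optimistic accuracy assumptions:

```python
# Back-of-the-envelope base-rate arithmetic for mass screening.
# Made-up numbers: 1 actual bad actor per 10 million screened travelers,
# and an optimistic scanner that is right 99% of the time both ways.
population = 10_000_000
bad_actors = 1
true_positive_rate = 0.99    # chance the scanner flags a real bad actor
false_positive_rate = 0.01   # chance the scanner flags an innocent

flagged_innocent = (population - bad_actors) * false_positive_rate
flagged_guilty = bad_actors * true_positive_rate

# Chance that any given flagged person is actually a bad actor:
precision = flagged_guilty / (flagged_guilty + flagged_innocent)

print(f"Innocents flagged: {flagged_innocent:,.0f}")
print(f"P(bad actor | flagged): {precision:.6f}")
```

Roughly a hundred thousand innocents get flagged for every real hit, so essentially everyone pulled aside is innocent. Which, as noted, won't stop anyone from funding it.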
Is thinking about committing a crime a crime?
Planning a crime is in some cases a crime, but just thinking about it? What about thinking of copying a CD?
Or what about the hot girl across the road -- *&%$!!! How old is she? I hope the mind scanner on this street isn't working. Or that she's old enough.
I really can't see how this could ever work in a system where you are innocent until proven guilty.
There has been a lot of science fiction written about this sort of thing, and the far reaching effects on human society.
Most of them sidestep what to do about abnormal brains: "I didn't intend to kill them, it was a compulsion".
Then (in the USA) there is the constitutional issue -- I think that our right to avoid self incrimination was intended to preclude torture as a tool of criminal investigation, but that's not how it's written.
Bad Men do what Good Men dream.
"if we prohibit it, we are also denying people who aren't going to commit any crime the possibility of proving their innocence."
Why should I have to prove my innocence?
Just make sure that you "do no evil" in the future...
@Kristine: Why should I have to prove my innocence?
You shouldn't, in most cases. However, even with perfect defense of personal rights there will be mistakes with circumstantial evidence and deliberate false evidence. People will want to be able to say "I didn't do this, go look elsewhere".
Though it isn't exactly the same technology as the article discusses, Damn Interesting ran an article a while back about brain fingerprinting.
Apparently Iowa allowed evidence from brain fingerprinting to be admitted in court during the retrial of convicted killer Terry Harrington.
I call to immediately deploy this technology at airports around the country.
Also, everybody shall take a test where they are shown the picture of a child in a suggestive pose. If the brain scanner reveals that the person finds the picture "attractive", then they are registered on the sex offender warning list, and must carry a GPS transmitter for life. Think of the children!
It's hard not to compare this to the movie "Minority Report" http://www.imdb.com/title/tt0181689/
(based on the short story by Philip K. Dick)
While I read the short story, I don't remember its details. In the movie, however, a man considered killing his wife while she was in bed with another man. But he did not act out the killing. Hence when he was caught for his "crime", he kept arguing he had not done anything. He wanted to, but he knew he could not carry it through. Thus, if he would not do so, did he commit a crime?
The key seems to be "revealing what a person planned to do in the near future". If you scan someone in the morning and you can determine that they are going to have lunch after the test, is that a breakthrough? Is that even information?
I have lots of things I intend to do: that vacation this summer, write a book, pick up the laundry after work. Scan again tomorrow and you'll get different things. You'd have to invest a lot of money over a long period of time to get anything useful. Investing that sort of effort in me would change my intentions, depending on the side effects.
While I love Dick, and it was a great story, this is science fiction. No need to get it confused with reality.
Thoughtcrimes? Philip K. Dick was right again.
Why am I getting the "Minority Report" creeps? My biggest concern is that this doesn't have to work for some zealot a la Poindexter to glom on to it and convince an Alberto Gonzales that law enforcement *must* have the ability to deploy this immediately. So what if it's half-baked... arrgghhh.
Aside from the possible technical relevance of the research, there is an underlying idea that we have bad and good people:
From an object-oriented point of view I would say this is a bad design, as people may do both bad and good things. We all do things we are proud of and other things (not necessarily crimes) we are not so glad we did. In fact our system tends to label as bad people those who did something wrong and were caught red-handed. (How many millionaires were just lucky not to get caught?)
If you believe in destiny then you may believe that act will follow thought. I just do not know what I will have for dinner. I hope I won't kill anybody tonight.
Seriously, this looks like a new high-tech scam to drain budgets using a plausible excuse: "Shouldn't we do it if it can prevent a crime?"
I think this is technology taken off the deep end.
If the subject, while being scanned, was not following directions, could they tell? He might be trying to solve a topological problem in his head, wondering whether a function is differentiable, considering whether to go to a movie tonight, remembering a childhood bully, or noting that the chair is uncomfortable. And then, at the last second, he decides to add or subtract. Would they be able to tell what's what?
If they want to tell whether he intends to lie, and then scan him before he answers, could they tell that his mind is focused on how bad the one tech's breath is and should he inform the guy and if so how? What if he needs to belch and has to decide to try sneaking it out, or just letting it fly?
If we ever see this technology turned to some 'lie-detector' use, my bet is the rationale will be "It works if people think it works." Just like polygraphs, dowsing rods, crystal balls, and astrological charts.
Piper's stories included a "veridicator" to identify falsehoods from truth.
That being said, technology does not stand still and this kind of technology-- or some kind of spin-off-- is likely to materialize.
The problem, really, is that for all of the good it can do in understanding ourselves, there are people who can find some form of evil to do with it. As long as those exercising the laws believe they will be above suspicion (who watches the watchers?) this is a nascent technology that will be abused.
Imagine, thirty years hence, anyone can buy a little scanner, kind of like a stud-finder, hold it to their spouse's head, and conduct an interrogation. (Hmmmm... story idea!)
The issue I see is that it can be used as a "loyalty" scan by any political/religious group, even if there is no further probing.
We have too many people who can find ways to mis-use such technologies, but, at the same time, it's a USEFUL area of research, if only to find better means of interfacing between people and 'puters.
Mind you, corporations will mis-use this kind of technology as well.
Finally, consider that this tech, if it reaches a mature stage, may end up like anti-lock braking systems -- people think that they're "better brakes", which they aren't -- and the scanners may have people believe that intentions equal action.
Yeah, right. Talk to a procrastinator some time.
WAR IS PEACE
FREEDOM IS SLAVERY
IGNORANCE IS STRENGTH
This all amounts to what I call "The Mismeasure of Mind". Our brains excel at creating futures to select from before taking one as an action. Stephen King thinks of hundreds of horrific acts while writing his latest novel, but that doesn't turn him into Jack Nicholson carrying an axe in real life.
Carlo Graziani sez "It doesn't have to be effective, or even work at all, to become a law-enforcement / security screening tool".
Case in point: in the 50s (the "Red Scare") and 60s (Cold War) homosexuality was illegal in Canada, and homosexuals were thought to be susceptible to blackmail and other pressure from Soviet agents. The RCMP commissioned the development of what came to be called "The Fruit Machine", a device that was supposed to determine sexual orientation by measuring pupillary dilation and perhaps other physiological changes in response to visual stimuli. There's no evidence that it did, or could, work, but it played a significant role in thousands of people losing their jobs in the public sector, from the military to Canada Post, and hundreds of lives being ruined.
Long before it has any reliable capability of reading "intention", functional MRI will become a tool of choice for national security agencies and LEOs.
"I really can't see how this could ever work in a system where you are innocent until proven guilty."
Well, it would certainly "work" under jurisdictions that follow Napoleonic law, where one must prove one's innocence.
And in case you haven't been paying close attention, certain countries that tout "innocent until proven guilty" have slowly and subtly been moving away from that mantra as well...
It's hard to debate the legality.
Rarely is the law ahead of the times. In fact, that is the point of the common law. As cases arise, you decide which prior cases the case at hand is most like and which it is not like. Then you draw conclusions about what the law should be based on that.
Legislatively, the law can try to be ahead of the times (as in outlawing human cloning), but that can also be thorny. If the technology improves enough to differentiate itself from a vaguely written, broadly applied statute drafted without much understanding of the technology at hand, the law can be made to no longer apply (vague... overbroad).
Ethically, sure, we can debate it. And we can even decide that, scientifically speaking, it is unethical to conduct such brain screenings. But if the technology truly existed to read people's minds (and could be proven in a scientific manner), then the genie is out of the bottle. It doesn't just go away.
I have a feeling, like most things, this will be dealt with when the time comes.
In case they're reading - Thought Police are doubleplus bellyfeel goodthinkful.
I am not an expert in this stuff, but I suspect that some of the ideas may be workable for new forms of interrogation. As I understand it, the basic idea is that different bits of the brain are used when we recall information instead of seeing it for the first time. Also, different bits of the brain are used when you recall a testimony about a sequence of events instead of making up a new story. If this technology takes off, I predict that it will be used to catch out lies by suspects. Here is a related URL.
This so brings back my thoughts about "Citizen, no one is innocent, we are determining your level of guilt, remain where you are, do not attempt to evade interrogation."
Hmm. When I read this, I didn't think of Minority Report. I thought of David Brin's "Uplift" series. In that setting, all people (presumably at a certain age) are tested for "violent tendencies". The ones who pass become full citizens, and the rest (a small minority) are placed under a permanent surveillance called Probation. Probationers don't get a vote, are locked out of many jobs (particularly government jobs), and have an implanted radio transmitter to track their location.
Brin's book _Sundiver_ mentions this, and contains a description of how the test would work.
I think we'll have to deal with the ethics of that, long before we have reliable "intent scanning".
Never mind using it for mere "mindreading" tricks. It's also the ultimate biometric.
In a world where cloning and plastic surgery are becoming common, every brain is still unique. It can be mangled, but it can't be made to look exactly like someone else's brain. It will probably be a while before something comes along that can change our brains (probably some form of nanotechnology) -- and when it does, we'll face bigger problems than identity theft.
Of course, using the brain as an identifier requires storing info in a database somewhere. So criminals don't have to change brains, just hack in and switch a couple of labels in the database.
Securing the database I leave as an exercise for the student. :-)
Just to clarify, I'm not talking about the physical appearance of the brain but the ebb and flow of chemicals and electricity within it.
Absent M., what about deja vu? Keep in mind also that people often confabulate memories of events, which is what makes "eyewitness" testimony less than reliable, especially if asked suggestive questions. I don't see how it's possible for a brain scan to distinguish between real and false memories.
Richard B., we already have a test for "violent tendencies" called Driving While Black. No doubt a revised version of the Bell Curve with the latest research on black brain scans is in the offing.
This reminds me of the hoopla over lobotomies which swept the American press in the 1940s. Even though there was precious little evidence that lobotomies did anything other than make patients more tractable, articles appeared in magazines such as Life, and in newspapers, praising their potential for "curing" just about every mental disorder out there.
Given that history, I am loath to put much credence in this. I doubt that this will stop the procedure from being widely used by law enforcement agencies and medical officers eager to have a somatic means of predicting the brain's next move.
We are not reading, but do not think your foolish attempt at disguise has gone unnoticed.
Have a nice day.
Oh what a tangled web we weave!
Whatever safeguards can be put in place can also be removed. Just like you removed so many rights and safeguards in the name of fighting "terror".
A strong legal framework is obviously required.
Imagine the harebrained use of this technology for detecting 'guilty knowledge'. If the brain regions of Set N light up, it's novelty; if Set F light up, it's familiarity. They hook you up, calibrate for you, then show you a picture of the murder scene.
You look and your first thought is, "Hey, my sister has that same refrigerator. I wonder if they have problems with the icemaker, too."
Guilty! Guilty! Guilty! Now they only have to torture a confession out of you, and case closed.
If it ever does work well, it'll probably reveal how rarely people end up doing what they intended.
I'm perfectly happy for this to be deployed and will quietly volunteer to be subjected to it, however etiquette moves me to politely suggest "politicians first..."
Lately, George Orwell and Aldous Huxley are spinning in their graves at ever increasing RPM...
I remember reading (a long time ago) some medieval Christian philosopher who claimed that thoughts alone can't be considered sins. It's the act that makes the sin. Or something like that.
I think it was Aquinas or Augustine?
Suffice to say, I agree with that line of thought.
Also, the thinking behind this device presupposes that human minds are deterministic (highly unlikely) and that no intervening events could change the outcome (by influencing thought patterns -- our brains literally change with each thought).
There are so many holes in this kind of study that it's hard to know where to begin. And just like polygraphy and voice stress analysis, that won't make any difference at all.
(If you consider it from a security point of view, it's not just the innocents who will be skewered, but the sociopaths or whoever else has the odd brain structure that doesn't work with such pseudoscientific scans, who will be given a free pass.)
The perfect gift for every mom who has ever said: "What do you think I am, a mind reader?"
"Lately, George Orwell and Aldous Huxley are spinning in their graves at ever increasing RPM..."
We should connect generators to them.
Not to worry, Bruce. Once we get the pre-cogs, we won't need this sort of technology at all...
This reminds me of the book "Truth Machine" by James Halperin. He wrote the book in 1997, and it was creepy then. Now as each year passes I see us getting closer to the world he envisioned.
From the amazon review:
Imagine a world in which no one can lie. Now try to imagine the consequences. Halperin has written this generation's 1984, and rarely have our customers praised a book more highly. (Click on the title, and find out what they have to say ... assuming they are telling the truth!) And only time will tell whether Halperin's book is speculative fiction, or inverse history. Very Highly Recommended.
I think people will adapt and find ways to hack their brains by then.
"It's not who you are underneath...it's what you do that defines you." -- Batman Begins
In fact, this philosophy is used in investigation/interrogation. Considering committing a crime is not a crime, until you actually act upon it. I'm not a lawyer, so I don't know where all the lines are drawn. For example, what if you _consider_ robbing a bank by writing down elaborate plans?
There was actually quite a bit of information written about this a couple of years ago. A FOIA request after September 11 revealed that NASA and Northwest Airlines had been working on using fMRI technology to "read" passengers' minds, by looking for brain-wave patterns that signaled a terrorist -- nervousness, etc. The scientists claimed that the brainwave-reading part was easy; the hard part was developing a machine the size of current metal detectors that was fast enough not to cause longer delays. The kicker, of course, was that Northwest also shared a bunch of passenger information with NASA, so they could develop a better terrorist profile. I read about it as part of my Master's thesis, which has all of the links to the Washington Post and other sites that reported it, but you can find information by just Googling terms like "NASA Northwest", "NASA mind-reading", etc.
I'm amazed at the (mis)interpretations of Minority Report here. (NOTE: SPOILER)
It wasn't about intent, it was about determinism. The whole premise of the story - what a "minority report" was - depended upon that.
The deal was that the precogs came out with their verdicts at slightly different times (on the order of milliseconds, IIRC), which ordinarily makes no difference. However, in this particular case, the person charged was in a position to see the report. The first saw the future without the report, the second saw the future with the report, and the third saw the future with the two conflicting reports. This doesn't work if they were sensing intent, for obvious reasons.
Ultimately, of course, the main character winds up following through on the murder *so that precrime won't be wrong*, which is classic PKD.
The movie, while entertaining, did not do it justice.
@anonhymous: "Considering committing a crime is not a crime, until you actually act upon it."
Actually, IIRC, it was pointed out that the whole anti-terrorism-driven PATRIOT Act is meant to PREVENT... which is NOT the way law enforcement usually works. I'm not sure if it was "The Bruce" or David Brin who made a remark about this wrinkle back around the time of 9/11, but, when you get right down to it, digging through the aftermath showed that the then-existing mechanisms allowed the crime to be solved fairly quickly.
The whole drive for intentions-driven exclusion is to avoid another "debacle".
What law enforcement's real mission is, is to keep specific individuals from being able to repeatedly commit crimes, be it jaywalking or homicide. I suspect the problem with walking, not-so-smart bombs (lacking the usual human interest in survival) is that we don't have to worry about that ONE person flying another plane into another building.
So law enforcement finds the culprits and connections adequately but then what is the way to avoid having someone repeat the crime?
How do we deal with this? By looking at intentions instead of actuals, by implementing a "thought police" at least as repressive as the more harsh incarnations of Islamic Law, and making sure that everyone is indoctrinated into becoming a "good little robot"?
New Zealand is starting to look pretty stable, y'know?
"Considering committing a crime is not a crime, until you actually act upon it."
No, they call that conspiracy.
Come on, people, it should be clear that we will never be able to get much from these signatures. We can't even profile our cars and planes to check whether they will crash in 5 seconds, and the problem of understanding what that huge neural network inside your head is about to do is just crazy.
But sounds like you guys don't have problems with getting funding for all sorts of useless things like this :)
Back before the 2000 election there was an Israeli company that claimed to be introducing a software product (both 'professional' and 'home' versions) that could allegedly detect lying. As with many such items it disappeared into the vortex, but...
Imagine if it were a big seller (even if it didn't really work). How would the knowledge that potentially millions of people were analyzing their performance affect the candidate debates?
If the science proves out, what could be better than the ability to determine truth from lie in critical situations? We already know eyewitness testimony is a terrible thing to rely on, according to Dr. Elizabeth Loftus and others, and it is in the courtroom today! The tip of the iceberg is the number of wrongful convictions revealed by DNA testing, which applies to only 1% of the cases going to court. The current process also wastes tremendous public resources trying to get at the truth, resources which could go to better use. I think the courts will be ready for the fMRI when the fMRI is ready for the courts.
You can tell to some degree what someone's intentions are by looking at their facial expressions and body language; is that unethical?
Legal or not, I want one in my secret lab, right next to my cloning machine.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.