Schneier on Security
A blog covering security and security technology.
April 17, 2009
New Frontiers in Biometrics
Ears? Arm swinging? I guess biometrics is now the "it" thing to study.
Posted on April 17, 2009 at 5:41 AM
Imagine a calling service using such a method:
"Please put your ear to the microphone after the beep" ;)
Super sensitive microphones on my cell phone? I can't hear the other end of my conversations NOW with all the background hash.
Sounds more like a NWO way to plant bugs on everyone.
The arm swing thing sounds mostly useless. Furthermore, I've never heard of "equal error rate". False positives and false negatives do not have equal weight in almost any application I can think of. Sounds like someone made up a measurement to make unacceptable failure rates sound more acceptable. (Kind of like that stock fund index they compare stock funds to so that people won't notice that most stock funds don't do as well as the overall random stock market [DJIA, S&P, etc.] indices themselves.)
On the other hand building a Wii into a cellphone sounds like a neat idea.
@bob: EER is a common performance indicator for biometric systems... and is used to make Receiver Operating Characteristic (ROC) charts more intuitive. Of course, no one will want to use a system with equal FAR and FRR.
Putting aside issues of uniqueness, there are practical challenges to both of these. First, the ear. Ambient noise and the way the phone is held up to the ear must have an effect on results.
Second, the arm. The arm swing pattern is part of the process of identity verification. So does this mean phones can only be swung in the dark or in an isolation room? Wouldn't swinging your arm in public be like providing part of your password or a half fingerprint?
Body rhythms have been used for biometric authentication for some time, such as typing patterns, etc. I vote for tummy rumbles and the under/overtones generated with a belch. I'm sure they are unique to each individual (hic!).
@bob: The idea is that there will always be a trade-off between false positives and false negatives. By fiddling with the sensitivity you can get very few false positives at the expense of many false negatives, or vice versa. Which technology is better at a given application may well depend on how the trade-off is made.
But if you don't have a particular policy choice in mind yet, and still have to decide which ideas you consider interesting enough to investigate further, you really want to reduce the performance differences to a one-dimensional metric somehow. It's an oversimplification, of course, but a necessary one.
One might argue that a better one-dimensional metric would be to ask, say, "how many false negatives must we accept to get false positives below 0.27 percent?" But if the inventors of a new system choose to compare their idea with competitors on that basis, the choice of 0.27 as a threshold is immediately suspect -- did they choose it because their system performs particularly well at a 0.27 percent threshold but would be beaten by other systems at the 1 percent or 0.02 percent points? Who gets to choose which thresholds to measure all systems at so they can be meaningfully compared?
In contrast, EER is a parameterless metric, which makes it much easier to accept as unbiased.
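To make the trade-off concrete, here's a minimal sketch of how an EER can be read off by sweeping a decision threshold over genuine and impostor match scores. None of this comes from the article; the scores and the naive threshold sweep are invented purely for illustration:

```python
# Sketch: estimating the Equal Error Rate (EER) by sweeping a decision
# threshold over hypothetical match scores. Higher score = more likely
# a genuine match. All numbers below are made up for illustration.

def equal_error_rate(genuine_scores, impostor_scores):
    """Return (eer, threshold) where FAR and FRR are closest to equal."""
    thresholds = sorted(set(genuine_scores) | set(impostor_scores))
    best = None
    for t in thresholds:
        # False Accept Rate: impostors scoring at or above the threshold.
        far = sum(s >= t for s in impostor_scores) / len(impostor_scores)
        # False Reject Rate: genuine users scoring below the threshold.
        frr = sum(s < t for s in genuine_scores) / len(genuine_scores)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2, t)
    return best[1], best[2]

# Toy data: genuine attempts tend to score higher than impostor attempts.
genuine = [0.9, 0.8, 0.85, 0.7, 0.6, 0.95]
impostor = [0.3, 0.5, 0.2, 0.65, 0.4, 0.1]
eer, thr = equal_error_rate(genuine, impostor)
print(f"EER of about {eer:.2f} at threshold {thr}")
```

Raising the threshold above the EER point trades false accepts for false rejects, and vice versa; the EER is just the single crossover number used to compare systems without picking a policy first.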
Enough already! We can make identifications on fingerprints, DNA, iris scans, gait of walking, now ear tones and arm swings. And we have RFID plus GPS, plus cell phone and internet. We have no problem with identification. We have a problem with FINDING. How many criminals (read those who deploy worms, spyware, & computer Trojans) have these tools found? How many terrorists (read bin Laden) have these tools captured?
Much of this is just FUD for the purpose of intimidation and control of the population, and lends little to solving the real problems.
@bob: You may have heard of Crossover Error Rate (CER). EER sounds like another name for the same thing.
@Tom Olzak: The argument seems to be that it is hard to reproduce somebody's shaking pattern even if you have seen them do it, or even if you know which pattern the device looks for.
I don't think these two projects aim (or claim) to escape the common boundary condition of all biometrics, which is that you cannot expect the pattern you look for to be (or stay!) a secret. Therefore you need to trust the integrity of your sensor. Merely having somebody show you the right bits over a communications channel authenticates nothing unless they can prove that the bits came from a sensor you trust.
The title of that New Scientist article--"Our Ears May Have Built-In Passwords"--shows that yet another person has written about biometrics without understanding that "biometrics can serve as a replacement for typing your username but are not a replacement for your password." ( http://airlied.livejournal.com/62072.html?... )
Honestly, it's not that complicated. I know how tempting it is to think that you can replace passwords, because nobody likes passwords... but honestly, you're just replacing usernames, and my car insurance company already gets that over the phone by looking at the incoming number, no fancy mammer-jammer required.
Bah. Take a shot.
What will they think of next for biometrics? Smegma?
That arm swinging test would be fun on a packed train!
Also, are any biometrics compatible with "hands free" use in a noisy environment, such as most types of travel?
Since it contains DNA, I suspect it would be quite a good biometric.
Here's a biometric for you:
The truth will set you free. So will pork and beans.
Well, how *do* you identify those Arabs who go around wrapped head-to-toe in blankets all the time? Or bank robbers in Ronald Reagan masks and Mickey Mouse gloves? It's driving law-enforcement nuts... and it appears that it's either biometrics, or nothing.
"biometrics is now the "it" thing"
why? biometrics and human behavior have always been studied in security. is the rate or percentage of studies changing? you only cite two.
@grendlkhan - "but honestly, you're just replacing usernames"
No, you're not. Your username can be provided by anyone who knows it. Your biometric data can only be provided to a reader device by you (or by someone who can fake out the device... but that's an implementation issue, not a fundamental limitation inherent to the concept of biometrics).
As the quote from Bruce says, biometrics are not secrets, which is why they aren't passwords. But they're more than just unique identifiers (i.e. usernames) ... they are unique identifiers that are physically tied to an individual.
And contra your assertion, biometrics certainly can replace passwords. The only reason passwords are secret is because their secrecy is the method by which they are tied to you. Biometrics are tied to you without needing to be secret. And "being tied to you" is the property we care about... it's what allows it to act as an authenticator.
Biometrics aren't *perfect* replacements for passwords. With biometrics, the reader device has to be absolutely trusted by the authentication system. With passwords, you can have untrusted intermediaries between the user and the authenticator. Regardless, biometrics are replacements for passwords, suitable for a great many applications.
> Well, how *do* you identify those Arabs
With a name?
> It's driving law-enforcement nuts.
What law are we breaking when we "go around wrapped head-to-toe in blankets all the time?"
> Or bank robbers in Ronald Reagan masks and Mickey Mouse gloves?
How does fingerprinting gloved robbers or scanning masked retinas help law enforcement?
"our biometric data can only be provided to a reader device by you (or by someone who can fake out the device... but that's an implementation issue, not a fundamental limitation inherent to the concept of biometrics)."
Au contraire mon ami - it is absolutely a fundamental limitation.
Biometrics would loosely translate to "measuring a living thing". Once somebody has those measurements, they have the password. Maybe some implementations make it a costly hassle to input that password, but motivated criminals will figure something out.
Christ. Any machine-administered biometric authentication mechanism is no better than the next. Diagnostic uses of this tech for hearing deficiencies, however, are potentially very cool.
Here they are:
* useful motions when you want to call a jerk
* authenticating motions when you want to jerk around others (BOFH)
* brings new meaning to flipping off someone
* orchestrate a symphony of security -- ?marketing slogan?
* endorsed by Conan O'Brien's masturbating bear -- ?celebrity endorsement?
* sponsored by 900+ phone sex lines
More seriously, this would be a useful second authentication.
Unlike fingerprint scanners, a detached appendage (~digit) won't do the criminals any good. In fact the bruising/breaking/damaging of appendages would serve to alter the motion, rendering it useless.
* completely unusable by patients with neuromuscular disorders (Parkinson's, MD, CP)
* down the road, it will create a security problem for gang members, since all their motions are their gang signs. (Yo!)
* possibly unusable on unsteady platforms like trains and cars. Wait. Maybe that is a good thing.
"Stolen cellphones could also be rendered useless by programming them to disable themselves if they detect that the user of the phone is not the legitimate owner."
Except you can usually just reset the thing and register as the new owner. (As in the case when it's protected with a PIN.)
And it'd also make it difficult to lend out your phone to anyone else. Or use a hands free set.
"What will they think of next for biometrics?"
Do you watch spoof sci-fi?
In an early Futurama episode ("A Fishful of Dollars"), Fry goes to the bank teller with his ATM card. She looks it up and says,
"We don't have your retinal or rectal scan on file. Do you remember your PIN?"
Kind of summed it up, in that it acknowledges that banks are an eye-watering pain in the a55, especially as they always default to the weakest security mode for "customer convenience"...
@ Tangerine Blue,
"Maybe some implementations make it a costly hassle to input that password, but motivated criminals will figure something out."
Having worked out how to do exactly that for most bio-info readers that were around fifteen or so years ago, I'm not very impressed with the technology at all.
The only one I have not thought up a workable "fake" for is an iris/retina scan with "dynamic" tests (like puff of air or flash of light etc).
The problem with a lot of biometric systems is illustrated by fingerprint readers.
The first generation could be "faked out" in all sorts of simple ways (including just breathing on the sensor in one case).
When told about these deficiencies, the industry invariably went into denial mode, even when shown how to defeat their systems with commonly available "household items".
It was only when it was made glaringly obvious (remember "gummy fingers") that measures were taken to stop some of the "fake out" methods.
To do this they simply "bolted on" extras, which invariably made the systems less reliable (just like early Victorian steam-boiler artisans).
In the case of some modern fingerprint readers, it would appear on examining the hardware that the least important reading is the actual fingerprint.
It's the "body heat"/"IR blood flow-pulse"/"skin resistance"/"skin pores" tests to stop the "fake out" attempts that have more technology devoted to them...
And that is really just one of the issues with biometrics: they have a lot of difficulty recognising the difference between a fake and a real person.
Even those that work on things like facial bone structure need "extras" to ensure it's a real live flesh and blood face that it is measuring and not an outright fake, hybrid or augmented individual...
Oh, and the second issue is that it is not the real readings that are used. It is a "hash" of the reading that is compared to another "hash" kept in a file or DB, which brings into play all sorts of questions about false matches based on deficiencies in the hashes (not the real readings).
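To illustrate that enroll-then-compare flow, here's a hypothetical sketch (the quantization scheme, tolerance, and numbers are all made up) of how a system might store a coarse derived template instead of the raw reading and match approximately against it. The coarseness of the template is exactly where hash-level false matches can creep in:

```python
# Illustrative sketch (scheme and numbers invented): a biometric system
# typically stores a derived template, not the raw reading, and matching
# compares templates approximately rather than exactly.

def quantize(reading, levels=4):
    """Reduce a raw feature vector (values in [0, 1)) to a coarse
    template of small integers; information is deliberately thrown away."""
    return tuple(min(levels - 1, int(x * levels)) for x in reading)

def matches(stored_template, fresh_reading, max_mismatches=1):
    """Accept if the fresh template differs in few enough positions."""
    fresh = quantize(fresh_reading)
    mismatches = sum(a != b for a, b in zip(stored_template, fresh))
    return mismatches <= max_mismatches

# Enrollment: quantize and store the template, discard the raw reading.
enrolled = quantize([0.12, 0.55, 0.81, 0.33])

# Later readings of the same person are noisy but should still match...
print(matches(enrolled, [0.15, 0.52, 0.84, 0.30]))  # small noise, accepted
# ...while a very different reading should not.
print(matches(enrolled, [0.90, 0.10, 0.20, 0.95]))  # rejected
```

Because many distinct raw readings collapse onto the same coarse template, two different people can in principle end up indistinguishable at the template level even when their raw readings differ.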
@ Davi Ottenheimer,
"why? biometrics and human behavior have always been studied in security. is the rate or percentage of studies changing? you only cite two."
I tend to agree with Bruce; these are just "two new" biometrics to be "considered for security".
Yes, we have been measuring bits of humans (and making false assumptions) for something like a couple of centuries. Think of phrenology (reading the bumps on your head), physiognomy (reading facial features), ear lobes, etc. A lot of this false thinking went forward into eugenics, which sadly is still alive and well these days and being practiced by pedigree dog breeders 8(
The "why" is that there is now "technology" at sufficiently low cost to automate the (sometimes very difficult) measurments. Oh and a willingness by people to throw money at developing the next best technology to sell on large Government contracts.
(Conspiracy theories of the week:
1, The "war on terror" is a way for the Government to buy the economy out of the "credit crunch" by stimulating hi-tech development in ID management.
2, Large-scale Government identity management systems were started by the Bush Administration as a new way to limit the number of "other party" voters at future elections.
To name but two 8)
This very afternoon my boss is doing a long workshop on gesture for mobile. http://design4mobile.com/sessions/workshop-1.html This is just gesture and biometrics overlapping. So we're stomping on each other's terminology and understanding of use.
The KDDI one especially is the sort of thing not so useful for security in the typical sense, but in making the device contextually intelligent. Simple use of this sensing to me would be that the device keypad is locked until your gesture is recognized by pulling to the ear, or to your typical eye level for manipulation or browsing or whatever.
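A minimal sketch of that kind of gesture unlock follows. Everything here -- the trace format, the distance measure, and the threshold -- is assumed for illustration; a real implementation would resample the traces and use something like dynamic time warping rather than this naive pointwise distance:

```python
import math

# Hypothetical sketch: unlock the keypad when a recorded accelerometer
# trace is close enough to an enrolled "pull to the ear" gesture.
# Traces are equal-length lists of (x, y, z) samples; all values invented.

def trace_distance(a, b):
    """Mean Euclidean distance between two equal-length accelerometer traces."""
    assert len(a) == len(b)
    total = sum(math.dist(p, q) for p, q in zip(a, b))
    return total / len(a)

def try_unlock(enrolled_trace, fresh_trace, threshold=0.5):
    """Unlock only if the fresh trace is within the distance threshold."""
    return trace_distance(enrolled_trace, fresh_trace) <= threshold

enrolled  = [(0.0, 0.1, 9.8), (1.2, 0.4, 8.0), (2.5, 1.0, 5.5)]
similar   = [(0.1, 0.2, 9.7), (1.0, 0.5, 8.2), (2.4, 0.9, 5.6)]
different = [(9.0, 0.0, 0.5), (8.0, 1.0, 1.0), (7.0, 2.0, 2.0)]

print(try_unlock(enrolled, similar))    # close trace: unlocks
print(try_unlock(enrolled, different))  # unrelated motion: stays locked
```

The threshold plays the same FAR/FRR role discussed above: loosen it and strangers' motions start unlocking the phone; tighten it and the owner gets locked out on a bumpy train.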
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.