Eventually, it will work. You’ll be able to wear a camera that will automatically recognize someone walking towards you, and an earpiece that will relay who that person is and maybe something about him. None of the technologies required to make this work are hard; it’s just a matter of getting the error rate down low enough for it to be a useful system. And there have been a number of recent research results and news stories that illustrate what this new world might look like.
The police want this sort of system. I already blogged about MORIS, an iris-scanning technology that several police forces in the U.S. are using. The next step is the face-scanning glasses that the Brazilian police claim they will be wearing at the 2014 World Cup.
A small camera fitted to the glasses can capture 400 facial images per second and send them to a central computer database storing up to 13 million faces.
The system can compare biometric data at 46,000 points on a face and will immediately signal any matches to known criminals or people wanted by police.
In the future, this sort of thing won’t be limited to the police. Facebook has recently embarked on a major photo tagging project, and already has the largest collection of identified photographs in the world outside of a government. Researchers at Carnegie Mellon University have combined the public part of that database with a camera and face-recognition software to identify students on campus. (The paper fully describing their work is under review and not online yet, but slides describing the results can be found here.)
Of course, there are false positives—as there are with any system like this. That’s not a big deal if the application is a billboard with face-recognition serving different ads depending on the gender and age—and eventually the identity—of the person looking at it, but is more problematic if the application is a legal one.
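The arithmetic behind this is the classic base-rate problem: when the people you’re looking for are rare in the scanned population, even an accurate matcher generates mostly false alarms. Here’s a minimal sketch; the numbers (crowd size, hit rate, false-positive rate) are hypothetical, chosen only to illustrate the effect:

```python
# Illustrative base-rate arithmetic: a highly accurate face matcher still
# produces mostly false alarms when its targets are rare in the crowd.

def expected_alerts(population, wanted, tpr, fpr):
    """Return (expected true alerts, expected false alerts) for one pass
    over the population, given a true-positive and false-positive rate."""
    true_alerts = wanted * tpr
    false_alerts = (population - wanted) * fpr
    return true_alerts, false_alerts

# A stadium crowd of 70,000 containing 10 wanted people, scanned by a
# system with a 99% hit rate and a 0.1% per-person false-positive rate:
tp, fp = expected_alerts(population=70_000, wanted=10, tpr=0.99, fpr=0.001)
print(tp, fp)          # ~10 true alerts drowned out by ~70 false ones
print(tp / (tp + fp))  # only about 12% of alerts point at a real match
```

So most people the system flags will be innocent, which is exactly why the procedure for clearing a false match matters as much as the match rate itself.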
In Boston, someone erroneously had his driver’s license revoked:
It turned out Gass was flagged because he looks like another driver, not because his image was being used to create a fake identity. His driving privileges were returned but, he alleges in a lawsuit, only after 10 days of bureaucratic wrangling to prove he is who he says he is.
And apparently, he has company. Last year, the facial recognition system picked out more than 1,000 cases that resulted in State Police investigations, officials say. And some of those people are guilty of nothing more than looking like someone else. Not all go through the long process that Gass says he endured, but each must visit the Registry with proof of their identity.
[…]
At least 34 states are using such systems. They help authorities verify a person’s claimed identity and track down people who have multiple licenses under different aliases, such as underage people wanting to buy alcohol, people with previous license suspensions, and people with criminal records trying to evade the law.
The problem is less with the system, and more with the guilty-until-proven-innocent way in which the system is used.
Kaprielian said the Registry gives drivers enough time to respond to the suspension letters and that it is the individual’s “burden” to clear up any confusion. She added that protecting the public far outweighs any inconvenience Gass or anyone else might experience.
“A driver’s license is not a matter of civil rights. It’s not a right. It’s a privilege,” she said. “Yes, it is an inconvenience [to have to clear your name], but lots of people have their identities stolen, and that’s an inconvenience, too.”
IEEE Spectrum and The Economist have published similar articles.
EDITED TO ADD (8/3): Here’s a system embedded in a pair of glasses that automatically analyzes and relays micro-facial expressions. The goal is to help autistic people who have trouble reading emotions, but you could easily imagine this sort of thing becoming common. And what happens when we start relying on these computerized systems and ignoring our own intuition?
EDITED TO ADD: CV Dazzle is camouflage from face detection.
Posted on August 2, 2011 at 1:33 PM