Entries Tagged "face recognition"


No Smiling in Driver's License Photographs

In other biometric news, four states have banned smiling in driver’s license photographs.

The serious poses are urged by DMVs that have installed high-tech software that compares a new license photo with others that have already been shot. When a new photo seems to match an existing one, the software sends alarms that someone may be trying to assume another driver’s identity.

But there’s a wrinkle in the technology: a person’s grin. Face-recognition software can fail to match two photos of the same person if facial expressions differ in each photo, says Carnegie Mellon University robotics professor Takeo Kanade.
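The DMV workflow described above amounts to a nearest-neighbor search over face encodings. Here is a minimal sketch, assuming some upstream system has already converted each photo into a numeric feature vector (real deployments use proprietary or learned embeddings; the `threshold` value is purely illustrative). Kanade's point is visible in this framing: a grin shifts the new photo's encoding, increasing its distance from the stored one and potentially hiding a true duplicate.

```python
import numpy as np

def find_possible_duplicates(new_encoding, existing, threshold=0.6):
    """Return indices of stored license-photo encodings that fall
    within `threshold` (Euclidean distance) of the new photo's
    encoding. A hit triggers an alarm for manual review.

    Expression changes (e.g. a smile) perturb the encoding, which
    is why some DMVs ask for neutral poses."""
    hits = []
    for i, enc in enumerate(existing):
        if np.linalg.norm(new_encoding - enc) < threshold:
            hits.append(i)
    return hits
```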

Posted on May 29, 2009 at 11:19 AM

Improvements in Face Recognition

Ignore the laughable “100% accurate” claim; this is an interesting idea:

Mike Burton, Professor of Psychology at Glasgow, and lecturer Rob Jenkins say they achieved their hugely-improved results by eliminating the variable effects of age, hairstyle, expression, lighting, different camera equipment etc. This was done by producing a composite “average face” for a person, synthesised from twenty different pictures across a range of ages, lighting and so on.

Not useful when you only have one grainy photograph of your target, but interesting research nonetheless.
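The averaging step Burton and Jenkins describe can be sketched in a few lines. Assuming the input photos are already aligned and equally sized (the alignment and registration is the hard part in practice, and is glossed over here), the composite is just the per-pixel mean, which washes out lighting, expression, and camera differences while preserving the stable facial structure:

```python
import numpy as np

def average_face(images):
    """Compute a per-pixel mean composite from a set of aligned
    face photos of one person.

    `images` is a list of equally sized uint8 arrays (H x W x 3),
    assumed pre-aligned (eyes and mouth registered). Averaging
    suppresses per-photo variation (lighting, expression, camera)
    and keeps what is common across all of them."""
    stack = np.stack([img.astype(np.float64) for img in images])
    return stack.mean(axis=0).astype(np.uint8)

# Twenty varied photos of the same person -> one "average face"
photos = [np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
          for _ in range(20)]
composite = average_face(photos)
```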

Posted on February 11, 2008 at 7:18 AM

Face Recognition Comes to Bars

BioBouncer is a face recognition system intended for bars:

Its camera snaps customers entering clubs and bars, and facial recognition software compares them with stored images of previously identified troublemakers. The technology alerts club security to image matches, while innocent images are automatically flushed at the end of each night, Dussich said. Various clubs can share databases through a virtual private network, so belligerent drunks might find themselves unwelcome in all their neighborhood bars.
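The flow quoted above — match each entrant against a troublemaker database, log the night's images, flush the unflagged ones at closing — can be sketched as a toy in-memory version. All names and the distance threshold here are hypothetical; the actual BioBouncer internals were never published:

```python
import numpy as np

class DoorCamera:
    """Toy sketch of the match-and-flush flow: compare each
    entrant's face encoding against stored troublemaker
    encodings, keep tonight's images in a buffer, and discard
    the buffer at closing time."""

    def __init__(self, troublemakers, threshold=0.6):
        self.troublemakers = troublemakers  # stored face encodings
        self.threshold = threshold
        self.tonight = []                   # everyone snapped tonight

    def entrant(self, encoding):
        """Snap an entrant; return True if security should be alerted."""
        self.tonight.append(encoding)
        return any(np.linalg.norm(encoding - t) < self.threshold
                   for t in self.troublemakers)

    def close_for_night(self):
        """The promised 'automatic flush' of innocent images."""
        self.tonight.clear()
```

Note that the flush is a single, easily removed line — which is exactly the fragility the commentary below is about.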

Anyone want to guess how long that “automatically flushed at the end of each night” will last? This data has enormous value. Insurance companies will want to know if someone was in a bar before a car accident. Employers will want to know if their employees were drinking before work — think airplane pilots. Private investigators will want to know who walked into a bar with whom. The police will want to know all sorts of things. Lots of people will want this data — and they’ll all be willing to pay for it.

And the data will be owned by the bars that collect it. They can choose to erase it, or they can choose to sell it to data aggregators like Acxiom.

It’s rarely the initial application that’s the problem. It’s the follow-on applications. It’s the function creep. Before you know it, everyone will know that they are identified the moment they walk into a commercial building. We will all lose privacy, and liberty, and freedom as a result.

Posted on February 28, 2006 at 3:47 PM

Technological Parenting

Salon has an interesting article about parents turning to technology to monitor their children, instead of to other people in their community.

“What is happening is that parents now assume the worst possible outcome, rather than seeing other adults as their allies,” says Frank Furedi, a professor of sociology at England’s University of Kent and the author of “Paranoid Parenting.” “You never hear stories about asking neighbors to care for kids or coming together as community. Instead we become insular, privatized communities, and look for technological solutions to what are really social problems.” Indeed, while our parents’ generation was taught to “honor thy neighbor,” the mantra for today’s kids is “stranger danger,” and the message is clear — expect the worst of anyone unfamiliar — anywhere, and at any time.

This is security based on fear, not reason. And I think people who act this way make their families less safe.

EDITED TO ADD: Here’s a link to the book Paranoid Parenting.

Posted on August 3, 2005 at 8:38 AM

World Series Security

The World Series is no stranger to security. Fans try to sneak into the ballpark without tickets, or with counterfeit tickets. Food and alcohol are often prohibited from being brought into the ballpark, to enforce the monopoly of the high-priced concessions. Violence is always a risk: both small fights and larger-scale riots that result from fans of both teams being in such close proximity — like the one that almost happened during the sixth game of the AL series.

Today, the new risk is terrorism. Security at the Olympics cost $1.5 billion. $50 million each was spent at the Democratic and Republican conventions. There has been no public statement about the security bill for the World Series, but it’s reasonable to assume it will be impressive.

In our fervor to defend ourselves, it’s important that we spend our money wisely. Much of what people think of as security against terrorism doesn’t actually make us safer. Even in a world of high-tech security, the most important solution is the guy watching to keep beer bottles from being thrown onto the field.

Generally, security measures that defend specific targets are wasteful, because they can be avoided simply by switching targets. If we completely defend the World Series from attack, and the terrorists bomb a crowded shopping mall instead, little has been gained.

Even so, some high-profile locations, like national monuments and symbolic buildings, and some high-profile events, like political conventions and championship sporting events, warrant additional security. What additional measures make sense?

ID checks don’t make sense. Everyone has an ID. Even the 9/11 terrorists had IDs. What we want is to somehow check intention; is the person going to do something bad? But we can’t do that, so we check IDs instead. It’s a complete waste of time and money, and does absolutely nothing to make us safer.

Automatic face recognition systems don’t work. Computers that automatically pick terrorists out of crowds are a great movie plot device, but they don’t work in the real world. We don’t have a comprehensive photographic database of known terrorists. Even worse, face recognition technology is so faulty that it often can’t make the matches even when we do have decent photographs. We tried it at the 2001 Super Bowl; it was a failure.

Airport-like attendee screening doesn’t work. The terrorists who took over the Russian school sneaked their weapons in long before their attack. And screening fans is only a small part of the solution. There are simply too many people, vehicles, and supplies moving in and out of a ballpark regularly. This kind of security failed at the Olympics, as reporters proved again and again that they could sneak all sorts of things into the stadiums undetected.

What does work is people: smart security officials watching the crowds. It’s called “behavior recognition,” and it requires trained personnel looking for suspicious behavior. Does someone look out of place? Is he nervous, and not watching the game? Is he not cheering, hissing, booing, and waving like a sports fan would?

This is what good policemen do all the time. It’s what Israeli airport security does. It works because instead of relying on checkpoints that can be bypassed, it relies on the human ability to notice something that just doesn’t feel right. It’s intuition, and it’s far more effective than computerized security solutions.

Will this result in perfect security? Of course not. No security measures are guaranteed; all we can do is reduce the odds. And the best way to do that is to pay attention. A few hundred plainclothes policemen, walking around the stadium and watching for anything suspicious, will provide more security against terrorism than almost anything else we can reasonably do.

And the best thing about policemen is that they’re adaptable. They can deal with terrorist threats, and they can deal with more common security issues, too.

Most of the threats at the World Series have nothing to do with terrorism; unruly or violent fans are a much more common problem. And more likely than a complex 9/11-like plot is a lone terrorist with a gun, a bomb, or something that will cause panic. But luckily, the security measures ballparks have already put in place to protect against the former also help protect against the latter.

Originally published by UPI.

Posted on October 25, 2004 at 6:31 PM
