Comments

jgreco December 16, 2009 12:15 PM

Great, now my identical evil twin can steal my stuff.

On a more serious note, you hear a lot about systems like this being fooled by life-sized photos. This one says it recognizes your face in 3D, so one has to wonder exactly how that works. Would a model of my head with the proper dimensions work? How hard would it be for an artist to construct a sufficiently accurate model from nothing but a few pictures of me?

I’m assuming that if you think you need a $456 lock, you have a good reason to be secure…

Alan Kaminsky December 16, 2009 12:30 PM

It says the unit has an Ethernet port. Presumably this allows the unit to be hooked up to whatever controls the door lock. So just rip the unit off the wall, plug the Ethernet cable into your laptop, and send a “face recognized” message.

Scared December 16, 2009 12:55 PM

It has two lenses on the front, so you’d have to present a photo to each. My guess would be that it’ll still focus at closer range, so small photos would do.

Now, if it asked you to change your facial expression or utter one of many phrases, then it would be hard to defeat…

Team America December 16, 2009 1:20 PM

It appears there is a lot of misunderstanding about how the system works. What it actually does is show you 16 pictures (a 4×4 grid) and you have to figure out which one is you. After 16 false tries the system’s AI will determine that you are a terrorist, show 16 pictures of Osama bin Laden, and send an email to the DHS if you choose one of them.

BF Skinner December 16, 2009 1:22 PM

Binocular pattern recognition and discrimination for under $500?

My parents always told me never trust something that thinks unless you can see where it keeps its brain.

Nik December 16, 2009 1:23 PM

Even easier: it uses the Wiegand protocol. The green wire connector at the back has two wires. I think you just have to bridge the right ones and the door opens.

felonius December 16, 2009 1:50 PM

It has the same problem all door locks have: Someone who wants in is just going to find another way in.

This is obviously not a “house lock” solution. Criminals are going to break a window, or the door frame, to get in. At $500 (please forgive my rounding), it’s not really aimed at the average consumer anyway. It’s technically a corporate security device, which isn’t really going to appeal to too many corporations. Really, who wants to go through the kind of work it would take to use this regularly?

I’m sure it’s fine for the “time attendance” aspect, but ya gotta wonder if they’re charging more on the back end on top of that for software licensing…

RT December 16, 2009 2:09 PM

“Even easier: it uses the Wiegand protocol. The green wire connector at the back has two wires. I think you just have to bridge the right ones and the door opens.”

Ah, no… Wiegand is 26-bit (or higher) serial data. It uses three wires (Data0, Data1, and ground). Just shorting the connector won’t work.
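To illustrate RT’s point that Wiegand carries structured serial data rather than a simple contact closure, here is a minimal decoder sketch for the common 26-bit format (the widely used H10301 layout is assumed: a leading even-parity bit over the first 12 data bits, an 8-bit facility code, a 16-bit card number, and a trailing odd-parity bit). Shorting the wires produces nothing a controller would accept as a valid frame.

```python
def decode_wiegand26(bits):
    """Decode a 26-bit Wiegand frame into (facility, card).

    Assumed layout (common H10301 convention):
      bit 0      : even parity over bits 1-12
      bits 1-8   : 8-bit facility code
      bits 9-24  : 16-bit card number
      bit 25     : odd parity over bits 13-24
    """
    if len(bits) != 26:
        raise ValueError("expected a 26-bit frame")
    # Parity bit 0 plus the 12 bits it covers must sum to an even count.
    if sum(bits[0:13]) % 2 != 0:
        raise ValueError("even parity check failed")
    # Parity bit 25 plus the 12 bits it covers must sum to an odd count.
    if sum(bits[13:26]) % 2 != 1:
        raise ValueError("odd parity check failed")
    facility = int("".join(map(str, bits[1:9])), 2)
    card = int("".join(map(str, bits[9:25])), 2)
    return facility, card
```

For example, a frame encoding facility code 100 and card number 1234 decodes cleanly, while a frame of all zeros (or all ones, as a stuck/shorted line would produce) fails the parity checks.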

That said, I wonder what the False Acceptance Rate is versus the False Rejection Rate. You’d absolutely have to have some sort of bypass key; biometric gun safes have them just in case. Plus, the lock is only as good as the construction of the door, and frankly most doors and frames won’t stand up to much.
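The tradeoff RT is asking about can be sketched numerically. Biometric matchers typically produce a similarity score and compare it to a threshold; FAR and FRR are then just the acceptance rate over impostor attempts and the rejection rate over genuine attempts. The scores below are hypothetical, purely for illustration:

```python
def far_frr(impostor_scores, genuine_scores, threshold):
    """Estimate False Acceptance Rate and False Rejection Rate at a threshold.

    Scores at or above the threshold count as a match. FAR is the fraction
    of impostor attempts accepted; FRR is the fraction of genuine attempts
    rejected. Raising the threshold lowers FAR but raises FRR, and vice versa.
    """
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Hypothetical match scores in [0, 1]:
impostors = [0.1, 0.4, 0.8, 0.2]
genuine = [0.9, 0.7, 0.6, 0.3]
print(far_frr(impostors, genuine, 0.5))  # one error of each kind here
```

A vendor has to pick the threshold somewhere on this curve, which is exactly why a bypass key (for the false rejections) matters as much as the false acceptances.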

Jonathan December 16, 2009 2:15 PM

Yeah, as a door lock, this may not be effective, but as a time clock device, it might be worth it. I worked for a theme park way back in the day – large numbers of low-paid employees with lots of different schedules. One department pulled a big time-clock scam with everybody memorizing everybody else’s punch numbers and surreptitiously clocking in their buddies a couple hours early, or on their days off.

It was eventually caught by audit, but it’s hard to audit effectively and rapidly in such a chaotic scheduling environment. The company lost tens of thousands of dollars, because they weren’t able to prove that those employees hadn’t genuinely worked that day.

After that incident, they installed a hand geometry reader on the time clock. This has the advantage of being touch-free (faster, more sanitary) and potentially more accurate.

Timmy303 December 16, 2009 2:59 PM

And the FIRST time I came home after getting in a fight I’d have to break into my house (again) and have the police called on me (again) and my wife would have to bail me out (again) and then it would be brought up at Christmas dinner (again) and nobody would believe me that it was a machine’s fault (again) …

clvrmnky December 16, 2009 3:36 PM

This was totally on a TV show I watched last night. “Burn Notice” had a guy hold a photocopy of an acceptable person’s face to the face recognition security system to gain entry.

Isn’t there similar tech (similarly fooled) used to control who can buy cigs and alcohol from vending machines in Japan?

Nick P December 16, 2009 6:08 PM

“Extensive consumer surveys show that individuals prone to getting sunburns overwhelmingly opposed this product.” (source: manufactured, but believable)

Xyz December 16, 2009 6:57 PM

Anyone else notice the Engrish on-screen instructions? “Welcome On Duty! Adjust your actions!” And that impressive radar-screen graphic overlay; it’s like watching CSI!

I’m guessing the numpad is for the bypass key.

Ben Hutchings December 16, 2009 7:53 PM

If you can’t fool it with a photograph, you probably don’t need to rip it off the wall – there’s a convenient USB port on the bottom so you can exploit any bugs found in its USB stack.

Scott K December 17, 2009 12:32 AM

I worked for a company that occasionally installed, and used as their own time clock, an iris scanner. Not tied to a door lock in the instances I saw, but I’d trust those a lot more than facial recognition.

sortkatt December 17, 2009 1:25 AM

But, but, it says so right in the ad:
“…perfect for […] companies who simply wish to impress their clients”.

It’s not meant to keep out someone who really wants in; it’s meant to impress gullible clients. Which it probably will manage quite effectively.

W December 17, 2009 2:05 AM

“What a coincidence. 456 is the passcode on my doorlock!”
You shouldn’t have posted that here. This blog doesn’t use SSL, so your data is not safe.

TexasDex December 17, 2009 8:02 AM

sortkatt hit the nail on the head. The sole purpose of this device is to look awesome for visiting clients. I remember interviewing with SunGARD and they took me to tour the server room. All the doors had thumbprint locks and I thought it was pretty awesome. I realized later (like, a while later) that the actual security it provided was eclipsed by the impression of security it gave prospective clients.

BF Skinner December 17, 2009 9:01 AM

“The “time attendance” aspect”

is not all that trivial, and depending on what’s at stake, timekeeping can have a high integrity objective. DCAA has removed contractors and demanded refunds of all monies paid to companies that fail to adequately protect their billing systems from fraud.

felonius December 17, 2009 10:12 AM

@BF Skinner:

I wasn’t trying to be disparaging toward the timekeeping aspect, nor was I trying to downplay its importance. My big question is actually whether or not they’re charging for the software solution that would surely tie in to this system. Traditional punch clocks tend to hover around $100 to $150, and some of the “better” clocks get up to the $400s pretty easily. I’d almost expect something like this to run a little higher than they’re charging, so I’m wondering if they’re “compensating” on the other end via software licensing fees.

AppSec December 17, 2009 10:24 AM

@W:

Yes it does… just change the URL to HTTPS and you’ll see that the comment form posts over HTTPS 🙂

Curt Sampson December 17, 2009 10:47 PM

It seems to me that, with the addition of a camera monitoring the area where this is used, facial recognition false positives from a photograph aren’t a big issue. Anybody who cheats that way will be caught when you see them on the monitoring system’s video holding up a photograph to the device.

Whether you’d really want to use the door unlock functionality, I’m not sure. Perhaps to unlock one of two locks on the door.

dozer December 18, 2009 8:38 AM

JT and DJ, don’t worry about giving out your door codes. Everyone can see which numbers are dirty and worn, but you’ve already tricked them, because they believe such locks use four digits and will try doubling one of yours. Women wearing makeup may have some problems: cameras only see light and dark. Television personalities wear makeup to brighten the ridge of the nose and darken the sides to increase contrast, so if you have dark bags under your eyes, some light makeup will change the camera’s perspective. All those arbitrary points of measurement are never actually accurate, since each of the points they’re based on is once again a reflection of light, which can be manipulated. General facial recognition also has to cope with places where lighting conditions change constantly, and the measured points are not in exactly the same place every time you face the camera.

Roger Fleming December 18, 2009 5:57 PM

@Curt Sampson:

It seems to me that, with the addition of a camera monitoring the area where this is used, facial recognition false positives from a photograph aren’t a big issue. Anybody who cheats that way will be caught when you see them on the monitoring system’s video holding up a photograph to the device.

Sorry Curt, this doesn’t make a lot of sense. There are two possibilities here: audit after the fact is not sufficient to prevent harm (e.g. a masked person has broken in and made off with valuable goods), or audit after the fact is sufficient to (perhaps partially) prevent harm (e.g. someone can be fired for “clocking on” his co-worker before he really arrived.)

When audit does not recover the damages, placing cameras is almost useless. It will be able to exonerate the person whose likeness was used, but not much more. (And actually, you can’t even be completely sure of exonerating that person: maybe he used a photograph of his own face, knowing the cameras would appear to exonerate him! )

On the other hand, if audit after the fact is used to deter or recover from damage, then the expensive facial recognition gizmo serves no purpose; you might as well just have the employees writing their times in a logbook by hand, as the real security comes from the Mk I human eyeball doing much more sophisticated facial recognition on the recordings.

By the way, fooling the system with photographs isn’t a “false positive”; it’s an attack. A false positive is when some random person accidentally reads as an authorised user, without any malice aforethought. And yes, with facial recognition, false positives are a big issue. But the biometrics industry has had so much trouble coming up with an affordable gadget that works at all that they have spent very little effort on actual security research against deliberate, malicious attacks. Unsurprisingly, many cheap, simple, obvious methods to subvert their systems have been found to work. Like photographs.

It is true that monitoring the biometric reader makes most of these attacks much harder to carry out. But that requires monitoring with immediate response — that is, a guard presence — not just auditing after the fact. And having a human guard monitoring each biometric reader completely destroys their touted advantages. This is one of several reasons why many readers of this blog consider most biometric security systems to be snake oil.

Curt Sampson December 21, 2009 3:14 AM

@Roger:

Perhaps I wasn’t clear: I’m talking about using this as a time clock. In that light, a lot of your comments don’t make sense at all. (For example, we’re incriminating, not exonerating, the person whose image was used, as they’re the one that’s profiting from the use of that image by being paid while not actually being there.) If you first reason how a time card system must work, and what the costs are, and then fit this device into it, I think you’ll find it’s starting to look like a pretty good fit.

Thinking about it further myself, I now realise that the lock is an important part of this system: that’s how you try to compel the entering employee to clock in. This will reveal a friend clocking in an employee who then arrives an hour later. Of course, you also need a lock to compel clocking out.

For my use of the term “false positive,” see http://en.wikipedia.org/wiki/False_positive#Statistical_error:_Type_I_and_Type_II .

Roger December 21, 2009 5:20 AM

@Curt Sampson:

Perhaps I wasn’t clear: I’m talking about using this as a time clock. In that light, a lot of your comments don’t make sense at all.

Sorry, no. Use of the system as a time clock falls under my case 2 (“if audit after the fact is used to deter or recover from damage”) and in fact is the specific example I was using.

For example, we’re incriminating, not exonerating, the person whose image was used, as they’re the one that’s profiting from the use of that image by being paid while not actually being there.

Yes indeed; the monitoring does prevent cheating in that case. [1] BUT then the expensive facial recognition gadget plays no role in the security of the system. With the CCTV monitoring, the system would work just as well with employees simply writing their name and arrival time by hand in a logbook within the camera’s field of view. (Or waving a simple barcode in front of a scanner, if you want machine-readable data.)

For my use of the term “false positive,” see http://en.wikipedia.org/wiki/… .

I’m aware of that, and my point is not inconsistent with it. That discussion is about statistical errors, i.e. random events.

Some writers have carelessly used the various phrases (Type I and Type II errors, false positive rate and false negative rate, sensitivity and specificity) when referring to deliberate deception, but one should not, as those terms refer to random events, and attacks are not random events.

The distinction is by no means semantic: as powerful and useful as that sort of analysis is, the maths becomes seriously incorrect when you allow the possibility that errors do not simply occur randomly at some set of rates, but are introduced intelligently and maliciously to deceive you. That is (one reason) why statisticians are so often fooled by “psi” experiments, and why very competent engineers often produce truly lousy security systems.
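A toy calculation (with a hypothetical false acceptance rate, purely for illustration) makes the distinction concrete. Under the random-error model, an impostor’s attempts are independent trials, so the expected number of tries until acceptance follows a geometric distribution; a deliberate photograph attack is not a trial in that model at all:

```python
def expected_tries(per_attempt_accept_prob):
    """Expected number of independent attempts until the first acceptance,
    assuming each attempt succeeds with the given probability (the mean
    of a geometric distribution). This is the random-impostor model."""
    return 1.0 / per_attempt_accept_prob

# Hypothetical numbers: a matcher with FAR = 0.1% holds off a *random*
# impostor for ~1000 tries on average. An attacker whose photograph the
# matcher reliably accepts needs exactly one try -- the acceptance is no
# longer a random event, so the rate model says nothing about it.
random_impostor_tries = expected_tries(0.001)
photo_attacker_tries = 1  # a deliberate attack, not an error rate
```

This is Roger’s point in miniature: the statistics describe accidents, not adversaries.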


  1. Almost; it would actually still permit “joe jobbing.” In fact, the barcode version would be marginally more resistant to joe jobbing, as a false collaborator would have to somehow obtain the real card to photocopy it.
