Fooling Facial Recognition Systems

This is some interesting research. You can fool facial recognition systems by wearing glasses printed with elements of other people's faces.

Mahmood Sharif, Sruti Bhagavatula, Lujo Bauer, and Michael K. Reiter, "Accessorize to a Crime: Real and Stealthy Attacks on State-of-the-Art Face Recognition":

ABSTRACT: Machine learning is enabling a myriad innovations, including new algorithms for cancer diagnosis and self-driving cars. The broad use of machine learning makes it important to understand the extent to which machine-learning algorithms are subject to attack, particularly when used in applications where physical security or safety is at risk. In this paper, we focus on facial biometric systems, which are widely used in surveillance and access control. We define and investigate a novel class of attacks: attacks that are physically realizable and inconspicuous, and allow an attacker to evade recognition or impersonate another individual. We develop a systematic method to automatically generate such attacks, which are realized through printing a pair of eyeglass frames. When worn by the attacker whose image is supplied to a state-of-the-art face-recognition algorithm, the eyeglasses allow her to evade being recognized or to impersonate another individual. Our investigation focuses on white-box face-recognition systems, but we also demonstrate how similar techniques can be used in black-box scenarios, as well as to avoid face detection.
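The core idea in the paper is to run a gradient-based attack on the classifier while restricting the perturbation to an eyeglass-shaped region of the image, so the result can actually be printed and worn. Here's a minimal toy sketch of that mask-restricted optimization, assuming a stand-in linear "classifier" rather than the deep network the authors attack (all names and shapes here are illustrative, not the paper's code):

```python
import numpy as np

# Toy sketch: a linear scorer rates a flattened 8x8 grayscale "face" for a
# target identity. We perturb ONLY pixels inside an eyeglass-shaped mask,
# nudging the score toward the target -- the same mask-restricted idea the
# paper applies to a real face-recognition network.

rng = np.random.default_rng(0)
IMG = 8
w = rng.normal(size=IMG * IMG)        # weights of the toy target-class scorer
face = rng.uniform(0, 1, IMG * IMG)   # the attacker's face image (flattened)

# Eyeglass mask: a horizontal band across the "eye" rows only.
mask = np.zeros((IMG, IMG))
mask[2:4, :] = 1.0
mask = mask.ravel()

def target_score(x):
    return float(w @ x)

# Gradient ascent on the target-identity score, restricted to masked pixels.
x = face.copy()
for _ in range(50):
    grad = w                       # d(score)/dx for a linear scorer is just w
    x = x + 0.05 * grad * mask     # update only the eyeglass region
    x = np.clip(x, 0.0, 1.0)       # keep pixel values printable/valid
```

After the loop, `target_score(x)` is higher than `target_score(face)`, while every pixel outside the mask is untouched. The paper adds further constraints (printability of colors, robustness to pose) on top of this basic scheme.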

News articles.

Posted on November 11, 2016 at 7:31 AM • 11 Comments

Comments

Wm • November 11, 2016 11:53 AM

States like CA, NY, and MA will know just what to do;
Wearing fake glasses:
First degree felony, punishable by 10 years in prison, $50,000 fine.

Curious • November 11, 2016 12:59 PM

Heh, I don't understand the Boing Boing article there. If there is something in the images, I don't see it. :|

Ted • November 11, 2016 1:43 PM

Since 2014, NTIA has been conducting meetings on privacy and facial recognition technology. In June 2016, a privacy best practices guideline was released.

To summarize, these recommendations apply to covered entities who collect, store, or process facial template data. They do not, however, apply to security applications (e.g., loss prevention), law enforcement, national security, intelligence, or military agency uses of the technology, which are beyond the scope of this guidance.

The best practices encourage covered entities to make their facial recognition policies available to consumers. However, as outlined, these recommendations do not apply to aggregate, non-identifying data, for example, the use of the technology to count visitors or to determine the ages or genders of consumers.

Here’s a link to NTIA’s ongoing multi-stakeholder process meetings for facial recognition technology, including the recent privacy best practices document, and stakeholder comments.

https://www.ntia.doc.gov/other-publication/2016/privacy-multistakeholder-process-facial-recognition-technology

In late June, a coalition of 45 organizations wrote a letter to Congress requesting an oversight hearing regarding “The FBI’s Use of Facial Recognition and Proposal to Exempt the Bureau’s Next Generation Identification Database from Privacy Act Obligations.”

https://epic.org/privacy/fbi/NGI-Congressional-Oversight-Letter.pdf

According to documents obtained by EPIC, the FBI's NGI database "accepted a 20% error rate for facial recognition matches."

Chev • November 11, 2016 5:11 PM

Apparently the funny colored printouts glued to eyeglass frames work because that's how facial recognition software converts a person's face into computer data.

In other words, those weird colors are how your face will look to the computer after being run through facial recognition software. The researchers just pre-rendered this data, stuck it onto glasses frames, and fed this "false data" into the computer program.

What those funny colors probably make the computer think it's seeing is depth. For example, the nose and cheeks stick out further than the recessed eye sockets.

I imagine this has to do with facial recognition software trying to translate a 2D image (picture) into a 3D model. The glasses are feeding false 3D data to the software.

That's my best guess anyway. A very interesting article, Bruce. Thanks for sharing!

Robert • November 12, 2016 9:17 PM

Or, you can just jam the camera taking the picture using an array of small, bright infrared LEDs.

Nicki Halflinger • November 13, 2016 5:57 PM

Robert: I've sideways wondered how those would work on license plate readers.

A Nonny Bunny • November 18, 2016 2:39 PM

@Robert

Or, you can just jam the camera taking the picture using an array of small, bright infrared LEDs.
That won't make you look like someone else to a face recognition system; it'll just blind it. So it won't get you into Wuzhen for free ( https://www.newscientist.com/article/2113176-chinese-tourist-town-uses-face-recognition-as-an-entry-pass/ )

@Chev

In other words, those weird colors are how your face will look to the computer after being run through facial recognition software.
It's more that the processing transforms your face and that pattern into representations that look similar.
The face recognition tries to extract features at a number of levels (it's probably a deep neural network), and the patterned glasses screw this process up by overstimulating specific features to point it in the wrong direction.
On still images, you can even do this by altering the image in a way that's unnoticeable to a human.
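That last point is the classic "adversarial example" trick. A minimal sketch of the idea, in the style of the fast gradient sign method (and again using a toy linear stand-in for the classifier, not a real face-recognition model): shift every pixel by a tiny epsilon in the direction that reduces the true-class score, so the total change is invisible but the score moves.

```python
import numpy as np

# Hedged toy sketch of an imperceptible adversarial perturbation: for a
# linear scorer w @ x, the gradient of the score w.r.t. the image is just w,
# so stepping each pixel by -eps * sign(w) lowers the score as much as any
# perturbation bounded by eps per pixel can.

rng = np.random.default_rng(1)
w = rng.normal(size=64)           # weights of the toy true-class scorer
x = rng.uniform(0, 1, 64)         # a "clean" image, flattened
eps = 0.01                        # small enough to be invisible to a human

def score(v):
    return float(w @ v)           # toy classifier confidence for the true class

x_adv = x - eps * np.sign(w)      # step every pixel against the true class
```

No pixel moves by more than 0.01, yet `score(x_adv)` is strictly lower than `score(x)`. Real attacks on deep networks work the same way, just with the gradient computed by backpropagation instead of read off a weight vector.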

Leave a comment

Allowed HTML: <a href="URL"> • <em> <cite> <i> • <strong> <b> • <sub> <sup> • <ul> <ol> <li> • <blockquote> <pre>

Photo of Bruce Schneier by Per Ervland.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.