Human-Machine Trust Failures

I jacked a visitor’s badge from the Eisenhower Executive Office Building in Washington, DC, last month. The badges are electronic; they’re enabled when you check in at building security. You’re supposed to wear it on a chain around your neck at all times and drop it through a slot when you leave.

I kept the badge. I used my body as a shield, and the chain made a satisfying noise when it hit bottom. The guard let me through the gate.

The person after me had problems, though. Some part of the system knew something was wrong, and wouldn’t let her out. Eventually, the guard had to manually override something.

My point in telling this story is not to demonstrate how I beat the EEOB’s security—I’m sure the badge was quickly deactivated and showed up in some missing-badge log next to my name—but to illustrate how security vulnerabilities can result from human/machine trust failures. Something went wrong between when I went through the gate and when the person after me did. The system knew it but couldn’t adequately explain it to the guards. The guards knew it but didn’t know the details. Because the failure occurred when the person after me tried to leave the building, they assumed she was the problem. And when they cleared her of wrongdoing, they blamed the system.

In any hybrid security system, the human portion needs to trust the machine portion. To do so, both must understand the expected behavior for every state—how the system can fail and what those failures look like. The machine must be able to communicate its state, and must be able to alert the humans when a state transition doesn’t happen as expected. Things will go wrong, either by accident or as the result of an attack, and the humans are going to need to troubleshoot the system in real time—that requires understanding on both sides. Each time things go wrong and the machine portion doesn’t communicate well, the human portion trusts it a little less.
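
As a sketch of what communicating state might look like, consider a toy model of a badge-return gate, modeled as an explicit state machine that reports *why* it faulted instead of just refusing to open. This is illustrative only, not any real access-control product; all class and state names are invented.

```python
from enum import Enum, auto

class GateState(Enum):
    IDLE = auto()             # ready for the next visitor
    AWAITING_RETURN = auto()  # badge scanned, waiting for the drop slot
    FAULT = auto()            # something didn't happen as expected

class ExitGate:
    """Toy badge-return gate that surfaces human-readable fault reasons."""

    def __init__(self):
        self.state = GateState.IDLE
        self.alerts = []  # messages a guard could actually act on

    def scan_badge(self, badge_id):
        if self.state is not GateState.IDLE:
            # The previous transition never completed; say so explicitly,
            # rather than silently blaming whoever is at the gate now.
            self._fault(f"badge {badge_id} scanned while gate was {self.state.name}")
            return
        self.state = GateState.AWAITING_RETURN

    def badge_returned(self):
        if self.state is not GateState.AWAITING_RETURN:
            self._fault("return slot triggered with no badge scan pending")
            return
        self.state = GateState.IDLE  # normal exit completed

    def manual_override(self):
        # The human escape hatch from the story; it should be logged too.
        self.alerts.append(f"guard override from state {self.state.name}")
        self.state = GateState.IDLE

    def _fault(self, reason):
        self.state = GateState.FAULT
        self.alerts.append(reason)
```

In this sketch, a visitor who keeps a badge leaves the gate stuck in `AWAITING_RETURN`, so the *next* visitor triggers the fault, exactly as in the story; the difference is that the guard gets a message naming the stale state instead of a bare refusal.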

This problem is not specific to security systems, but inducing this sort of confusion is a good way to attack systems. When the attackers understand the system—especially the machine part—better than the humans in the system do, they can create a failure to exploit. Many social engineering attacks fall into this category. Failures also happen the other way. We’ve all experienced trust without understanding, when the human part of the system defers to the machine, even though it makes no sense: “The computer is always right.”

Humans and machines have different strengths. Humans are flexible and can do creative thinking in ways that machines cannot. But they’re easily fooled. Machines are more rigid and can handle state changes and process flows much better than humans can. But they’re bad at dealing with exceptions. If humans are to serve as security sensors, they need to understand what is being sensed. (That’s why “if you see something, say something” fails so often.) If a machine automatically processes input, it needs to clearly flag anything unexpected.

The more machine security is automated, and the more the machine is expected to enforce security without human intervention, the greater the impact of a successful attack. If this sounds like an argument for interface simplicity, it is. The machine design will necessarily be more complicated: more resilience, more error handling, and more internal checking. But the human/computer communication needs to be clear and straightforward. That’s the best way to give humans the trust and understanding they need in the machine part of any security system.

This essay previously appeared in IEEE Security & Privacy.

Posted on September 5, 2013 at 8:32 AM • 31 Comments

Comments

Harvey MacDonald September 5, 2013 9:32 AM

It took me a while to figure out what “jacked” means. Now my mental picture of Bruce looks like a thug every time he explains something. Depressing.

bob September 5, 2013 9:37 AM

Nice. A more sophisticated version of: If I set the alarm off 4 times and run away without trace, they’ll switch the alarm off. The 5th time, I can just walk in.

David September 5, 2013 9:47 AM

Timely story.

I had an interesting exchange with a Washington state ferry terminal gate attendant recently. I mistakenly went through the turnstile before I was authorized to do so. The turnstiles were apparently supposed to remain locked until the ferry docked, but when I scanned my ticket, it allowed me through.

The attendant stopped me, and was absolutely sure I’d “jumped” the turnstile by forcing it backward. He was good natured about it and didn’t make it a big deal (I guess I look harmless), but he definitely didn’t believe me. There are cameras everywhere, so I’m sure he could see it for himself if he chose, but the exchange ended there with a chuckle.

Even better: when I returned with my ticket at the proper time, it let me in again. The same ticket isn’t supposed to be valid twice.

NobodySpecial September 5, 2013 10:02 AM

@david – that’s the problem with allowing low-level employees autonomous decision making capabilities.

The proper response should have been to tazer you, lock down the terminal and call a SWAT team. Since you had obviously violated a Transport Security System you must be a terrorist

Chris W September 5, 2013 10:04 AM

A conundrum: did Bruce’s action cause the malfunction, or would it have malfunctioned anyway?

This can only be answered if we know more about the system in question, particularly whether it’s smart enough to detect that a badge was dropped off.

I presume that visitors and permanent badge holders both go through the same gate, but obviously the permanent badge holders don’t drop off their badges, which would require different operating procedures.
It’s more likely that the gate doesn’t operate differently for those badges, and that the drop-off box has no intelligence whatsoever (except for the guard, who checks whether visitors drop off their badges).
If that is the case, the malfunction was unrelated to Bruce ‘jacking’ a badge, and couldn’t have served as a catchy introduction for this essay. And all that because of an assumption.

ATN September 5, 2013 10:14 AM

@NobodySpecial:

The proper response should have been to tazer you, lock down the terminal and call a SWAT team.

“you” being the next person trying to exit, obviously.

Rick Auricchio September 5, 2013 10:29 AM

There’s a wonderful example of this system failure in Roberto Benigni’s 1994 comedy The Monster, where Benigni shoplifts by hiding items in other shoppers’ pockets and purses.

As they leave, the exit-alarm system keeps alerting. By the time he leaves with pockets full of stolen goods, the store employees simply wave him through.

Uhu September 5, 2013 10:29 AM

@Chris W
I thought exactly the same thing. I was recently at an IBM building where they actually had a separate turnstile for visitors, and I kept my badge as well. The person behind me didn’t have any problems going through.

sparkygsx September 5, 2013 11:04 AM

Bruce, I’m rather disappointed you didn’t attempt to read the contents of the badge. Are the returned badges re-used? Do they contain any kind of crypto, or are they simple writable tags, or even just read-only tags? If they are writable, are they re-written with long, unpredictable numbers or something similar?

Bat September 5, 2013 11:07 AM

I think this is a case of hearing hoofbeats and thinking zebras. There’s really not anything in the anecdote to suggest otherwise (it’s a real leap to assume that a container with a slot in it for badges is actually a card reader). In my experience these badges are simply consumables, while the readers are quite expensive, so the assumed exit-scanning system simply makes no sense.

SJ September 5, 2013 11:13 AM

@Chris W:

The system could depend on the change in weight of the badge-return box, after a Visitor Badge is swiped out.

Thus, the next Visitor Badge leaving causes an error, because the machine hasn’t confirmed the previous Visitor Badge entering the badge-return box.

(I base this set of observations on a system used in self-checkout at some big-box grocery stores. The buyer scans the item and is supposed to place it in a bag, or on the bagging platform. If the scales in the bagging platform don’t report that an item was placed there, the checkout terminal displays an error screen with “Please place the item in the bag.”)
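
The scan-then-weigh loop SJ describes can be sketched as a tiny state machine. This is a toy model; the class name, tolerance, and error text are made up, not a real point-of-sale API:

```python
class BaggingScale:
    """Toy model of a self-checkout that confirms each scan by weight."""

    TOLERANCE = 0.01  # kg of slack allowed between expected and measured

    def __init__(self):
        self.pending = None   # weight of a scanned-but-unbagged item
        self.total = 0.0      # confirmed weight on the platform
        self.errors = []

    def scan(self, item_weight):
        """Returns False and records an error if the last item was never bagged."""
        if self.pending is not None:
            # Same pattern as the badge box: the unconfirmed *previous*
            # step makes the *next* action look like the problem.
            self.errors.append("Please place the item in the bag")
            return False
        self.pending = item_weight
        return True

    def platform_reading(self, measured_total):
        """Called by the scale; confirms the pending item if the weight matches."""
        if self.pending is None:
            return
        if abs(measured_total - (self.total + self.pending)) <= self.TOLERANCE:
            self.total = measured_total
            self.pending = None  # confirmed
```

Scan an item and place it on the platform, and the scan is confirmed; scan again without bagging, and the following scan is refused with the familiar error.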

sparkygsx September 5, 2013 11:29 AM

Weighing the badges seems rather unlikely, if only because the chain will be much heavier than the badge itself.

Reading the badges would make sense, because that way it’s easy to record if and when a visitor has left the building, so you always know who is still in the building, can check invoices against visitor logs, etc.

It doesn’t make any sense to disguise the reader as a regular bin with a slot in the top, so it probably isn’t a reader at all.

Even if it really was a reader, I can’t really think of a plausible reason why the person behind Bruce would have been stopped, instead of Bruce himself.

I was going to invoke Occam’s razor, but on the other hand, somehow the “system” decided to stop the lady behind him. If it wasn’t responding to the badges, what did it respond to?

Gweihir September 5, 2013 11:33 AM

Bad, Bruce, bad! No cookie! 😉

Nice example of an all-too-common problem. I have had it in the other direction numerous times: I was clearly authorized to get into a high-security data center, but the computer would not let me in, so the guards eventually had to let me in through the equipment gate.

Brett September 5, 2013 12:48 PM

It doesn’t make sense to me that the person behind Bruce (Mr. Bruce now) got stopped because Bruce didn’t turn in his badge. If the exit gate had a badge reader built in, it would have stopped Bruce for having a badge when he shouldn’t have, not the next person for not having one.

But then, as it has been pointed out many times, system designers tend to get some things really wrong.

Anyway – Mr. Bruce – if you will say, what was the reason you decided to jack the card? That is really the question here.

Mike Cotton September 5, 2013 1:28 PM

It caused a problem for the person behind him, most likely, because the system tried to initiate a badge-return procedure before the previous one had been completed. There’re a bunch of different ways the system could be implemented for that to be the case, any one of which will produce that behavior.

Similarly, a large majority of syntax errors in code are reported in the statement after the actual error. There’s nothing wrong with that statement, but the malformed statement above it prevents the working code from being parsed correctly.
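
Mike Cotton’s parser point is easy to demonstrate, and where the blame lands even varies by parser version: older CPython releases reported an unclosed bracket as “invalid syntax” on a *later* statement, while recent releases (3.10+) point back at the bracket itself. A small probe, with a deliberately unclosed list:

```python
import textwrap

# The real mistake is the unclosed '[' on the first line; the second
# line is fine on its own.
SOURCE = textwrap.dedent("""\
    values = [1, 2, 3
    total = sum(values)
    """)

def first_syntax_error_line(src):
    """Return the line number the parser blames, or None if src parses."""
    try:
        compile(src, "<example>", "exec")
    except SyntaxError as err:
        return err.lineno
    return None
```

Depending on the interpreter, `first_syntax_error_line(SOURCE)` blames line 1 (“'[' was never closed”) or a later line (“invalid syntax”); either way, the machine’s report is only useful to a human who knows how to read past it.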

Daniel September 5, 2013 1:46 PM

Bruce writes, “Humans and machines have different strengths. Humans are flexible and can do creative thinking in ways that machines cannot…..”

There is an easier way of expressing the idea in this paragraph. American law draws a distinction between rules and standards. Humans are good at implementing standards but bad at following rules. Machines are good at implementing rules but go haywire when trying to follow standards.

One of the major reasons for machine-trust breakdown is that humans conceive of their task as enforcing a standard while the machine is enforcing a rule. So when the guards had a problem with the machine, they didn’t think “what rule just got broken?”; instead they thought “what standard do I need to apply to this situation?” And the first way for a guard to meet a standard is always to stop someone (either going in or going out). So they stopped the person in front of them.

Clive Robinson September 5, 2013 1:48 PM

@ Bruce,

Such “pass problems” are all too well known to Londoners from Transport for London’s Oyster card system.

One particularly problematic place, for reasons TfL have failed to explain, is Wimbledon station, where several operators, including the London Underground and the Croydon Tramlink, all converge.

When a passenger tries to “touch in” or “touch out” and the barrier gates don’t open, a two-digit code comes up in the corner of a display that is difficult to read if, heavens above, the sun is shining.

Now this is where the fun starts: the “gate droids” by and large have no idea what the numbers mean, so asking what the problem is doesn’t help, and often, rather than try to sort things out, they just open the gate and let you through…

Problem solved, you might think? Not a chance. If they let you out, the computer system used to charge you a “full fare” to the last payment zone; if they let you in, when you get to your destination it won’t let you out.

But it gets better: the system does not know train routes and connections. So if you get on the London Overground at West Croydon, go round the long way to Clapham Junction, and then take SWT to Wimbledon, the system is so confused it either charges you for a journey via Central London (expensive) or a tram fare (cheaper), based on some strange combination of what type of ticket (if any) you have on your card, the time of day, and, I suspect, which way the wind is blowing.

Until recently TfL’s default position was that it’s the card holder’s fault, for whatever excuse they could come up with, and they’d hit you with the maximum fare. However, they have made so much money out of it that some passengers started legal action for recovery, and now they have “different rules” rather than go into court and admit just how badly mucked up the Oyster card system is…

And the problems are still cropping up as regularly as a fox raiding the trash bins.

Bat September 5, 2013 2:06 PM

@sparkygsx – without more information, it really is impossible to give a reason why the gate might have failed for the person behind him. For all we know, the exit gate is manually operated by the guard, and he wanted an excuse to flirt with the woman exiting. Assuming that one has discovered the failings of a system based solely on an n=1 observation doesn’t make any sense at all.

In fact, I think it is more likely that the “electronic” badge is used in that environment the same way that a paper tag would be used – simply as a way to get everyone to sign in and out at the front desk.

mc September 5, 2013 2:38 PM

I think the issue of trust here is that people don’t always trust the people who programmed the machine. These systems are complicated, and hard to get right 100% of the time. The programmer needs very in-depth knowledge of all threats and risks, and must be meticulous about preventing them. These are not the qualities of a typical wage slave, for whom getting things done fast is often prioritized over getting things right. But the customer (of the security vendor) should demand that the system have a more verbose way of reporting what a failure was than a general alarm; that demand goes a long way toward ensuring the system is carefully built.

Black Luigi September 5, 2013 3:44 PM

Exorbitantly expensive and inordinately complex systems, built by the lowest bidder, and watched over by minimum-wage drones. What could possibly go wrong?

Secret Service - Schneier Task Force September 5, 2013 7:25 PM

Did Bruce just admit to violating 18 USC 641 (Theft of Public money, property or records) on his blog? We are coming after you, Bruce! We have convened a task force of 50 special agents from the FBI, USSS, Postal Inspectors, and the U.S. Marshals to hunt you down for your stunt! And also, we really have nothing better to do, so why not make a federal case out of this right?

Dirk Praet September 5, 2013 7:54 PM

We’ve all experienced trust without understanding, when the human part of the system defers to the machine, even though it makes no sense: “The computer is always right.”

Trust without understanding is called faith, which hardly ever makes for rational decisions. If my GPS is telling me to take a bridge that isn’t there, there is no way I’m gonna plunge my car into the river.

@ NobodySpecial

The proper response should have been to tazer you, lock down the terminal and call a SWAT team. Since you had obviously violated a Transport Security System you must be a terrorist

+1

Particular Random Guy September 6, 2013 10:14 AM

If my GPS is telling me to take a bridge that isn’t there, there is no way I’m gonna plunge my car into the river.

However, you wouldn’t be the first one by far.

Michael Sullivan September 7, 2013 1:53 AM

“The more machine security is automated, and the more the machine is expected to enforce security without human intervention, the greater the impact of a successful attack.”

What does this say about the NSA’s stated intention to automate 90% of their system administration functionality?

Jon September 9, 2013 4:08 PM

That the guard had, and knew how to use, a manual override procedure implies to me that the system fails quite frequently. That it happened just after Mr. Schneier’s visit strikes me as pure chance.

Jon

Jon September 9, 2013 4:16 PM

Furthermore, for all those accusing Mr. Schneier of theft, recall the language. They gave him the badge. It was not a rental, or a loan, it was a gift. That the guard expected everyone to give him their badge back else he’d not let them out reeks of extortion.

Jon

bob September 10, 2013 9:44 AM

The best vector to attack anything is the junction between disparate systems.

I.e., not the card reader, but the (data travelling along the) wires where it connects to the computer. Not the gate, but the hinges where it attaches to the wall. Not the ATM vault, but the phone line talking to the mainframe that authorizes the money dispenser.

This goes all the way back to medieval armies attacking the boundary between allied opponents.

Vlad September 21, 2013 2:06 PM

It is highly unlikely that there is a reader in the drop bin, because there are many badges in it and a reader would not be able to distinguish between them.

But we observe the same situation with automated test systems: when there is a failure, the primary suspect is often not the product under test but the system testing it. Lack of trust is a huge issue in cases like that.
The problem is that the lack of trust is a consequence of inaccurate results, or, in other words, of events that led to the conclusion that the machine cannot be trusted.

