Real-World Access Control

Access control is difficult in an organizational setting. On one hand, every employee needs enough access to do his job. On the other hand, every time you give an employee more access, there’s more risk: he could abuse that access, or lose information he has access to, or be socially engineered into giving that access to a malfeasant. So a smart, risk-conscious organization will give each employee the exact level of access he needs to do his job, and no more.

Over the years, there’s been a lot of work put into role-based access control. But despite the large number of academic papers and high-profile security products, most organizations don’t implement it—at all—with the predictable security problems as a result.

Regularly we read stories of employees abusing their database access-control privileges for personal reasons: medical records, tax records, passport records, police records. NSA eavesdroppers spy on their wives and girlfriends. Departing employees take corporate secrets with them.

A spectacular access-control failure occurred in the UK in 2007. An employee of Her Majesty’s Revenue & Customs had to send a couple of thousand sample records from a database on all children in the country to the National Audit Office. But it was easier for him to copy the entire database of 25 million people onto a couple of discs and put them in the mail than to select out just the records needed. Unfortunately, the discs got lost in the mail, and the story was a huge embarrassment for the government.

Eric Johnson at Dartmouth’s Tuck School of Business has been studying the problem, and his results won’t startle anyone who has thought about it at all. RBAC is very hard to implement correctly. Organizations generally don’t even know who has what role. The employee doesn’t know, the boss doesn’t know—and these days the employee might have more than one boss—and senior management certainly doesn’t know. There’s a reason RBAC came out of the military; in that world, command structures are simple and well-defined.

Even worse, employees’ roles change all the time—Johnson chronicled one business group of 3,000 people that made 1,000 role changes in just three months—and it’s often not obvious what information an employee needs until he actually needs it. And information simply isn’t that granular. Just as it’s much easier to give someone access to an entire file cabinet than to only the particular files he needs, it’s much easier to give someone access to an entire database than only the particular records he needs.

This means that organizations either over-entitle or under-entitle employees. But since getting the job done is more important than anything else, organizations tend to over-entitle. Johnson estimates that 50 percent to 90 percent of employees are over-entitled in large organizations. In the uncommon instance where an employee needs access to something he normally doesn’t have, there’s generally some process for him to get it. And access is almost never revoked once it’s been granted. In large formal organizations, Johnson was able to predict how long an employee had worked there based on how much access he had.

Clearly, organizations can do better. Johnson’s current work involves building access-control systems with easy self-escalation, audit to make sure that power isn’t abused, violation penalties (Intel, for example, issues “speeding tickets” to violators), and compliance rewards. His goal is to implement incentives and controls that manage access without making people too risk-averse.
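
Johnson’s proposal is concrete enough to sketch. Here is a minimal Python illustration of self-escalation with an audit trail; every name in it is a hypothetical illustration of the idea, not Johnson’s actual design:

    # Minimal sketch of self-escalation with an audit trail, along the
    # lines Johnson describes. All names are hypothetical illustrations.
    import time

    class EscalationLog:
        def __init__(self):
            self.entries = []

        def grant(self, user, resource, reason):
            # Grant immediately (getting the job done comes first), but
            # record who, what, why, and when for later review.
            entry = {"user": user, "resource": resource,
                     "reason": reason, "time": time.time()}
            self.entries.append(entry)
            return entry

        def review(self, since):
            # Auditors periodically review escalations; abuse earns a
            # "speeding ticket" rather than being blocked up front.
            return [e for e in self.entries if e["time"] >= since]

    log = EscalationLog()
    log.grant("alice", "payroll-db", "covering for bob during month-end close")
    for e in log.review(since=0):
        print(e["user"], "->", e["resource"], ":", e["reason"])

The design choice is the essay’s point in miniature: the control is cheap and never blocks work, and the deterrent comes from review and penalties rather than prevention.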

In the end, a perfect access control system just isn’t possible; organizations are simply too chaotic for it to work. And any good system will allow a certain number of access control violations, if they’re made in good faith by people just trying to do their jobs. The “speeding ticket” analogy is better than it looks: we post limits of 55 miles per hour, but generally don’t start ticketing people unless they’re going over 70.

This essay previously appeared in Information Security, as part of a point/counterpoint with Marcus Ranum. You can read Marcus’s response here—after you answer some nosy questions to get a free account.

Posted on September 3, 2009 at 12:54 PM

Comments

christian September 3, 2009 1:37 PM

Ah, I remember the good old times when I was working in IT support for a big company. Me and a couple of other 20-somethings (or younger) had, for support purposes, complete access to all the files on the network shares. More access than the management itself had.

For physical access it was easy too: Just visit a 1-day first aid course and you got a badge which worked on every door.

The worst thing here: We were so low in the hierarchy, no-name working drones, that we had zero emotional connection to the job or the company. We didn’t care if we abused our privileges and lost our jobs… We were probably the most dangerous potential security risks in the entire organisation.

HJohn September 3, 2009 1:43 PM

I’ve spent most of my career auditing. I’m a Certified Information Systems Auditor (CISA) and Certified Internal Auditor (CIA). While I (obviously) think my role is critical in detecting abuse and deterring it through my work, I am still amazed how many of my colleagues have a hard time grasping trade-offs.

Common buzz phrases are “least access required” and “least privilege.” Though an auditor, understandably, errs on the side of the most secure or restrictive option, I’ll blow the whistle on some of my brethren and say that any restriction needs to be weighed against its cost, both in dollars and in productivity.

Is hiring an extra staff person really worth it to further segregate this function? Is the reduction in this person’s productivity really worth what we are trying to prevent?

In many of these cases, the answer is yes: sometimes the risks are significant enough to justify reduced productivity.

In many cases, the answer is no, and this is why some in my profession unwittingly do more harm.

2 cents.

partdavid September 3, 2009 1:44 PM

I believe I’m with the author. In a company that can make employees care, accountability seems to be much more important than prospective access control. Prospective access control means knowing exactly what access you need to do everything you might need to do, before day 1. That’s unrealistic, and it’s why so many organizations live with a very annoying hybrid of “under-” and “over-”entitlement: lots of people have “too much” access, while access control continues to be at the very least an annoyance to people who have to actually get things done.

Easy self-escalation is probably one way to fight this, but I think it goes to the heart of a mistaken way of thinking about security. The idea that people need only the “minimum privileges” required to “do their job” implies that you have a perfect understanding of what their job is, and that that job is the best one they could be doing for your organization.

That’s just not true. “Minimum privilege” would be like hiring people with “minimum capabilities” that only “know enough” to “do their job.” The latter philosophy is obviously absurd, so why do we invest so much in the assumption that the former is also always correct? Organizations get a lot of value out of star performers with above-average capability, and a lot of value from the serendipity of people who can go beyond “their job” and do things differently or better.

For example, I’m a software developer. I have access to some production systems, which is a violation of “minimum privileges” because I don’t “need” access to them to do my job (of developing new software). Yet in some circumstances my being able to directly examine the systems’ operation, even to modify it on the fly, results in vastly improved service.

We see this also in our internal wiki management. In general, all our documentation pages are open not only for viewing, but editing, by any employee. Why should we deny ourselves the value of a reader who can correct a section number or grammatical error? No one can anonymously edit or even view documentation pages, and I’m confident that ordinary corporate standards of accountability are sufficient to “protect” that content, except in particular special cases.

HJohn September 3, 2009 1:47 PM

I like the speeding ticket analogy. I’d also apply the logic to speed limits themselves:

Controls are like speed limits: if too restrictive, you’ll never get where you are going. If too lax, you’re more likely to get killed along the way.

nick September 3, 2009 1:47 PM

Managing least privilege or role-based access control is so expensive to actually do, it’s cheaper to just give out access like candy and accept the risk that someone may abuse it.

Marc September 3, 2009 1:55 PM

The best role-based access control I have seen allows users to have multiple roles, which they select from at login (with a default role specified). This prevents role scope creep as jobs evolve or absorb other roles. It also allows people to log in with the minimum functionality required to do their job – every user I have worked with liked this feature, as it prevented them (especially super users) from doing things inadvertently. Very few systems have this feature; I don’t know why.
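
A sketch of how that login-time role selection might work; in NIST RBAC terms, a session activates one of the user’s assigned roles. The roles and permissions here are invented for illustration:

    # Sketch of login-time role selection: a user may hold several roles
    # but activates only one per session. All names are illustrative.
    ROLE_PERMISSIONS = {
        "clerk":      {"read_orders"},
        "supervisor": {"read_orders", "approve_orders"},
        "admin":      {"read_orders", "approve_orders", "delete_orders"},
    }

    class User:
        def __init__(self, name, roles, default_role):
            assert default_role in roles
            self.name, self.roles, self.default_role = name, roles, default_role

    def login(user, role=None):
        # Default to the everyday role; super users must explicitly opt
        # in to their powerful role, which prevents inadvertent damage.
        role = role or user.default_role
        if role not in user.roles:
            raise PermissionError(f"{user.name} does not hold role {role!r}")
        return ROLE_PERMISSIONS[role]

    carol = User("carol", roles={"clerk", "admin"}, default_role="clerk")
    print(login(carol))           # everyday session: clerk permissions only
    print(login(carol, "admin"))  # explicit opt-in to the powerful role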

Billy Crook September 3, 2009 1:55 PM

My advice is that at the moment any person is granted access to any resource, an expiration time is set for that grant. This will cause each grant to be reviewed on a regular basis for necessity. Auditing can show how frequently such access is used, and managers can assess the need to renew it during yearly employee evaluations.
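
A minimal sketch of such expiring grants; the names and the 90-day default are illustrative assumptions:

    # Every grant carries an expiry; access lapses unless actively renewed.
    from datetime import datetime, timedelta

    grants = {}  # (user, resource) -> expiry time

    def grant(user, resource, days=90):
        grants[(user, resource)] = datetime.now() + timedelta(days=days)

    def has_access(user, resource):
        expiry = grants.get((user, resource))
        return expiry is not None and datetime.now() < expiry

    def renew(user, resource, days=90):
        # Called during periodic review (e.g., yearly evaluations), and
        # only if audit logs show the access is actually being used.
        grant(user, resource, days)

    grant("dave", "hr-share", days=90)
    print(has_access("dave", "hr-share"))  # True until the grant lapses

This inverts the failure mode the essay describes: instead of access never being revoked, revocation is the default and retention requires a decision.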

HJohn September 3, 2009 2:00 PM

Three answers to the question “will they need access?”
1. Yes. – No-brainer: give it to them.
2. Never. – No-brainer: prevention is more cost-effective than detection.
3. Maybe. – This is where detection is usually more cost-effective than prevention.

Three types of access risk (the DAD of information security):
1. Disclosure – what is the risk and potential impact of someone viewing information they should not view?
2. Alteration – what is the risk and potential impact of someone changing information that should not be changed?
3. Destruction – what is the risk and potential impact of someone destroying information that should not be destroyed?

The controls (preventative or detective) need to address the needs, the risk, and the impact. The Yes, Never, and Maybe apply not just to the data as a whole, but to the level of access to the data. Don’t let them read what they can never see, don’t let them alter what they only need to read, etc. And always have a backup plan should the wrong thing happen (rollback, backups, history, etc.).
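
To make the structure concrete, here is a toy encoding of that decision logic; the categories are HJohn’s, the function itself is just an illustration:

    # Toy encoding of the yes/never/maybe decision. "risk_tolerable" asks
    # whether we can afford to detect a bad event after the fact.
    def choose_control(need, risk_tolerable):
        if need == "yes":
            return "grant"            # no-brainer
        if need == "never":
            return "prevent"          # prevention beats detection
        # "maybe": detection is usually more cost-effective than
        # prevention, unless the potential impact is too great.
        return "grant + audit" if risk_tolerable else "prevent"

    print(choose_control("maybe", risk_tolerable=True))   # grant + audit
    print(choose_control("maybe", risk_tolerable=False))  # prevent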

Sorry for the ObviousSecurityLessons 101, but I have to suspect some managers read this and don’t think in the terms most of us seasoned security folks do.

Impossibly Stupid September 3, 2009 4:41 PM

I’m going with the “auditing is more important than control” camp. Most of the time a trusted individual is not going to betray you if you give them incrementally more access than normal. But if anything ever goes down, you had better have a way to tell if you were screwed by one person or by some other person who had access at the same level.

Derek September 4, 2009 6:10 AM

The scary thing about the HMRC story is that “copy the whole database to a CD and send it in the post” used to be (and maybe still is) the government-approved procedure for sending data to the auditors.

Typically, the less-than-computer-literate manager of a regional department has instructions to insert the CD, press a button, wait for the CD to pop out again, then stick it in the mail.

I have not heard of any change to this procedure after the story broke when a disc went missing.

Steven Hoober September 4, 2009 9:09 AM

In my experience, role-based security (technical, vs. physical access control) is not that hard to design into a system. Take some time to learn about the users and their actual roles in the job the system supports (vs. the technical system), and design a workflow-based structure that acts like the job they perform, granting levels of access to secure records not just per person, but only as needed as part of a traceable process.

However, I have never actually gotten one implemented. Poor understanding and stupid status-quo security practices mean that anything new and (perceived as) complex gets tossed when development (as always) is resource- or time-constrained.

When (high-profile… this was for a US telecom) breaches occur, the answer is MORE security, in the sense of longer passwords, or more of them, and adding needless and annoying layers for the end users. Yes, the ones who cannot violate system protocols themselves. Ad hoc modeling of the breaches by me indicates (as mentioned in the article) that even most social engineering attacks fail against a well-designed access control scheme.

But no one with a checkbook thinks that hard about the problem, or reads blogs like this.

David September 4, 2009 9:15 AM

@Christian: There’s always going to be the need for low-level people with access (something like the “Sons of Martha” from the Kipling poem). Ultra-secret labs need janitors and plumbers, databases need administrators, and so on. It’s real easy to ignore these people and/or ensure that they have no emotional loyalty to the enterprise.

I don’t think this is a role problem, although it’s potentially a very large one. A role issue would be something like whether, as a software developer, I can access production data or put code into the production release. Basically, I can do a better job with more access, but I don’t actually need it, and it’s more of a security risk.

@Derek: Thing is, it wouldn’t take much of a change to the procedure. Copy the whole database to a CD or DVD, but add an encryption step. That introduces almost no complication, and makes a tremendous difference if the CD or DVD is lost or intercepted.
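
For instance, a sketch of that encryption step in Python, using the third-party cryptography package (Fernet, authenticated symmetric encryption); the filenames are illustrative, and the key must travel by a separate channel from the disc:

    # One encryption step before the data leaves the building.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # send this via a separate channel
    f = Fernet(key)

    with open("records.db", "rb") as src:
        ciphertext = f.encrypt(src.read())
    with open("records.db.enc", "wb") as dst:
        dst.write(ciphertext)        # this file goes on the disc

    # A lost disc now reveals nothing without the key.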

HJohn September 4, 2009 12:44 PM

Impossibly Stupid: “I’m going with the “auditing is more important than control” camp. Most of the time a trusted individual is not going to betray you if you give them incrementally more access than normal. But if anything ever goes down, you had better have a way to tell if you were screwed by one person or by some other person who had access at the same level.”


In many situations, detective controls are cheaper and simpler to implement. There is also a hidden advantage: you stand a better chance of actually identifying an unethical employee, rather than merely blocking them and never finding out.

Of course, this is a case-by-case decision and not always appropriate. Sometimes the cost of an incident is too great to let it happen. If you fire someone for unauthorized access to data, they may walk out the door with social security numbers or something else sensitive, and you are still quite culpable.

Jack September 4, 2009 5:53 PM

Hi all,

Why do organizations need this kind of access control, when the same employees, once they go home, need nothing this complicated to protect their own possessions and secrets? Every person or family has possessions they treasure and guard with passion.

But surprisingly, when they step inside their employer’s premises, they are subjected to this kind of control. According to the book “The HP Way”, HP once had such controls but lifted them, replacing them with respect for and trust in their employees, and was amazed to see productivity improve. Not the hollow publicity-stunt stuff.

Many employees do not care about or value the materials they handle, because employers typically treat them as disposable and little relationship or trust develops.

Hence employees treat those materials no differently than the pen, chair, computer, or paper they use to ‘do the job’.

A far better technique is to develop an environment that fosters the same kind of passion employees use to guard their own family secrets and treasures.

Humans are the best ‘machine’ for dealing with chaotic, fast-changing stuff. Computers are the best ‘machine’ for dealing with routine, mundane stuff.

My 2 cents worth.

Jack

Rob Lewis September 4, 2009 7:59 PM

I think Marcus takes this one.

Regulatory compliance is bound to tighten the screws even further in the future, so over-entitlement of access privileges in a time of financial and privacy regulation is probably not a great idea.

You may be correct that RBAC is too hard to implement, but Marcus is more correct in saying that the models we are using are pretty much useless. That is why they don’t work.

Dave September 5, 2009 7:49 AM

I think RBAC is also a bit of a backlash reaction towards earlier security models like BLP, Biba, the Orange Book, and so on, which were decried as being too inflexible to be practical: “they don’t fit our requirements”. RBAC seems to have come about as a security model for which it’s pretty much impossible to say “it doesn’t fit our requirements”, because you can adapt it to anything, including things not even dreamt up yet. The downside is that the leap from “yes, RBAC covers it (whatever ‘it’ might actually be)” to “here is how to implement a practical RBAC-based system for your particular problem” has become quite considerable. Although I’m loath to bring up the S-word in this discussion, the situation with string theory does spring irresistibly to mind.

Brad September 5, 2009 9:47 AM

Speed limits are typically set at or below the 50th-percentile speed to generate revenue (as opposed to the correct 85th–90th percentile), so issuing “speeding tickets” suggests that IT security policies are intended for some purpose other than increasing security. Probably not such a good idea.

Roger September 7, 2009 6:54 AM

I have been thinking recently about the “access control” mechanism provided by emailing files around.

Yes, I know that email is not designed for this purpose, and it is in many ways a really awful solution, but it’s interesting to consider why users employ the system. I suspect that it reflects a fundamental brokenness not in the design of how file systems support ACLs, but in the way they are conventionally administered: centralisation of control.

Firstly, consider “DAD” as mentioned above. Alteration and Destruction are largely eliminated because access is granted only to a copy of the document, not to the original. True, confusion can be caused by circulating a modified version, but this can be tidied up by referring to an original. In many ways this is similar to the Wiki model. (As a sort of side-effect this has the feature of creating a sort of Heath-Robinson off-site backup which has the signal advantage of being able to retrieve files without talking to the IT department.)

Disclosure is interesting because it places access control to the document under the transparent, flexible control of the actual information owner — well, sort of. Unfortunately, in the email model you cannot separate access to the document from delegation of the right to distribute the document. Sometimes such delegation is a good thing, sometimes not; it would be good if the two privileges could be separated. Of course strictly speaking it is impossible to prevent a document being copied by someone who is allowed to read it, but a low hurdle is good enough for many purposes.

Requesting access is also simple: the interface combines a search function with a very simple access request! (“Hey Joe, can you send me last month’s financials, please?”)

It’s interesting to compare this to what happens in a typical company. Although a file system can theoretically allow information owners control over their own files’ ACLs, in practice this is usually completely controlled by system administrators on the grounds that it is too hard for mere users. So if a file owner wishes to use the file system to provide access, he or she has to go through the slow, tiresome process of lodging a support request with the IT department — assuming he or she even realises that this is possible, as it generally isn’t advertised.

Often, to ease its own workload the IT department will refuse to do this at a finer granularity than assigning groups to directories, and will get severely pissed off if requests appear more often than a few times per year. In one department for which I worked, not only would we go no finer than groups to directories, but the request had to come from a “section head” who was the nominal owner of a cluster of directories. This meant the poor benighted user had to pester not one but two short-tempered, time-poor bureaucrats for every ACL change. As two of the three players in this game were unlikely to even know what an ACL is, this procedure certainly helped keep IT’s ticket response times down. And don’t get me started on the procedure for editing group membership.

Even worse, it is very difficult for the information owners to determine who has access to their data (a savvy user can see which groups have access, but not who is in the group, and not historic changes to the ACL), and impossible to determine who has actually accessed it. It is possible to log that data, but such detailed logging is rarely turned on, and even if it is, mere users are definitely not provided access to the sacred logs. And whilst it is possible in principle to parse the logs to find out the exact list of who had access at a certain date in the past, this is a major chore even for administrators.

In contrast, with email the actual information owner can assign access to individuals or groups as often or as rarely as he or she wishes, using a transparent, simple-to-understand interface that automatically provides an auditable log the user can view as often as he or she desires. The only missing features are the ability to monitor delegated assignment of access (i.e., forwarding!) and revocation.

I’m not 100% sure where I’m going with this, but I guess the key point is that an effective access control system should leave user authentication and directories to the professionals, but leverage that to put access control in the information owners’ hands, through an interface that is simple, transparent, and easily audited. It could probably be built fairly easily from LDAP and Wikis.

And maybe the standard set of privileges (RWX or RWXD) is sub-optimal, too — too low-level. A better set might use automatic version control instead of a “write” privilege, with access properties of “access” (yes / no), “delegation” (yes / no), and “revocation” (manual / single access / automatic after n days). To simplify the rules for composing groups, users can be manipulated in groups for ease of adding several at once, but groups are not first-class subjects and privileges apply individually. That is, if I add Joe Blow, then add “All Users” of which Joe is a member, and finally remove Joe, then Joe is gone; it doesn’t matter that he was added twice. Ah, it’s late and I am now meandering.
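
Late-night meandering or not, the model is concrete enough to sketch. A minimal Python illustration of those three properties, with groups as mere bulk-add conveniences rather than first-class subjects; all names are invented:

    # Sketch of per-individual "access"/"delegation"/"revocation" grants.
    from datetime import datetime, timedelta

    class Grant:
        def __init__(self, delegation=False, single_access=False,
                     revoke_after_days=None):
            self.delegation = delegation          # may re-share the document
            self.single_access = single_access    # revoked after one read
            self.expiry = (datetime.now() + timedelta(days=revoke_after_days)
                           if revoke_after_days else None)  # None = manual
            self.used = False

    class Document:
        def __init__(self, owner):
            self.owner = owner
            self.acl = {}  # user -> Grant; individuals only, never groups

        def add(self, users, **grant_args):
            # "Groups" are just iterables of users: adding a group and then
            # removing one member removes that member, full stop.
            for u in users:
                self.acl[u] = Grant(**grant_args)

        def can_read(self, user):
            g = self.acl.get(user)
            if g is None or (g.single_access and g.used):
                return False
            if g.expiry and datetime.now() > g.expiry:
                return False
            g.used = True
            return True

    doc = Document(owner="roger")
    doc.add(["joe", "sue"], revoke_after_days=30)
    del doc.acl["joe"]            # Joe is gone, even if added via a group
    print(doc.can_read("sue"))    # True, until the 30-day auto-revocation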

Derek September 14, 2009 6:35 AM

@David

I agree, there are a number of ways it could be made more secure, but the UK government and press made it seem like a single CD was produced against procedure and then lost, when in reality there are hundreds of these things flying around the mail system as standard practice.

Kristin Abele September 22, 2009 10:27 AM

Bruce,

I wanted to congratulate you on the selection of this post for the Carnival of Trust, hosted this month by John Caddell.

The Carnival of Trust is held once a month, compiling the best blog posts dealing with the subject of trust in business, politics and society. We believe your post and comments have sparked an interesting debate within the field of trust. Thanks for a great addition to this month’s Carnival.

Best,
Kristin Abele
http://www.trustedadvisor.com/trustmatters

Anon September 22, 2009 9:36 PM

“There’s a reason RBAC came out of the military …”

Uh, no. Multilevel security came out of the military. RBAC mostly came out of NIST and commercial database makers. The rest of this article makes some good points, but it’s wrong on the history.

john December 16, 2010 9:19 AM

With things like user passwords, or any data that no one else should have access to, it should simply be impossible in a large organization, or when providing a service, for anyone else to get at it.
But if everyone has reason to trust someone in charge of passwords, who would be allowed to know or change them, that would be different.
Certainly auditing can get ridiculous, and some of these leaks are really trivial stuff. The leakers would have gotten in trouble, but if it’s me getting the leak to those maps, or mostly people like me, the risk is really negligible; for the infrastructure the risk would be nominal.
Of course, I presume others have more nefarious intentions. In my view it should at least be all right for it to be public knowledge what major information pipelines are planned, even if we don’t have access to them. But on the other side, I’d also like to have the right as an individual to secretly lay a wire to my friend’s house.
