Getting Security Incentives Right

One of the problems with motivating proper security behavior within an organization is that the incentives are all wrong. It doesn't matter how much management tells employees that security is important, employees know when it really isn't -- when getting the job done cheaply and on schedule is much more important.

It seems to me that his co-workers understand the risks better than he does. They know what the real risks are at work, and that they all revolve around not getting the job done. Those risks are real and tangible, and employees feel them all the time. The risks of not following security procedures are much less real. Maybe the employee will get caught, but probably not. And even if he does get caught, the penalties aren't serious.

Given this accurate risk analysis, any rational employee will regularly circumvent security to get his or her job done. That's what the company rewards, and that's what the company actually wants.

"Fire someone who breaks security procedure, quickly and publicly," I suggested to the presenter. "That'll increase security awareness faster than any of your posters or lectures or newsletters." If the risks are real, people will get it.
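The employee's risk calculus can be made concrete as a toy expected-cost comparison. Every number below is an illustrative assumption, not something from the essay:

```python
# Toy expected-cost model of the employee's choice described above.
# All probabilities and penalty values are illustrative assumptions.

def expected_cost(p_penalty: float, penalty: float) -> float:
    """Expected cost of a choice that carries a probabilistic penalty."""
    return p_penalty * penalty

# Following procedure: the deadline almost certainly slips, and that
# failure is real, tangible, and felt immediately.
follow_procedure = expected_cost(p_penalty=0.9, penalty=100.0)

# Circumventing: small chance of being caught, mild penalty even then.
circumvent = expected_cost(p_penalty=0.05, penalty=50.0)

# The rational employee picks the cheaper option.
best = min(("follow", follow_procedure),
           ("circumvent", circumvent),
           key=lambda choice: choice[1])
print(best)  # ('circumvent', 2.5)
```

Raising the probability and severity of punishment, which is what the "fire someone, quickly and publicly" suggestion does, is just an attempt to flip which side of this comparison is cheaper.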

Similarly, there's supposedly an old Chinese proverb that goes "hang one, warn a thousand." Or, to put it another way, we're really good at risk management. And there's John Byng, whose execution gave rise to Voltaire's line (originally in French): "in this country, it is good to kill an admiral from time to time, in order to encourage the others."

I thought of all this when I read about the new security procedures surrounding the upcoming papal election:

According to the order, which the Vatican made available in English on Monday afternoon, those few who are allowed into the secret vote to act as aides will be required to take an oath of secrecy.

"I will observe absolute and perpetual secrecy with all who are not part of the College of Cardinal electors concerning all matters directly or indirectly related to the ballots cast and their scrutiny for the election of the Supreme Pontiff," the oath reads.

"I declare that I take this oath fully aware that an infraction thereof will make me subject to the penalty of excommunication 'latae sententiae', which is reserved to the Apostolic See," it continues.

Excommunication is like being fired, only it lasts for eternity.

I'm not optimistic about the College of Cardinals being able to maintain absolute secrecy during the election, because electronic devices have become so small, and electronic communications so ubiquitous. Unless someone wins on one of the first ballots -- a 2/3 majority is required to elect the next pope, so if the various factions entrench they could be at it for a while -- there are going to be leaks. Perhaps accidental, perhaps strategic: these cardinals are fallible men, after all.

Posted on March 4, 2013 at 6:38 AM • 42 Comments

Comments

tz • March 4, 2013 7:09 AM

The one thing to take care of is that the firing needs to be consistent. If the person fired for a security violation is merely politically disfavored, but the pet assistant is playing Flash and Java games, it will only cause fear and confusion.

We're even better at rationalizing double standards.

Dave Walker • March 4, 2013 7:12 AM

Along with Chinese proverbs, one of the best-known examples in the West comes from Voltaire's "Candide", and the execution of an Admiral for failing to engage an enemy fleet, "pour encourager les autres" ("to encourage the others").

The promised penalty of excommunication held over a Cardinal discovered disclosing information about the Papal Conclave is an interesting one, in that it is only an effective penalty if there is assurance that all the Cardinals actually believe in the relevant flavour of deity -- and that's not the easiest thing to assess...

Arashi No Moui • March 4, 2013 7:18 AM

Tz,

And that's the reason why there needs to be a graduated discipline scale, that has defined rules and reasons. Hell, I've had to write myself up cause I screwed up (Didn't document something that I did do, but without the verification there was no evidence that I followed procedures).

But all of this requires management buy-in and support to institutionalize, and that's probably the hardest thing to get, especially when it comes up against profits.

Nobody • March 4, 2013 7:30 AM

"Excommunication is like being fired, only it lasts for eternity."

It is also entirely imaginary, and they, above all, know that. I am sure that even in the worst days of Rome, the ones who knew best that the horrible stories were lies were the leading cartels.


... Firing people for not following security rules so they can come through on deadlines? ...

I disagree that is the right way to go.

If you want perfect security and no usability -- Hitler. Stalin. Pol Pot. If you want strong security and usability.... far from that mindset.

Yes, "killing one warns thousands"... but there are better, more enlightened ways to operate.


The Dark Ages were full of that sort of mindset.

Today, the goal is to provide security and to do it invisibly.

Shoot for the stars, you might hit something.


Nobody • March 4, 2013 7:38 AM

"the pet assistant is playing Flash and Java games, it will only cause fear and confusion.

We're even better at rationalizing double standards."

Java and Flash games? They should have their hands chopped off!

I think that people should consider the pristine model of security the Middle East has -- countries like Kuwait, Saudi Arabia. A thief steals, he gets his hands cut off.

A drunkard finds some booze, they go to jail. Adulterous women get stoned. That kind of thing.

Just to be clear: security practitioners are free from error. They enforce these rules, that is why.

...

Best way to ensure you are not a hypocrite is to be merciful.

Security can be done without firing people, cutting off hands, x-raying their bodies, forcing them to remember twenty different forty-letter passwords...


If you have a mind, you should have an imagination, so use it. If you are in security and have no heart, you really are just a criminal hiding out and do not belong in the business.

Where best can sociopaths go to hide? In security.

Lollardfish • March 4, 2013 7:43 AM

As a professional medievalist, I can say that while excommunication is in theory eternal, it is in practice merely a negotiating tactic. Medieval people entered strategic states of excommunication whenever it was useful, did terrible deeds, then re-entered the arms of the church by making public (and expensive) acts of contrition.

I do not believe that the Cardinals can be sequestered from information flowing in, though perhaps from information flowing out. But once the doors close, some of the Cardinals will turn on their phones and check messages.

Father • March 4, 2013 7:54 AM

Excommunication is not even in theory eternal, in that it is not a dispensation but a recognition, that is, the Church does not so much excommunicate you as it does formally recognize that you have removed yourself from communion with it. The writ of excommunication outlines the ways in which one has strayed and lists those measures which need to be taken in order to return to full communion with the Church.

It is, then, less like firing than a situation in which your boss calls you up and says that he notices that you haven't been coming to work, that you haven't picked up your paycheck in ages, and that he wishes you would come back.

(Your points absolutely stand; I just wanted to correct misconceptions in the comments.)

Dave • March 4, 2013 8:04 AM

Step 1: Fire someone for bypassing security to meet a deadline, as an example to the rest.

Step 2: Wonder why so many deadlines are being missed.

Simon Moffatt • March 4, 2013 8:05 AM

Security incentives are a major issue when it comes to secure implementation of anything. Business owners and end users will always see security as inhibiting what they are employed to do: namely implement processes that generate revenue (or in the above example, elect a new Pope). If anything gets in the way of those goals it will become difficult to implement. What is the alternative? The stick approach mentioned above is draconian, but can work to an extent. An alternative is making security so implicit and embedded, that the end user doesn't know they're being proactively secure. They just are through design. This is complex, certainly for existing processes, but for new products or processes, it should be an implicit goal.

JH • March 4, 2013 8:09 AM

I think there may be cause for optimism if you compare information security to workplace safety. Many of the same trade-offs mentioned above are involved, but (at least in the US) safety is taken very seriously and the workplace has gotten much safer (http://www.cdc.gov/mmwr/preview/mmwrhtml/mm4822a1.htm). This seems to be an example by analogy that it is at least possible to manage the trade-offs reasonably well.

Grumpy • March 4, 2013 8:14 AM

"Dans ce pays-ci, il est bon de tuer de temps en temps un amiral pour encourager les autres" - Voltaire

So yes, hanging one to show that you mean business is one way of doing it. However, should a boss of mine ever institute such a policy without taking the time cost of security into consideration when handing out workloads, I will judge him deeply dysfunctional and fire him on the spot. I will no longer be following this blog -- Bruce has clearly demonstrated a lack of empathy and an attitude towards security placed to the far right of Genghis Khan that I cannot condone in any way.

John Campbell • March 4, 2013 8:32 AM

The real problem I see with this whole discussion is that COMPLIANCE with security policy is "proving a negative", and, as much as a widely published punishment has some effects, where are the rewards?

I have noticed, by the way, that security work tends to be under-funded (when funded at all!) and can often be triaged because, face it, as much as orders often float down from the adminisphere like a blizzard, seldom will executives put their money where their mouth is.

Keeping a crap job where you seldom feel rewarded-- much less valued-- isn't much of an incentive when the goals (usually a drop-dead date) are set (or pressured) by someone above you on the food-chain who doesn't LISTEN to the risks.

Top-down management only works with unimpaired bottom-up feedback, but, in a perverse twist and violation of information theory, managers only want to hear that the results of their orders "match expectations".

There are a LOT of corporations that excel at lying to themselves because their executives have proven that they cannot handle the truth... or that they have met the enemy and it is themselves.

Humans ALWAYS have a problem applying rules non-capriciously, so any "punishment"-based regime will NOT incentivize "right" behaviors; humans are far too perverse for that.

To be frank, as f**ked up as my current employer may be, they are STILL miles ahead of many of my previous employers.

Remember, no matter how good your system of incentives and disincentives is, it will still be operated by human beings.

And we all know that human beings are made of meat ( http://www.eastoftheweb.com/short-stories/UBooks/... ).

Rookie • March 4, 2013 8:38 AM

@Grumpy

So you're in effect "firing" Bruce because of one mistake that he made, without taking into consideration other information he has that might be valuable to you?

...just checking.

misu • March 4, 2013 9:38 AM

Corporate culture (the unwritten stories that spread ideas about appropriate behaviour through the group) is primarily a top-down mechanism: it's not what those at the top say, it's how they act and how they demonstrate what they deem valuable. So for the firing to be most effective it should start at the top: when a director takes a laptop home and returns it full of viruses (a story I hear all too often), they should be the first victim of Bruce's policy. That not only sends a deep message to everyone below them, but informs everyone else with the power to implement decisions that they need to be security conscious.

Since we're quoting Chinese proverbs, Sun Tzu and the king's concubines is appropriate.

Dave M • March 4, 2013 9:51 AM

I think it will be very difficult to provide positive reinforcement for correct behavior in regard to security. How do you "catch the employee doing something right" when almost all of the security rules are phrased as "Thou shalt not..."?

somebody • March 4, 2013 10:13 AM

These comments seem to revolve around how the boss is sending the wrong message about security and how to send a better message. No one has addressed that maybe the boss is right.

The first step should be to convince the boss that security does matter more than deadlines. If you can't do that then maybe security matters less than you think.

boog • March 4, 2013 10:18 AM

@Grumpy

I will no longer be following this blog - Bruce has clearly demonstrated a lack of empathy...
Aroo?

1. He wrote the quoted article 3 and a half years ago. Now you hear about it and are bailing? How do you feel about wasting the last 3 and a half years?

2. You missed the point. That article was about security policymakers not understanding people. I don't think he's really advocating firing people for circumventing security, but rather suggesting that somebody who yammers on about lack of security awareness being such a huge risk should institute such a policy if they really believe what they're babbling about. Except they don't. Because most people are capable of understanding risk in most cases, particularly when they are circumventing security to get their jobs done.

It helps to read complete articles, not just small bits of dialogue.

Andrew2 • March 4, 2013 10:43 AM

So long as the policy itself is not draconian, strict enforcement is possible.

One of the biggest problems with security is that people just do not believe that the policies put into place are truly necessary. If you expect employees to comply with a policy, it has to be justified. And "I'm an expert, trust me, don't do it" is not justification.

Security professionals tend to get nervous at the idea, but perhaps the best way to convince people is to share the reasoning, especially the threat model, behind the policy. It isn't enough to say, for example, "no Java, or you're fired" even if you make examples of people. You have to sell "no Java" as an effective and necessary policy first, otherwise people will pay attention just long enough to avoid getting fired then break policy in some other way that you do not (or cannot) enforce.

I have seen this tried before, but often the justification is too heavy handed: "Hackers and viruses and phishing, oh my!" Scaring people into compliance doesn't work, at least not for long. People will comply with a policy much better if you just tell them like adults that not all threats are known and therefore the policy needs to be better than the best known attack expected against their system.

Alex W • March 4, 2013 11:12 AM

@misu hit the nail on the head - upper management and executives were the first to get shiny mobile devices on the company email infrastructure, despite absolute lack of security features in those devices; they're the first ones to get exceptions to complex password policies because their time is "too valuable" to enter a long complex password; they're the ones getting access to online cloud storage services to sync their financial spreadsheets, because their shiny new mobile devices lack the VPN capability; yet they're the last to be reprimanded for violating security policies.

Lollardfish • March 4, 2013 12:26 PM

@Father - I don't know anything about post-Tridentine practices or theories of excommunication. I would argue, however, that the medieval rhetoric and law of excommunication is far more active than what you describe. It's a wielding of the spiritual sword to cut one off from the benefits of the sacraments, at least as I read Gratian. Not that this matters to Bruce's point. But it matters to me!

KEithB • March 4, 2013 2:12 PM

I am not sure if this was the John Byng affair, but in _The First Salute_, Tuchman mentions that one of the things that weighed on the British Navy during the Revolutionary War was that there were *two* courts-martial at the time: one that punished someone for *not* obeying orders, and another that punished a captain for *obeying* orders when a deviation would have meant victory.

Lollardfish • March 4, 2013 2:14 PM

@Joab - Before the 17th century, excommunication was wielded pretty explicitly as a penalty, not a "formal recognition that you have removed yourself from communion with it," as Father describes, above.

I am no expert in contemporary practice or theory, though.

It's quite likely that Father and I come at this from different angles in terms of what we view as the authoritative finding on the essence of excommunication; it just usually doesn't matter on Bruce's blog.

It's been a fun few weeks to be a medievalist though, thanks to Pope Benedict.

paul • March 4, 2013 3:04 PM

From this discussion, it seems that the 0th step -- establishing a trustworthy management structure, where employees would have reason to believe that a security policy is actually about security rather than about providing another way to punish those managers don't like -- is a prerequisite to the rest.

aaaa • March 4, 2013 4:11 PM

Let's see if I get it straight:

Problem: management prefers deadlines over security. Punishment for not meeting a deadline is worse than the one for not following security. Employees circumvent security in order to get the job done in the required time.

Solution: make security punishment higher and fire some people.

What will happen: employees will follow security, but they will not be able to meet deadlines. Productivity drops.

Problem 2: productivity dropped.

Solution 2: make punishment for not meeting deadline higher and fire some people.

Problem 3: Employees circumvent security in order to get the job done in required time.

Solution 3: make security punishment even higher and fire some people.

Problem 4: ... and so on...

Repeat until everyone cheats on both deadlines and security. Right now, I do not want to work in a company managed by you.

Alternatively, management could require reasonable deadlines that would make following the rules possible.

Clive Robinson • March 4, 2013 4:53 PM

@ aaaa,

Let's see if I get it straight

No you have not...

Because you have forgotten two basic principles of senior short-term-thinking management:

1, Delegation.
2, Be dishonest in delegation recording.

As a senior manager you don't take chances if you wish to survive. Thus when a risk you cannot pass on to other senior managers hits your desk, you simply delegate the risk.

Now, to ensure you only ever get the reward of the risk and not the punishment, you are at best ambiguous in the way you record the delegation of the risk (if you are daft enough to record it in the first place).

Why do you think we have the "rogue agent" defence being so prevalent in the banking and other finance industries, as well as Government Departments and other Civil authorities such as LEAs?

The whole argument about "Security Incentives" is pointlessly "philosophical" in nature and usually starts with a statement such as "Hypothetically speaking..." or "In theory...".

The simple fact is that the way up the management greasy pole is by sticking a knife in somebody's back and using it as the rung on the ladder. The higher you get, the easier it is to stick the knife in.

The trick at the bottom of the pole, in the very early stages of your career, is recognising this simple fact and "wallpapering your A55" where you can, and only taking big-gain risks early on to jump up the ladder sufficiently high that you have "knife shoving privileges", so you no longer have to take a risk's punishment, only the reward.

The best way to do this at the top of the pole is to jump ship before it sinks. As I've pointed out in the past, if the project succeeds in your absence, you claim to others it was because of your brilliant leadership in the early stages that laid the foundations of success. If it fails, you simply say that, due to the failings of those that followed you, your early brilliant leadership that had started to show success had been ruined.

With that skill mastered you only need to master the skill of knowing when to jump ship. As a strategy it can never be a failure as long as there is another job to jump to. Eventually you do run out of upward places to jump to, but usually, if you have played it right, you are now at the top of an organisation that is "too big to fail" and have thus become a "Master of the Universe"...

Knowing these two basic ideas about the management progression cycle tells you just about everything you need to know about large corporates, especially the global ones, where they can legally hide any audit trails beyond any jurisdictional enquiries on the "No evidence, no crime" principle of tax evasion and drug money laundering.

Jim Moore • March 4, 2013 5:20 PM

I think that incentives aren't the issue as much as the various forms of communication. Leaders lead. When we had an external firm conduct a security posture assessment, and everyone saw that the CFO was spending money to determine where we stood, our security posture improved for a few years -- until everyone saw that the CFO was no longer regularly spending money on measuring our security posture. People do follow the lead.
If the lead is weak, or leads in the direction opposite of the "official" standards, then of course people have no incentive. Publicly firing someone, if the majority of leadership is communicating that security is irrelevant, will only make people nervous and decrease productivity.

Godel • March 4, 2013 7:26 PM

A superficial reading of the John Byng reference suggests he was scapegoated.

I'm sure this plays a large part in how security management plays out in the real world, but I'm not sure it's the message you're trying to convey.

gollum • March 5, 2013 5:18 AM

My two pence.
Reasonable policies, protecting both the employees and the firm, will be followed naturally.
If you have to resort to threats, something in the policies is wrong.

Nick P • March 5, 2013 12:02 PM

@ gollum

"Reasonable policies, protecting both the employees and the firm, will be followed naturally.
If you have to resort to threats, something in the policies is wrong."

You use the word "naturally" as if following orders is human nature. It's not. People often submit to authority to various degrees. However, there's a strong amount of rebellion, laziness and selfishness in the human race. There's even some of this in people who rarely manifest it. Left to their own devices, people will do whatever they prefer to do in a given situation, even if a recommendation exists.

Hence, policy and law use negative incentives to get people to act appropriately.

Abe • March 5, 2013 2:24 PM

This is one of the questions I've been grappling with lately. Is the risk calculation for the individual different from that of the organization they work for?

The way I currently see it, the individual and the organization both generally (not always) want to maximize the revenue (the goal) the organization makes, since in theory that means more compensation (the reward) for the individual (salary, bonus, etc.) and more compensation for the organization (profits, etc.).

However, the risk of violating security to obtain a greater maximization of revenue differs when looked at for the individual versus the organization.

The individual doesn't usually fear termination (punishment), as the likelihood of that is low even if an event happens. The organization's impact from an event can be much higher (fines, reputation risks, profit loss, etc.), and the perceived likelihood of that event may be higher (hackers, disgruntled employees, etc.).

Since the individual's risk (impact + likelihood) isn't at the same level as the organization's, the organization will usually want to protect security more; but since individuals make up the organization, their appetite for risk-taking will always be greater than the organization's as a whole, because their calculation will always favor taking more risks.

I'm personally not sure there is a sensible solution to align the two competing interests (organization and individual).

Standard caveat that human nature adds a level of unpredictability to the equation.
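Abe's asymmetry can be sketched numerically using the conventional likelihood × impact product. This is a toy model; every number in it is an assumption introduced purely for illustration:

```python
# Toy model of the individual-vs-organization risk asymmetry described
# above. All probabilities and impact figures are illustrative assumptions.

def expected_loss(likelihood: float, impact: float) -> float:
    """Conventional risk estimate: likelihood times impact."""
    return likelihood * impact

p_incident = 0.02  # chance a security shortcut leads to an incident

# Individual: even given an incident, being fired is unlikely, and the
# personal impact is bounded (roughly one salary).
p_fired_given_incident = 0.1
individual = expected_loss(p_incident * p_fired_given_incident, impact=80_000)

# Organization: fines, reputation damage, lost profit.
organization = expected_loss(p_incident, impact=5_000_000)

print(f"individual={individual:.0f} organization={organization:.0f}")
```

The same shortcut looks cheap to the person taking it and expensive to the firm, which is exactly why the two risk appetites diverge.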

Dave M • March 5, 2013 2:38 PM

All I caught in any of this are disincentives for breaking the rules. Is that the best we can do? They're really not the same as incentives.

Eric B • March 5, 2013 5:15 PM

Funny you mention the papal elections... apparently it's possible to break into them... although this cardinal impostor didn't try very hard.

http://gma.yahoo.com/...

Maybe crucify him, that'll get someone's attention.

itgrrl • March 5, 2013 8:06 PM

@Abe: I agree that the trick is to align the interests of the individual with the interests of the organisation. Most of the devs and support staff I know would dearly love to be given the time to bake-in security to the products they produce and support. I don't think their interests need realigning. In the long term, of course, it is also in the company's (reputational) interests to produce secure products. But these interests often take a back seat to the more pressing interests of short-term profitability. So this is where we need some 'correction' in interest alignment.

It seems to me that the 'lever of alignment' needs to be applied to the organisation as a whole in the form of regulatory pressure -- make the penalties for privacy and/or data breaches enormous, and you create a more immediate financial incentive for companies to produce (more-)secure products. Businesses follow the money pressure. If the risk of enormous financial penalties is large enough, it will outweigh the incentive of short-term gains at the expense of security.

Of course, all this is easy to say, but much harder to implement (and get the balance right)... especially when software/hardware dev is carried out on an uneven global playing field where other governments are not prepared to implement similar policies, and no government wants to introduce measures that are seen as likely to make their economy less competitive.

gollum • March 6, 2013 2:17 AM

@Nick P.
Negative incentives are good for willful infringements, of course, but highly ineffective for stress decisions and/or mistakes. We cannot rely _mainly_ on them, since there will be more human error (maybe provoked by bad management or policies) than willful infringements, at least in "normal" firms. Besides, negative incentives do not stop active attackers (intruders), and they disincentivize incident reports from honest people. Am I wrong?

aaaa • March 6, 2013 6:46 AM

@itgrrl I mostly agree with you, but what about making those punishments for companies appropriate -- proportional to probable harm and level of wrongdoing -- instead of enormous?

Enormous, disproportionate punishments for companies do as much good as enormous, disproportionate punishments for individuals. That is, none at all.

Clive Robinson • March 6, 2013 7:03 AM

@ NZ,

Just sell your soul and remove your morals and jump right in and give it a try...

BUT remember there are some risks that you don't want to be close to,

http://chicagoist.com/2013/03/04/...

It appears that just "selling" is not enough for some in the security sector, they want it all and they are happy to "tip the wink" in the right direction, picking up expenses etc and other sharp practices that many would say were not far from brown envelopes...

The real question is whether they will actually get convictions of the real perps, not those stand-in "rogue agents" taking their first steps up the greasy pole...

Richard Arnold • March 13, 2013 7:46 AM

Naive in the ways of business. Actually, naive in the ways of any operations.

Business P&Ps, including security, have to be flexible.

Anybody who's worked in the intel community before/after a crisis (such as 9/11) knows better. When folks don't want to work together, they leverage all the P&Ps as reasons to stall. When folks decide they absolutely have to work together, P&P restriction decisions are treated as risks capable of mitigation.

A hard and fast "fire the bum" rule or recommendation is silly and naive, and has the tunnel vision only a cybersecurity professional could muster. Real-time risk management is about overall corporate risk, and sometimes you forgive and even applaud rule-breaking when, in the overall scheme of things, the loss in the area of cybersecurity is outweighed by other benefits.

J Rodman • March 15, 2013 11:36 AM

The amusing part is that the people who understand and care about security in the products of many organizations must regularly circumvent security rules.

The issue is that the security behavior rules for resource use are often created by a clueless IT department, while developers who care about limiting security defects have to regularly circumvent these stupid "security" rules to do their jobs.

Examples are endless, but the most obvious are things like "run antivirus software at all times on your development machine" when the antivirus software has bugs that break the product.

Hamad alamri • March 24, 2013 3:27 PM

Of course security awareness is an essential part of any security program within an organization. I think that the highest management level in any organization must make it clear to all employees that violating a security procedure is as serious as violating any other sensitive procedure in the corporation. Staff should understand that they are subject to serious punishment if they don't follow the security procedures. Moreover, organizations must motivate employees correctly to encourage proper security behavior. Getting the job done at greater cost while following the security procedures is more important than getting it done cheaply without following the security rules.


Photo of Bruce Schneier by Per Ervland.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.