Security and Human Behavior (SHB 2014)

I'm at SHB 2014: the Seventh Annual Interdisciplinary Workshop on Security and Human Behavior. This is a small invitational gathering of people studying various aspects of the human side of security. The fifty people in the room include psychologists, computer security researchers, sociologists, behavioral economists, philosophers, political scientists, lawyers, anthropologists, business school professors, neuroscientists, and a smattering of others. It's not just an interdisciplinary event; most of the people here are individually interdisciplinary.

I call this the most intellectually stimulating two days of my year. The goal is discussion amongst the group. We do that by putting everyone on panels, but only letting each person talk for 5-7 minutes. The rest of the 90-minute panel is left for discussion.

The conference is organized by Alessandro Acquisti, Ross Anderson, and me. This year we're at Cambridge University, in the UK.

The conference website contains a schedule and a list of participants, which includes links to writings by each of them. Ross Anderson is liveblogging the event. It's also being recorded; I'll post the link when it goes live.

Here are my posts on the first, second, third, fourth, fifth, and sixth SHB workshops. Follow those links to find summaries, papers, and audio recordings of the workshops. It's hard to believe we've been doing this for seven years.

Posted on June 9, 2014 at 4:50 AM • 17 Comments


CallMeLateForSupper • June 9, 2014 6:56 AM

"Bruce Schneier, Harvard Law School: Our Security Models with Never Work – No Matter What We Do"

Surely a typo; a security model with Never Work would be counterproductive. ;-)

Although I winced at references - both explicit and implied - to psychology, I am nevertheless interested in a number of the topics. Would like to be a fly on the wall. Years ago I lived just twenty minutes from Cambridge, but I was an indestructible and poor student with no interest in security. Today I live a good eight hours from Bean Town. (sigh) I must "wait for the book".

kashmarek • June 9, 2014 4:28 PM

Typo or not, now we know what kind of posts are avoided on this Weblog.

Lorin Ricker • June 9, 2014 4:52 PM

SHB -- Sounds fascinating. Of course, if the conference succeeds in unraveling and/or explaining the "human behavior" of NSA spooks and politicians, be sure to top-line it for the rest of us.

CallMeLateForSupper • June 9, 2014 8:28 PM

I found "It's All About The Benjamins: An empirical study on incentivizing users to ignore security advice", Nicolas Christin et al, to be very interesting. Apparently there are persons who will not put at risk their 'puter and everything stored on it for less than 50 cents but will readily do exactly that for 50 cents or a dollar. Go figure.

I'm going to slip a copy of this paper to a family member who insists that she both cares about her own security and doesn't take unnecessary risks with it... while doing all her banking, email, and bare-back SMS with a cell phone. A cell phone full of nifty "apps".

Coyne Tibbets • June 10, 2014 12:18 AM

Just an anecdotal story on security behavior from earlier in my experience.

I was administrator of a system that offered two security access rights: Administrator (A) and privileged-program (P).

The P right did not allow the user to directly maintain system rights, but personal experience had demonstrated the access it did allow permitted a user with that right to use the debugger to "promote" themselves to A right. (By bit-twiddling; I had worked that out on my own by inspecting code with that self-same debugger.)

One of my users, whom we'll call "Joe", had been given P right, but not A right. Joe persisted in leaving his session logged on while he walked away from it. I objected to that practice on the grounds that the P right could be used to promote oneself to A right. His response, paraphrased, was, "Yeah, but no one [but you] knows how to do that." (Security through obscurity.)

In this case, I solved the problem not by withdrawing the P right, but by adding the A right to Joe's user. With that right in place, even he recognized the concern and responded by treating his session with much more care.

I went further and made that policy: any user with the P right must also have the A right. The risk was little if any greater, since P right implied A right.

Wael • June 10, 2014 1:20 AM

@Coyne Tibbets,

Just an anecdotal story...
Amazing! Security through reverse psychology :) Or is it just psychology? Getting sleepy now, been up a while. Better start counting the goats :)

Bruce Schneier • June 10, 2014 8:29 AM

"Years ago I lived just twenty minutes from Cambridge, but I was an indestructible and poor student with no interest in security."

I am in Cambridge, UK. I should have made that clearer.

Wael • June 10, 2014 8:48 AM

@Bruce Schneier -- Agent 0011 (or prime number of your choice),

I am in Cambridge, UK. I should have made that clearer.
Cambridge, UK? Oh-oh... MI5?
Was reading ...
I like the last paragraph:

Discretion is vital: As with all parts of the recruitment process, you should be discrete about what you tell other people. This includes how much information you share on social networking sites - for instance don't update your status with 'applying to MI5'!
US counterparts should make that clearer too. Michael Hayden didn't learn the lesson :)

moo • June 10, 2014 5:17 PM


Of course they should be discrete, but being discreet about it might be even more important!

Coyne Tibbets • June 11, 2014 9:16 PM


I guess it's some kind of psychology. But I think the big thing is that making Joe's user an "A" user eliminated the security through obscurity argument: Joe could no longer argue that "no one knew" how to abuse the authority.

Wael • June 11, 2014 10:57 PM

@Coyne Tibbets,
Also true. But don't you think Joe could game the system as well, if he needed to elevate his privilege? Your strategy, in a way, assumes Joe has no malicious intent.

Coyne Tibbets • June 14, 2014 2:10 PM


In this case, if Joe had had malicious intent, he wouldn't have been initially given P privilege; A privilege therefore did not increase the risk. The concern wasn't malicious intent on his part, it was careless disregard.

AC2 • June 17, 2014 5:07 AM

There are a few WTF moments for me on Ross' liveblog which make me wonder...

"For example, he was once hired to try to understand why any honest person would want a prepaid mobile phone"

"John Lyle fights spam for

RGP Security • October 7, 2014 11:21 PM


Is Eavesdropping Necessary?

Human factors are strongly at play in signals intelligence collection. One could say, perhaps justifiably, that the motivations of the collectors will often determine the success or failure of an intelligence effort. But to talk in such simplistic terms does not do justice to the complex and murky human reality. From a professional viewpoint, the collectors and technicians should care if their work succeeds or if their efforts mean anything beneficial to their country. One presumes that they do care, but this is not always the case. Promotions, careers, posh assignments, foreign travel, expense accounts, self-interest of every sort, all contribute to whether anything will be said in the face of senseless activities or plain illegality. The self-interest of the people in the organization is not always in the interests of the country.

Many think of eavesdropping as fundamentally corrupt and dishonorable. Its current manifestation in the United States has been likened to a self-licking ice cream cone that just gets bigger and bigger. Constant eavesdropping, even against friends, is saying that no one is a real friend, and that other people do not matter. It is the practice of denying people their dignity and their right to express themselves freely, and it is now big business. If eavesdropping activities are legal, then this is bad enough, but if the activities are illegal, then a culture of fear and silence breeds within the eavesdropping organization because someone may someday be held accountable. But most of this can be subsumed under an easy-going, and swollen, self-interest. Self-interest and denial are seen at every level of the organization.

There is a fundamental division here: eavesdroppers have to trust each other to do their jobs well, but the longer they stay in that line of work, the less they trust anyone (including each other) as denial increases. A robust inward-looking counter-intelligence operation also shows that no one is absolutely trusted in the organization. People inside the eavesdropping organization end up divided off from others even though they can tell themselves they are doing the right thing in the world, even when the deaths of innocent people are involved. In the act of eavesdropping, the disregard for the rights of others only begins. Those others can be people one works with, people down the street, or someone on the other side of the earth.

These fractures are indicative that something is rotten: lying to legislators, not trusting people in your own organization, appealing to greed, and denying responsibility.

Take the example of another very similar swollen bureaucracy: the U.S. Army. At some point after 9-11 the U.S. Army noticed that soldiers with high scores on standardized foreign language proficiency tests often did not want to stay in the military. Those soldiers, cryptologic linguists, had usually learned a foreign language at the prestigious DLI in Monterey, California. Those with lower scores were more often eager to stay in the military. Those with higher scores were looking for the exits despite generous bonuses. Leaders scratched their heads.

What the leaders never figured out, or never admitted, was that the Army most emphatically did not want to have highly skilled soldiers. The Army just did not care. Having highly skilled soldiers had nothing to do with the boss being promoted. Pure obedience would trump job skill, and the former required less effort from all concerned. Appearances would suffice. The analogy here is that a large eavesdropping organization (also hierarchical, also given to obedience, and also militarized) does not necessarily care if it is doing anything truly effective. It just needs to look as if the world depends on it. Also, smart people can be a threat: look at Snowden. The organization wants to expand and function without hindrance and without anyone voicing other opinions.

Budget growth is not hindered by the high turnover of employees, dumb policies, or pursuing senseless activities like spying on people who play Angry Birds (even though professionalization is reduced). It is like promotion in the Army not being linked to job skill. It does not matter who gets promoted, or how hollow things get, as long as appearances are kept. It used to matter, but the Army did away with those tests because a perception existed that soldiers from certain racial groups would be disadvantaged during test-taking. In other words, like finding a kind of camouflage to wear that is visible at all times in all places (the infamous DCU), the Army created a kind of reverse Darwinism: the selection of the unfittest. White trash never had it so good.

Eavesdropping at the national level works in the same way. It does not have to make sense, be efficient, or be done by anyone special. It just has to spend money and stay secret. One hesitates to consider the track record and the amount of money spent. 9-11 was not foreseen. The Russian invasion of Ukraine was missed. The war in Iraq was not won. The war in Afghanistan is the longest one in American history. The invasion of Libya was, and is, a fiasco. The Tsarnaev brothers had red flags all over them, but they were not stopped in Boston. Terrorism is not declining. Is anyone responsible? Has eavesdropping done anything effective to make the world safer? Is it necessary?

