Interview: Bruce Schneier

BT Counterpane's founder and chief technology officer talks to SA Mathieson at Infosecurity Europe

Bruce Schneier packed out the show’s keynote theatre when he spoke about ‘The Psychology of Security’, based on a draft essay he published in February. He outlined a range of research suggesting that our perception of a given risk is heightened if the risk is – among other things – spectacular, widely discussed, outside our normal experience or beyond our control rather than willingly taken. Such biases were ideal for hunter-gatherers living in small family groups in Kenya in 100,000 BC, he argues, but not for modern life.

So how does this apply to infosecurity risks? “The obvious place is the people who are afraid of cyber-terrorism, while minimising cyber-crime,” he says. “Cyber-terrorism gets the news, it’s the hot topic, it’s the scary topic and people are afraid of it. Cyber-crime doesn’t get as much news, and I think people very much underplay that threat. You see it also when people overplay the threat of peer-to-peer, or they get all scared of people bringing their iPods in and maybe putting data on them. They forget that data could walk out on paper. So there are a lot of people reacting to the news, instead of to the reality of security. Now, it’s hard to blame them. This is what’s reported, this is what people worry about, but I think there’s a big difference between how people perceive internet security and what’s really going on.

“I’ve always said that I think the industry spends about the right amount of money on internet security; it’s just spent really, really badly, and that’s because people are missing what the threats are.”

What areas do infosecurity professionals underspend on? “I think they underspend on the risks of financial fraud, and I think they pretty much ignore reputational risks. We find a lot of examples in the United States where large data thefts result in a measurable change in your stock price – and it’s not a good change. I think companies really don’t even think about those sorts of risks. These fall into the category of very rare but very devastating attacks, and it’s hard to deal with those adequately, because your normal insurance model of average loss expectancy doesn’t work very well.”
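Schneier’s point about average loss expectancy can be made concrete. As a rough, hypothetical illustration – the figures below are invented, not from the interview – the standard annualised loss expectancy (ALE) calculation can make a rare, catastrophic risk look comparable to a routine one, even though the two behave nothing alike:

```python
# Hypothetical figures, invented for illustration -- not from the interview.
# Annualised loss expectancy: ALE = single loss expectancy (SLE)
# multiplied by the annual rate of occurrence (ARO).

def ale(single_loss_expectancy: float, annual_rate: float) -> float:
    """Average loss per year under the standard insurance-style model."""
    return single_loss_expectancy * annual_rate

# A routine risk: malware clean-ups, roughly 12 incidents a year at £5,000 each.
routine = ale(5_000, 12)                 # £60,000 a year, easy to budget for

# A rare, devastating risk: a major data theft, roughly once in 50 years, £20m.
catastrophic = ale(20_000_000, 1 / 50)   # £400,000 a year "on average"

print(f"routine ALE:      £{routine:,.0f}")
print(f"catastrophic ALE: £{catastrophic:,.0f}")
```

The two averages sit within an order of magnitude of each other, but the second risk arrives as a single £20m blow – plus the share-price damage Schneier mentions – rather than fifty years of small, steady losses, which is why the average-loss model copes so badly with it.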

“On the other hand, some things are doing very well. If you have a decent anti-virus program, you’re doing phenomenally. If you keep your patches up to date, if you pay attention, if you’ve got some good services for dealing with the threat of the day, you’re likely to emerge pretty unscathed. Now, you might have to pull some overtime here and there, but you know, that’s part of the job. So that kind of stuff I think we largely have well in hand. You have two sets of companies: the companies that get it, who are investing in these security measures, and the companies that don’t, who aren’t, and they just get whacked.

“I think companies underestimate the severity of the insider threat: they’re mostly concerned about attacks from the outside and downplay the threats from the inside. But this is true all over humanity. In the United States, most kidnappings are carried out by relatives, yet we’re afraid of the stranger sneaking into our child’s bedroom. Most credit card fraud is committed by someone who lives in the same house as you. You are more likely to be killed violently by someone you know than by a stranger, yet in our heads it’s exactly the reverse. We fear the unknown, and on your computer, you’re most likely to be hacked by someone in your company, not outside your company. Now this is hard – it’s much easier to build a wall to keep the bad guys out. If the bad guys are already inside, if you’ve hired them, it’s much harder. One of the best things we do at Counterpane is catch insiders, because no-one else does, and it’s very satisfying when we do.”

One of the flaws in our judgement of risk, according to Schneier’s essay, is that we prefer a sure gain of £1 to a 50% chance of £2 – or even £4 – yet we prefer to gamble when it comes to losing money. “In general, what psychological research shows is that people are risk-averse when it comes to gains, and risk-seeking when it comes to losses,” he says. “You see this in IT when companies are ignoring these extremely low probability, high damage events. They are risking a large loss, because in their heads that’s a better deal than spending the money to mitigate it, even if financially the math works out the other way. That’s just the cognitive bias we have as people.”
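A hypothetical worked example – all figures invented, not from the interview – shows how that bias plays out in an IT budget decision:

```python
# Hypothetical figures, invented for illustration -- not from the interview.
mitigation_cost = 100_000      # certain loss: pay for the control up front
breach_probability = 0.05      # assumed chance of the breach in a given year
breach_loss = 4_000_000        # assumed loss if the breach occurs

expected_loss = breach_probability * breach_loss   # 0.05 * £4m = £200,000

print(f"certain cost of mitigating: £{mitigation_cost:,}")
print(f"expected loss if gambling:  £{expected_loss:,.0f}")
```

Rationally, the certain £100,000 beats an expected £200,000 loss. But because people are risk-seeking over losses, the certain outlay feels worse than the gamble – exactly the case where, in Schneier’s words, the math works out the other way.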

There are others: “Optimism bias is ‘it won’t happen to me’. So you open the paper and read about this company that got hacked, and there’s all this damage; you’re the CEO and you say, ‘ha ha, it happened to that guy, it won’t happen to me, I won’t worry about it’. The smart CEO looks and says, ‘wow, that could have been me – let’s work out what the risk is and whether we should mitigate it’.”

So can infosecurity professionals guard against our inherent cognitive biases? “Our brain has been built not to be a computer, not to be rational, not to be logical,” he says. “There are ways to train around it. It’s hard, and it involves education and training. This is the kind of thing where you teach policemen not to react with their gut, but to stop and think. You want to train a CEO to think about risk, and a lot of business tries to do this. It doesn’t do it in security very well, but I think it can be fixed.”

So the answer is education and ignoring gut instinct? “Or at least understanding where your gut instinct goes wrong,” Schneier says. “If you understand the pathologies, you can correct them. If I know I see things as more optimistic than they are, I can know that and correct it, just like if I know that I see blue darker than it really is, I can in my head correct for it.” The aim of the paper is to highlight our biases: “Here’s how the brain works when it’s thinking about security, we as security technologists need to understand this. BT has a risk cockpit, this fancy console that it uses to show executives what their security posture on the network is. If we don’t know the cognitive biases of the people looking at it, we’re not going to design it well – that’s just the way it is. We will do a better job if we know how things will be perceived.”

“Infosecurity always gets in the way of business. Security gets in the way – that’s its job, whether it’s a door lock, or airport security, or a network firewall. We want it to get in the way because it does something good. You don’t want it, because it makes your life more difficult, and there’s inherently a battle between getting things done and being secure. And usually getting things done wins, which is why a lot of security is so poor.

“In a sense, security is a tax on the honest. When I got to the show, I had to stand in line and get a badge. Why did I have to get a badge? Because if I don’t, some people will try to sneak in. If everyone was honest, I could have saved a whole bunch of time.”

“These are not technological problems, so be careful of technological solutions,” Schneier warns. It is the same mistake as what he calls “nonsense counter-terrorism policy: people mitigate against particular tactics, rather than the broad threat, so the tactics just change. If I can’t use a USB key, I’ll use something else. If you can’t blow up an airplane, blow up a shopping mall. We’re not solving the problems,” he says. “It’s really important to look at the broad threats, rather than the particulars of a tactic.

“The real security is, you can’t download and carry around sensitive data, and the only way you’re going to solve that is by hiring honest people,” he says. It is possible to create draconian security – Counterpane has such a system for staff who work on sensitive customer data, with terminals that have no printers, USB ports, disc drives or external network connections. “You can do that. Does it get in the way? By God it does. But that’s the point, because we have to guarantee the security of our customers’ data. In most instances, companies can’t be that draconian.”

Schneier believes that the wider world will better understand risk in the future, although this may take some time. He praises a recent report by the Royal Academy of Engineering which argues that security and privacy are not in opposition, and that we can have both: “Isn’t that a good report? Did you read it?” he asks. Some reports suggested this was naïve. “It’s not naïve, it’s difficult,” he replies. “But if we don’t have an ideal to shoot for, we’re never going to get anywhere close. I love it that they said those things – they put a stake in the ground and said, this is where we should go. They didn’t say we will get there tomorrow, they didn’t say it’s going to be perfect, they didn’t say it will be easy; they said this is what we should do. And I think the recommendations were spot on.

“There are sensible solutions. I don’t have near-term optimism. I think we’re living in a time of stupid security. I think our fears of terrorism make us do all kinds of crazy, stupid, self-destructive things. But long-term, 10, 15 years, yes, I’m very optimistic that we will maintain privacy and liberty, we will continue the march towards freedom of the past millennium, and it will not be reversed.

“Martin Luther King Jr. said, the arc of history is long but it bends towards justice. And yeah, these past five years have been pretty terrible for freedom and liberties, privacy and democracy, but you know 100 years ago women couldn’t vote. 200 years ago in my country, blacks were slaves. Things get better – they get better slowly though.”
