Interview with Bruce Schneier

By Ed Cone
Know It All
November 6, 2008

An edited version of this interview will appear in CIO Insight.

I asked security guru Bruce Schneier about those troublesome voting machines and the mindset that foists them upon us.

Schneier: The security of voting machines points to two big issues. The first one is that security is actually very hard. People think technology magically makes security worries a thing of the past, but it's just not true. Security is very hard and very subtle.
These voting machine companies were no better than any other software or hardware company we've seen in the past few years. They did a really lousy job. And because the systems were proprietary, because the companies had a vested interest in keeping the flaws secret, the public didn't know about them. So we have this problem of insecure voting machines.
The other thing it points to is the need to have backup systems that work. When you have an insecure system, or a system that could be insecure, the way you make it secure is often not by spending huge amounts of money to secure the system, but by having secure backup procedures, or secure procedures around the system. And that's why people who understand computer security call for voter-verifiable paper trails. No matter what the machine is, what it does, whether it works or not, whether it's hackable or not, it's got a paper backup to fall back on if something happens.
Another issue with voting is that we only do it every other year. An ATM system gets used thousands or millions of times a day, every day, so problems are found and fixed. Voting, we forget about it, so it's much harder to build up any institutional knowledge of how to do it. People come to the voting booths, and the machines are different this year. They've never been taught, they've never been trained, the poll workers have different machines, there isn't the familiarity they get with a VCR.

Know It All: ATMs and gas pumps seem pretty secure. Are there institutional reasons that the government seems to get this stuff so wrong so often?

Schneier: There are a couple of reasons that things like automatic teller machines and gas pumps are more secure. The first one is, there's money involved. If someone hacks an ATM, the bank loses money. The bank has a financial interest in making those ATMs secure. If someone hacks a voting machine, nobody loses money. In fact, half the country is happy with the result. So it's much harder to get the economic incentives aligned.
The other issue about voting machines is that ballots are secret. A lot of the security in computerized financial systems is based on audits, based on being able to unravel a transaction. If you go to an ATM and you push a bunch of buttons and you get out ten times the cash you were supposed to, that's a mistake, but that mistake will be caught in audit, and likely, you will be figured out as the person who got the money by accident, and it will be taken out of your account. Because ballots are secret, a lot of the auditing tools that we in the community have developed for financial systems don't apply.

So the government takes the wrong approach to voting-machine security. What about airport security and its role in preventing terrorism? Bruce Schneier is not much more encouraging on that front than he was on the voting machines:

Bruce Schneier: The mistake is the focus. Counterterrorism in the United States is very much a political issue. It's important as a politician to defend against what the bad guys did last week, because you're going to look really bad if they do it again. So you're forced, really by politics, to spend more money defending against particular tactics than you are defending against the broad threat. And the TSA is the artifact of that.
The terrorists used airlines in 2001 in a particular way, and we need to make sure they never do that particular thing again. So what we get is an institution focused on defending against tactics rather than the threat. And like any institution, once it's formed, once it's brought into existence, it has to continue to justify its own existence. So you get an ever-increasing amount of airline security at the expense of general security.
Remember, every dollar spent taking away liquids is a dollar not being spent on Arabic translators. And taking away liquids only works if you're lucky enough to guess the plot correctly. Arabic translators work regardless of what the plot is.

Know It All: You recently demonstrated an ability to get liquids past security in any case. But you do say there has been some progress in securing air travel.

Schneier: I've said there are two things - reinforcing the cockpit doors, and convincing passengers they have to fight back. Everything else has been a red herring. People have argued with me that sky marshals also have been effective, but sky marshals are interesting - it's not the sky marshals that are effective, it's the idea of sky marshals that is effective. If you convince the public that you have sky marshals, you don't actually need them.
Know It All: Government seems to consistently mismanage IT projects. It overestimates capabilities, and underestimates human factors. Why does this happen?
Schneier: Government is just big, and I think big is bad at this. If it's a massive company like an airline or an automobile manufacturer, they have the same problems with overrun systems -- the difference is, they're more likely to pull the plug quicker. Because they have a financial bottom line to worry about every year, every quarter, whereas a government is more likely to have an entrenched bureaucracy.
Know It All: If I'm the CIO of a big company, what do I learn from this?
Schneier: You should learn that if you don't get the economics right, no security will work. A lot of these failures aren't technical failures, they're motivational failures, they're tradeoff failures, they're failures in the economics that make the system work. You should take away from this that security is hard, that it's not a matter of tossing a piece of technology at a problem, and suddenly it works. The devil is in the details, and the details are complicated.

Know It All: One of the more compelling security stories I worked on involved a casino -- there was a real culture of security there, lots of technology of course but everyone expected to be watched. The CIO has no problem checking his laptop in and out every day, dealers yell out every time they break a $20 bill -- it reminds me of the example you have used of the bell on cash registers being there to alert the store owner that the clerk is handling money.

Schneier: It's an old culture, and it's a culture that's used to dealing with cash. It's a culture that isn't forgiving of security breaches. And it's not just players cheating, it's employees. Most theft, most fraud at casinos is from minimum wage employees, it's from all those dealers. So they've had a culture for decades of people watching people watching people - dealers watch customers, pit bosses watch dealers, floor managers watch pit bosses, the cameras watch everybody. There are audits, there are controls every which way, because they're dealing in a very high volume cash business, but they needed to build a system of checks and balances - they couldn't just have everything be on credit cards and check it at the end of the month.
Know It All: How do you inculcate that kind of culture in your people if you're in another industry?
Schneier: You probably can't do it, and it's probably wrong to try. People are inherently nice. They're social. The reason social engineering works is because people are polite and helpful and friendly. And you could inculcate them to be mean, surly, suspicious, and nasty, but honestly you'd probably go out of business. You could imagine setting up a bank where everyone is strip-searched when they go into the building. We'd be more secure, but it wouldn't be a very profitable bank. You could imagine a department store where everybody is watching, everybody is suspicious, everything is paid attention to - and nobody's going to shop there.
Security is a tradeoff. And these types of human security issues, human attacks, social engineering, all prey on the inherent qualities that you want in your employees. You want them to be friendly and helpful. You want them to be team players. You can turn them into something else, but your company is going to suffer. What that means is that we're probably going to have to accept a certain amount of social engineering as the price of being in business. So now the question is, what sort of controls can I put in place, whether they be preventive or auditing, to limit the amount of damage that is inevitable, because I'm hiring pleasant people as employees.
Know It All: But people at casinos are nice, and they're not strip searching me. There's a culture of security, but it's a hospitality business.
Schneier: It's expensive. You can decide you want to pay it, you could have all the employees at a retail store be friendly, and hire an equal number of guards to look around. You get hospitality, and you get security, but you probably don't get profits. You might be able to train people to create that kind of culture, but that's expensive, too. These attacks prey on human nature. You're going into a business, I'm holding an armful of boxes and ask you to hold the door, you're going to hold the door for me, you're not going to ask to see my badge, and that kind of thing is what the attackers prey on, whether it be real or virtual, and to train that out of somebody makes society a much less pleasant place to live.
Know It All: I wrote recently about social responsibility in IT, and found that it's not just about donating used computers, but has come to incorporate safeguarding privacy and data. Can security be increased by leveraging people's good nature?
Schneier: I think there is a possibility there, especially in terms of data privacy - but that requires a bunch of things. It requires good whistle-blower protection laws, because you could have somebody who wants to do right by saying hey, my company is misbehaving, and it's going to need transparency, so that companies know what's going on with their data. So I think it could work, and I think it's a great thing, but it takes some cultural changes in business that business is going to resist.

Having discussed the role of culture and human nature, we turn to technology and prevention.

Know It All: What about brute force, tech-driven hacking, the kind of thing you see in movies -- who wins that arms race?

Bruce Schneier: It's a question of tactics, and in any given week one might best the other. Deciding who is ahead at the time of writing isn't really relevant. The bad guys have an objective, and they will take the easiest path. If the easiest path is tricking a secretary, then they'll do that. If the easiest path is a new vulnerability in Windows Vista that hasn't been patched yet, they'll do that. Figuring out which one they're going to do today tells you nothing about what they're going to do tomorrow. Like the TSA, we need to spend more effort on the general threat than focusing on what particular tactic is in vogue this week.
Know It All: What is the threat, really? Or more precisely, who is the threat to corporate IT?
Schneier: Mostly it's crime, the thing we have to worry about the most is criminals. Hacking changed from a hobbyist pursuit to a criminal pursuit. Criminals have gone international, they've gone up-market, they've gotten much more professional.
Crime comes in several flavors. The common one is what we call identity theft, which is basically fraud through impersonation. But we see attacking and owning of computers, sending of spam for commercial purposes, or for denial-of-service extortion. We're seeing more and more of that. It's still primarily targeted against fringe industries - online gambling, online porn - and fringe markets, like companies in the Caribbean, but it's rapidly growing. The question to ask is, if you are a large criminal organization and you have control over 100,000 computers, how could you make money with them? And you end up with the things that criminals are now doing.
Know It All: How does a CEO deal with all of this? What's the structure in terms of working with a CIO or CSO?
Schneier: The details of the structure matter less than the fact that senior management cares, and that there is communication among these various people. I don't care whether the CIO or the CSO is running IT security, as long as the two of them will talk. I don't care if the CSO is under the finance people or under the IT people, as long as when something happens he can talk to the right people. Where exactly the lines are drawn matters little or none.
What's important is to understand that IT security is part of security, and those are part of governance, and those are part of making the company profitable, and people have to make a bunch of hard tradeoffs. You have to decide whether more security is good or bad - it's good for security, but it could be bad for business. The decisions have to be made at a high enough level that you can make them intelligently, and that's far more important than exactly where things are connected.

Know It All: Have we seen the death of privacy?

Bruce Schneier: Scott McNealy made the famous comment -- you have zero privacy already, get over it -- and the death of privacy has been written about for many years. I think the death of privacy is overrated. Technically, the threats to privacy are enormous, but just because someone invented the camera doesn't mean that everyone gets naked pictures of themselves taken, and just because someone invented a recording device doesn't mean everything gets recorded.
Whenever you have technical advances that perturb our rights, the way you fix that is through laws. If you want to preserve privacy, don't look to technology, look to the legal system. Laws are trailing technology in general. You might have laws that protect your privacy for videotape rentals but don't apply to movies downloaded from the internet, or laws that protect the privacy of your mail as it goes through the post office but don't protect your email as it goes through ISPs. We're living in a world where a lot of laws are written to be technically specific, and they become obsolete when the technology changes so fast. Better laws are technologically invariant.
Know It All: Privacy is a cultural issue, too. A lot of younger people seem to be less concerned with privacy than their elders. Is that healthy?
Schneier: The Internet is the greatest generation gap since Rock and Roll. There's an enormous difference in the way the older and younger generations use the Internet, and that's healthy. We can look in horror at some things the younger generation is doing, but you're looking at the future.
It's not that young people don't care about privacy; they're very concerned about privacy. If you ask them, they'll tell you. They just have a different socialization. They want to have control over their data. What upsets them is if something happens to their data - say their pictures - that they don't want. We as the older generation are morally obligated to build systems that will allow the younger generation to communicate, to contribute, to be part of society, without forcing them into particular boxes that we might think are required of them.
Know It All: Dan Gillmor wrote a column for us suggesting that companies shouldn't maintain customer data -- you can't lose what you don't keep.
Schneier: That's the best way to secure customer data, not to have it. The way to make it work is to make companies liable for exposed customer data, to give them the economic problem of owning my data. They are the only entity that can protect it, yet when the data is lost, they don't feel the pain - I do. In a capitalist system, they won't protect the data to the extent I want. They can't. The free market doesn't support that kind of decision, so if you want to make companies more responsible with customer data, you need to fix that externality.
In order to do that, you need to give individuals the ability to sue companies, because then the cost of losing my data goes up. And I think you find that if the cost of losing someone's data is higher, because of the risk of a lawsuit, then companies will save less data. We can't force companies not to keep records, that doesn't make sense, either, but if you force companies to pay the true cost of maintaining and storing data, I think you'll find a lot less of it being stored.

Know It All: This is kind of depressing for a CIO -- my technology won't keep up, I need to be sued, my people can't be changed...what's the good news? What affirmative steps can I take to create a secure and profitable company?

Bruce Schneier: The good news is that society works. None of the problems I've talked about are new, they've been problems for thousands of years. Most people are honest and honorable, most of the time. If that weren't true, civilization would collapse. So most bad news is around the edges. Companies aren't going out of business every day. The lesson is, maybe you're worrying too much. Yes, it's your job to worry, but go outside and have some social perspective.
Lots of things are life and death, and there are bad actors, but even with respect to terrorism a little healthy skepticism is a good thing. There are people who are scared all the time, they've been defeated.
Know It All: Some people out there want people to be scared.
Schneier: Fear is a very common way to sell security. People buy things for reasons of fear or greed - either you want the thing, or you want to avoid something else. Security is inherently a fear sell. Nobody wants to buy security; they want to avoid what would happen if they didn't have it. It's perfectly understandable that companies push the fear button.
Know It All: Which is not to say that real problems don't exist.
Schneier: Of course.
Know It All: It's the balance between perception and reality that's hard.
Schneier: Right.


Photo of Bruce Schneier by Per Ervland.
