Bruce Schneier on IT Insecurity

There are no easy solutions to today's security challenges, and companies often approach them in the wrong way, says Bruce Schneier.

Talking with security expert Bruce Schneier does not always leave a person feeling more secure. That’s because Schneier doesn’t sell easy solutions. Instead, he challenges businesses, governments and individuals to examine their assumptions about risk, to eschew simplistic answers and to accept the fact that no system is—or can be—perfectly secure.

Now the chief security technology officer of BT, Schneier worked at the Department of Defense and Bell Labs before founding Counterpane Internet Security, which was acquired by BT. He has a master’s degree in computer science and a B.A. in physics.

Books by Schneier include Applied Cryptography, an explanation of codes and code-breaking that Wired magazine called “the book the National Security Agency wanted never to be published”; Secrets and Lies, about computer and network security; Beyond Fear, a look at issues facing individuals and organizations; and his latest, Schneier on Security, a collection of his articles and essays. Schneier discussed these issues with CIO Insight Senior Writer Edward Cone. This is an edited version of their conversation.

CIO Insight: What will a CIO take away from this conversation on security?

Bruce Schneier: If you don’t get the economics right, no security will work. A lot of failures aren’t technical failures—they’re motivational failures or tradeoff failures. Security is hard: It’s not simply a matter of tossing a piece of technology at a problem. The devil is in the details, and the details are complicated.

One of the more compelling security stories I worked on involved a casino that had a real culture of security. It had lots of technology, and everyone expected to be watched. The CIO had no problem checking his laptop in and out every day, and dealers yelled out every time they broke a $20 bill. It reminds me of the example you’ve used of the bell on cash registers being there to alert the store owner that the clerk is handling money.

Schneier: It’s an old culture—a culture that’s used to dealing with cash and that isn’t forgiving of security breaches. For decades, they’ve had a culture of people watching people watching people: Dealers watch customers, pit bosses watch dealers, floor managers watch pit bosses and the cameras watch everybody.

There are audits and controls every which way, because they’re dealing in a high-volume cash business. They had to build a system of checks and balances; they couldn’t just put everything on credit cards and check it at the end of the month.

How do you inculcate that kind of culture in your people if you’re in another industry?

Schneier: You probably can’t do it, and it’s probably wrong to try. People are inherently nice—and social. The reason social engineering works is because people are polite and helpful and friendly. You could inculcate them to be mean, surly, suspicious and nasty, but you’d probably go out of business.

Imagine setting up a bank where everyone is strip-searched when they go into the building. It would be more secure, but it wouldn’t be a very profitable bank. And imagine a department store where everybody is watching everything, and everybody is suspicious. Nobody is going to shop there.

Security is a tradeoff. These types of human security issues, human attacks, social engineering, all prey on the inherent qualities that you want in your employees. You want them to be friendly and helpful. You want them to be team players. You can turn them into something else, but your company is going to suffer.

We’re probably going to have to accept a certain amount of social engineering as the price of being in business. So now the question is, What sort of controls can I put in place—whether preventive or auditing—to limit the amount of damage that is inevitable, because I’m hiring pleasant people as employees?

But people at casinos are nice, and they’re not strip-searching me. There’s a culture of security, but it’s a hospitality business.

Schneier: That’s expensive. You can decide you want to pay for it: You could have all the employees at a retail store be friendly, and hire an equal number of guards to look around. You get hospitality, and you get security, but you probably don’t get profits. You might be able to train people to create that kind of culture, but that’s expensive, too.

We made it through the election without a disaster, but concerns over electronic voting machines persist. Why can’t government get such a basic security issue right?

Schneier: The security of voting machines points to two big issues. The first one is that security is actually very hard. People think technology magically makes security worries a thing of the past, but that’s not true. These voting machine companies are no better than any other software or hardware computer company. And because the systems were proprietary—because the companies had a vested interest in keeping the flaws secret—the public didn’t know about them.

That’s why we need to have backup systems that work. When you have an insecure system, or a system that could be insecure, the way you make it secure is often by having secure backup procedures or secure procedures around the system. That’s why people who understand computer security call for voter verifiable paper trails. Then, no matter what the machine is or what it does, whether it works or not, whether it’s hackable or not, it’s got a paper backup to fall back on if something happens.

The other issue with voting is that we only do it every other year. An ATM system gets used thousands of times a day, every day, so problems are found and fixed. With voting, we forget about it between elections, so it’s much harder to build up any institutional knowledge of how to do it. People came to the voting booths, and the machines were different this year. They’d never been taught how to use them, and there isn’t the familiarity they have with, say, a VCR.

ATMs and gas pumps seem pretty secure. Are there institutional reasons why the government seems to get this stuff so wrong so often?

Schneier: There are a couple of reasons why things like automatic teller machines and gas pumps are more secure. The first one is, there’s money involved. If someone hacks an ATM, the bank loses money, so the bank has a financial interest in making those ATMs secure.

If someone hacks a voting machine, nobody loses money. In fact, half the country is happy with the result. So it’s much harder to get the economic incentives aligned.

The other issue about voting machines is that ballots are secret. A lot of the security in computerized financial systems is based on audits, on being able to unravel a transaction. If you go to an ATM and you push a bunch of buttons and you get out 10 times the cash you were supposed to, that’s a mistake, but that mistake will be caught in an audit. It’s likely that the bank will figure out you got the money by accident, and it will be taken out of your account. Because ballots are secret, a lot of the auditing tools that we in the community have developed for financial systems don’t apply.

What about airport security and its role in preventing terrorism? Does the government get that right? Since you recently took two bottles of liquid through a checkpoint, I’m guessing the answer is “no.”

Schneier: The mistake is the focus. Counterterrorism in the United States is very much a political issue. It’s important for politicians to defend against what the bad guys did last week, because they’re going to look really bad if the terrorists do it again. So politicians are forced to spend more money defending against particular tactics than on defending against the broad threat.

The TSA [Transportation Security Administration] is the artifact of that. The terrorists used airlines in 2001 in a particular way, and we need to make sure they never do that particular thing again. So what we get is an institution focused on defending against tactics rather than against the threat. And, like any institution, once it’s formed, the TSA has to continue to justify its existence. So you get an ever-increasing amount of airline security at the expense of general security.

Remember, every dollar spent taking away liquids is a dollar not being spent on Arabic translators. And taking away liquids only works if you’re lucky enough to guess the plot correctly. Arabic translators work regardless of what the plot is.

There are two things [that have been effective]: reinforcing the cockpit doors and convincing passengers they have to fight back. Everything else has been a red herring. People have argued with me that sky marshals also have been effective, but it’s not the sky marshals who are effective, it’s the idea of sky marshals that is effective. If you convince the public that you have sky marshals, you don’t actually need them.

Government seems to consistently mismanage IT projects. Why does this happen?

Schneier: Government is just big, and I think big is bad at this. A massive company like an airline or an automobile manufacturer has the same problems with overrun systems. The difference is that it’s more likely to pull the plug quicker, because it has a financial bottom line to worry about every year, every quarter. In contrast, government is more likely to have an entrenched bureaucracy.

The idea of social responsibility in IT is coming to mean more than simply donating used computers. It incorporates safeguarding privacy and data. Can security be increased by leveraging people’s good nature?

Schneier: I think there is a possibility there, especially in terms of data privacy, but it requires a bunch of things. It requires good whistle-blower protection laws, because you could have somebody who wants to do right by saying, “Hey, my company is misbehaving.” And it’s going to need transparency, so that people know what’s going on with their data. I think it could work, and I think it’s a great thing, but it requires some cultural changes in business that companies are going to resist.

What about brute-force, tech-driven hacking, the kind of thing you see in movies? Who wins that arms race?

Schneier: It’s a question of tactics, and, on any given week, one might be better than the other. Deciding who’s ahead at any given moment isn’t really relevant.

The bad guys have an objective, and they will take the easiest path. If the easiest path is tricking a secretary, then they’ll do that. If the easiest path is a new vulnerability in Windows Vista that hasn’t been patched yet, they’ll do that. Figuring out which one they’re going to do today tells you nothing about what they’re going to do tomorrow. Like the TSA, we need to spend more effort on the general threat than focusing on what tactic is in vogue this week.

Who or what is the main threat?

Schneier: Mostly it’s crime. Hacking changed from a hobbyist pursuit to a criminal pursuit. Criminals have gone international, they’ve gone up-market and they’ve gotten much more professional.

Crime comes in several flavors. The common one is what we call identity theft, which is basically fraud through impersonation. But we also see criminals attacking and owning computers, and using them to send spam for commercial purposes or for denial-of-service extortion.

We’re seeing more and more of that. It’s still primarily targeted against fringe industries, such as online gambling and online porn, and fringe markets, like companies in the Caribbean, but it’s growing rapidly. The question to ask is, If you’re a large criminal organization with control over 100,000 computers, how could you make money with them? You end up with the things that criminals are now doing.

How does a company deal with all this? What’s the structure in terms of working with a CIO or CSO?

Schneier: The details of the structure matter less than the fact that senior management cares, and that there is communication among these various people. I don’t care whether the CIO or the CSO is running IT security, as long as the two of them talk. I don’t care if the CSO is under the finance people or under the IT people, as long as he can talk to the right people when something happens.

It’s important to understand that IT security is part of overall security, and that is part of governance, and that is part of making the company profitable. So people have to make hard tradeoffs. You have to decide whether more security is good or bad: It may be good for security, but it could be bad for business. The decisions have to be made at a high enough level that you can make them intelligently, and that’s far more important than determining exactly where things are connected.

Have we seen the death of privacy?

Schneier: I think the death of privacy is overrated. Technically, the threats to privacy are enormous, but just because someone invented the camera doesn’t mean that everyone gets naked pictures of themselves, and just because someone invented a recording device doesn’t mean everything gets recorded.

Whenever you have technical advances that perturb our rights, the way you fix that is through laws. Don’t look to technology.

In general, laws are trailing technology. You might have laws that protect your privacy for videotape rentals, but don’t apply to downloaded movies on the Internet. Other laws might protect the privacy of your mail as it goes through the post office, but don’t protect your e-mail as it goes through ISPs. We’re living in a world where a lot of laws are written to be technically specific, and they are becoming obsolete because the technology changes so fast. Better laws are technologically invariant.

A lot of younger people seem to be less concerned with privacy than their elders. Is that healthy?

Schneier: The Internet is responsible for the greatest generation gap since rock and roll. There’s an enormous difference in the way the older and younger generations use the Internet, and that’s healthy. We can look in horror at some things the younger generation is doing, but you’re looking at the future.

It’s not that young people don’t care about privacy; they just have a different socialization. They want to have control over their data: What upsets them is if something happens to their data (say, their photos) that they don’t want. We as the older generation are morally obligated to build systems that allow the younger generation to communicate, to contribute and be part of society without forcing them into particular boxes that we think are required of them.

This is kind of depressing for CIOs: Their technology won’t keep up, they need to be sued, and their people can’t be changed. What’s the good news?

Schneier: The good news is that society works. None of the problems I’ve talked about are new: They’ve been problems for thousands of years. Most people are honest and honorable, most of the time. If that weren’t true, civilization would collapse.

Most bad news is around the edges. Companies aren’t going out of business every day. The lesson is that you may be worrying too much. Yes, it’s your job to worry, but go outside and have some social perspective.

Lots of things are life and death, and there are bad guys. But even with respect to terrorism, a little healthy skepticism is a good thing. People who are scared all the time have been defeated.
