Blaming The User Is Easy -- But It's Better to Bypass Them Altogether
By Bruce Schneier
Blaming the victim is common in IT: users are to blame because they don't patch their systems, choose lousy passwords, fall for phishing attacks, and so on. But, while users are, and will continue to be, a major source of security problems, focusing on them is an unhelpful way to think.
People regularly fail to do things they are supposed to: change the oil in their cars, go to the dentist, replace the batteries in their smoke detectors. Why? Because people learn from experience. If something is immediately harmful, such as touching a hot stove or petting a live tiger, they quickly learn not to do it. But if someone skips an oil change, ignores a computer patch, or chooses a lousy password, it's unlikely to matter. No feedback, no learning.
We've tried to solve this in several ways. We give people rules of thumb: change the oil every 5,000 miles; follow secure password guidelines. Or we send notifications: smoke alarms beep at us, dentists send postcards, Google warns us if we are about to visit a website suspected of hosting malware. But, again, the effects of ignoring these aren't generally felt immediately.
This makes security primarily a hindrance to the user. It's a recurring obstacle: something that interferes with the seamless performance of the user's task. And it's human nature, wired into our reasoning skills, to remove recurring obstacles. So, if the consequences of bypassing security aren't obvious, then people will naturally do it.
This is the problem with Microsoft's User Account Control (UAC). Introduced in Vista, the idea is to improve security by limiting the privileges applications have when they're running. But the security prompts pop up too frequently, and there's rarely any ill effect from ignoring them. So people do ignore them.
This doesn't mean user education is worthless. On the contrary, user education is an important part of any corporate security program. And at home, the more users understand security threats and hacker tactics, the more secure their systems are likely to be. But we should also recognise the limitations of education.
The solution is to design security systems that assume uneducated users: systems that prevent them from changing security settings that would leave them exposed to undue risk, or -- even better -- that take security out of their hands entirely.
For example, we all know that backups are a good thing. But if you forget to do a backup this week, nothing terrible happens. In fact, nothing terrible happens for years on end when you forget. So, despite what you know, you start believing that backups aren't really that important. Apple got the solution right with its backup utility, Time Machine. Install it, plug in an external hard drive, and you are automatically backed up against hardware failure and human error. It's easier to use it than not to.
For its part, Microsoft has made great strides in securing its operating system, providing default security settings in Windows XP and even more in Windows Vista to ensure that, when a naive user plugs a computer in, it's not defenceless.
Unfortunately, blaming the user can be good business. Mobile phone companies save money if they can bill their customers when a calling card number is stolen and used fraudulently. British banks save money by blaming users when they are victims of chip-and-pin fraud. This is continuing, with some banks going so far as to accuse the victim of perpetrating the fraud, despite evidence of large-scale fraud by organised crime syndicates.
The legal system needs to fix the business problems, but system designers need to work on the technical problems. They must accept that security systems that require the user to do the right thing are doomed to fail. And then they must design resilient security nevertheless.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.