Sue Companies, Not Coders

  • Bruce Schneier
  • Wired
  • October 20, 2005

At a security conference last week, Howard Schmidt, the former White House cybersecurity adviser, took the bold step of arguing that software developers should be held personally accountable for the security of the code they write.

He’s on the right track, but he’s made a dangerous mistake. It’s the software manufacturers that should be held liable, not the individual programmers. Getting this one right will result in more-secure software for everyone; getting it wrong will simply result in a lot of messy lawsuits.

To understand the difference, it’s necessary to understand the basic economic incentives of companies, and how businesses are affected by liabilities. In a capitalist society, businesses are profit-making ventures, and they make decisions based on both short- and long-term profitability. They try to balance the costs of more-secure software—extra developers, fewer features, longer time to market—against the costs of insecure software: expense to patch, occasional bad press, potential loss of sales.

The result is what you see all around you: lousy software. Companies find that it’s cheaper to weather the occasional press storm, spend money on PR campaigns touting good security, and fix public problems after the fact than to design security right from the beginning.

The problem with this analysis is that most of the costs of insecure software fall on the users. In economics, this is known as an externality: an effect of a decision not borne by the decision maker. Pollution is the classic example: a factory that dumps waste into a river enjoys cheap disposal, while the people downstream pay the cost.

Normally, you would expect users to respond by favoring secure products over insecure products—after all, they’re making their own buying decisions based on the same capitalist model. But that’s not generally possible. In some cases, software monopolies limit the available product choice; in other cases, the “lock-in effect” created by proprietary file formats or existing infrastructure or compatibility requirements makes it harder to switch; and in still other cases, none of the competing companies have made security a differentiating characteristic. In all cases, it’s hard for an average buyer to distinguish a truly secure product from an insecure product with a “boy, are we secure” marketing campaign.

The end result is that insecure software is common. But because users, not software manufacturers, pay the price, nothing improves. Making software manufacturers liable fixes this externality.

Watch the mechanism work. If end users can sue software manufacturers for product defects, then the cost of those defects to the software manufacturers rises. Manufacturers are now paying the true economic cost for poor software, and not just a piece of it. So when they’re balancing the cost of making their software secure versus the cost of leaving their software insecure, there are more costs on the latter side. This will provide an incentive for them to make their software more secure.
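
To make that arithmetic concrete, here is a minimal sketch in Python. The figures are purely hypothetical (the article gives no numbers); it simply compares what a manufacturer pays for shipping insecure software with and without liability.

```python
# Hypothetical figures for illustration only; none come from the article.
SECURE_DEV_COST = 10_000_000   # extra developers, fewer features, later ship date
PATCH_AND_PR_COST = 3_000_000  # manufacturer's current cost of shipping insecure code
USER_LOSSES = 20_000_000       # the externality: losses currently borne by users

def cost_of_insecurity(manufacturer_liable: bool) -> int:
    """Cost the manufacturer actually pays for shipping insecure software."""
    cost = PATCH_AND_PR_COST
    if manufacturer_liable:
        cost += USER_LOSSES    # liability internalizes the users' losses
    return cost

# Without liability: insecurity costs the manufacturer $3M versus $10M for
# security, so the rational choice is to ship insecure software.
assert cost_of_insecurity(manufacturer_liable=False) < SECURE_DEV_COST

# With liability: insecurity now costs $23M, so spending $10M on security wins.
assert cost_of_insecurity(manufacturer_liable=True) > SECURE_DEV_COST
```

The point is not the particular figures but the structure: liability moves the users' losses onto the manufacturer's side of the ledger, which can flip the rational choice.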

To be sure, making software more secure will cost money, and manufacturers will have to pass those costs on to users in the form of higher prices. But users are already paying extra costs for insecure software: costs of third-party security products, costs of consultants and security-services companies, direct and indirect costs of losses. Making software manufacturers liable moves those costs around, and as a byproduct causes the quality of software to improve.

This is why Schmidt’s idea won’t work. He wants individual software developers to be liable, and not the corporations. This will certainly give pissed-off users someone to sue, but it won’t reduce the externality and it won’t result in more-secure software.

Computer security isn’t a technological problem—it’s an economic problem. Socialists might imagine that companies will improve software security out of the goodness of their hearts, but capitalists know that it needs to be in companies’ economic best interest. We’ll have fewer vulnerabilities when the entities that have the capability to reduce those vulnerabilities have the economic incentive to do so. And this is why solutions like liability and regulation work.
