Essays in the Category "Economics of Security"

Information Security: How Liable Should Vendors Be?

  • Bruce Schneier
  • Computerworld
  • October 28, 2004

An update to this essay was published in ENISA Quarterly in January 2007.

Information insecurity is costing us billions. We pay for it in theft: information theft, financial theft. We pay for it in productivity loss, both when networks stop working and in the dozens of minor security inconveniences we all have to endure. We pay for it when we have to buy security products and services to reduce those other two losses. We pay for security, year after year.

The problem is that all the money we spend isn’t fixing the problem. We’re paying, but we still end up with insecurities…

Security and Compliance

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2004

It’s been said that all business-to-business sales are motivated by either fear or greed. Traditionally, security products and services have been a fear sell: fear of burglars, murderers, kidnappers, and—more recently—hackers. Despite repeated attempts by the computer security industry to position itself as a greed sell—“better Internet security will make your company more profitable because you can better manage your risks”—fear remains the primary motivator for the purchase of network security products and services…

Hacking the Business Climate for Network Security

  • Bruce Schneier
  • IEEE Computer
  • April 2004

Computer security is at a crossroads. It’s failing, regularly, and with increasingly serious results. CEOs are starting to notice. When they finally get fed up, they’ll demand improvements. (Either that or they’ll abandon the Internet, but I don’t believe that is a likely possibility.) And they’ll get the improvements they demand; corporate America can be an enormously powerful motivator once it gets going.

For this reason, I believe computer security will improve eventually. I don’t think the improvements will come in the short term, and I think that they will be met with considerable resistance. This is because the engine of improvement will be fueled by corporate boardrooms and not computer-science laboratories, and as such won’t have anything to do with technology. Real security improvement will only come through liability: holding software manufacturers accountable for the security and, more generally, the quality of their products. This is an enormous change, and one the computer industry is not going to accept without a fight…

Liability changes everything

  • Bruce Schneier
  • Heise Security
  • November 2003

Computer security is not a problem that technology can solve. Security solutions have a technological component, but security is fundamentally a people problem. Businesses approach security as they do any other business uncertainty: in terms of risk management. Organizations optimize their activities to minimize their cost-risk product, and understanding those motivations is key to understanding computer security today.

It makes no sense to spend more on security than the original cost of the problem, just as it makes no sense to pay liability compensation for damage done when spending money on security is cheaper. Businesses look for financial sweet spots—adequate security for a reasonable cost, for example—and if a security solution doesn’t make business sense, a company won’t do it…
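The risk-management calculation the essay describes can be made concrete with a small sketch. All figures below are hypothetical, chosen only to illustrate the point: a security control is worth buying only if it reduces expected loss by more than it costs.

```python
def expected_loss(probability: float, impact: float) -> float:
    """Annualized expected loss: chance of the incident times its cost."""
    return probability * impact

# Hypothetical figures: a breach costing $1,000,000 with a 5% annual chance.
baseline = expected_loss(0.05, 1_000_000)       # $50,000/year at risk

# A hypothetical $30,000/year control that halves the breach probability.
with_control = expected_loss(0.025, 1_000_000)  # $25,000/year at risk
control_cost = 30_000

# The control removes $25,000 of expected loss but costs $30,000:
# the cost-risk product is worse with it than without it.
net_benefit = (baseline - with_control) - control_cost
print(net_benefit)  # -5000.0: no business case for this control
```

With these numbers the rational choice is to skip the control, exactly the "financial sweet spot" logic above: not maximum security, but adequate security at a reasonable cost.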

Should Vendors be Liable for Their Software's Security Flaws?

  • Bruce Schneier
  • Network World
  • April 22, 2002

Network security is not a technological problem; it’s a business problem. The only way to address it is to focus on business motivations. To improve the security of their products, companies—both vendors and users—must care; for companies to care, the problem must affect stock price. The way to make this happen is to start enforcing liabilities.

The only way to get many companies to spend significant resources to ensure the security of their customers’ data is to hold them liable for misuse of this data. Similarly, the only way to get software vendors to reduce features, lengthen development cycles and invest in secure software development processes is to hold them liable for security vulnerabilities in their products…

Cyber Underwriters Lab?

  • Bruce Schneier
  • Communications of the ACM
  • April 2001

Underwriters Laboratories (UL) is an independent testing organization created in 1893, when William Henry Merrill was called in to find out why the Palace of Electricity at the Columbian Exposition in Chicago kept catching on fire (which is not the best way to tout the wonders of electricity). After making the exhibit safe, he realized he had a business model on his hands. Eventually, if your electrical equipment wasn’t UL certified, you couldn’t get insurance.

Today, UL rates all kinds of equipment, not just electrical. Safes, for example, are rated based on time to crack and strength of materials. A “TL-15” rating means that the safe is secure against a burglar who is limited to safecracking tools and 15 minutes’ working time. These ratings are not theoretical; actual hotshot safecrackers employed by UL take actual safes and test them. Applying this sort of thinking to computer networks—firewalls, operating systems, Web servers—is a natural idea. And the newly formed Center for Internet Security (no relation to UL) plans to implement it…
