Essays in the Category "Economics of Security"

How Security Companies Sucker Us With Lemons

  • Bruce Schneier
  • Wired
  • April 19, 2007

More than a year ago, I wrote about the increasing risks of data loss because more and more data fits in smaller and smaller packages. Today I use a 4-GB USB memory stick for backup while I am traveling. I like the convenience, but if I lose the tiny thing I risk all my data.

Encryption is the obvious solution for this problem—I use PGPdisk—but Secustick sounds even better: It automatically erases itself after a set number of bad password attempts. The company makes a bunch of other impressive claims: The product was commissioned, and eventually approved, by the French intelligence service; it is used by many militaries and banks; its technology is revolutionary…
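
As a thought experiment, the erase-on-failure policy is easy to sketch in code. The following is a minimal illustration, assuming a five-attempt threshold and a software wipe; Secustick's actual firmware is proprietary, and nothing here describes its real implementation.

```python
# Sketch of a "self-erase after N bad passwords" policy. All details
# (threshold, key derivation, wipe step) are illustrative assumptions.
import hashlib
import hmac
import secrets

MAX_ATTEMPTS = 5  # assumed threshold, not Secustick's actual value

class SelfErasingStore:
    def __init__(self, password: str):
        self.salt = secrets.token_bytes(16)
        self.key_hash = hashlib.pbkdf2_hmac(
            "sha256", password.encode(), self.salt, 100_000)
        self.failures = 0
        self.data = bytearray(b"secret payload")

    def unlock(self, password: str) -> bool:
        candidate = hashlib.pbkdf2_hmac(
            "sha256", password.encode(), self.salt, 100_000)
        if hmac.compare_digest(candidate, self.key_hash):
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= MAX_ATTEMPTS:
            # Overwrite, then discard, the protected data.
            for i in range(len(self.data)):
                self.data[i] = 0
            self.data = bytearray()
        return False
```

Note that the security of such a scheme depends entirely on where the failure counter lives: a count enforced by host-side software, rather than by the device's own hardware, can simply be reset or bypassed.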

Does Secrecy Help Protect Personal Information?

  • Bruce Schneier
  • Information Security
  • January 2007

This essay appeared as the second half of a point-counterpoint with Marcus Ranum. Marcus’s side can be found on his website.

Personal information protection is an economic problem, not a security problem. And the problem can be easily explained: The organizations we trust to protect our personal information do not suffer when information gets exposed. On the other hand, individuals who suffer when personal information is exposed don’t have the capability to protect that information.

There are actually two problems here: Personal information is easy to steal, and it’s valuable once stolen. We can’t solve one problem without solving the other. The solutions aren’t easy, and you’re not going to like them…

Schneier: Full Disclosure of Security Vulnerabilities a 'Damned Good Idea'

  • Bruce Schneier
  • CSO Online
  • January 2007

Full disclosure—the practice of making the details of security vulnerabilities public—is a damned good idea. Public scrutiny is the only reliable way to improve security, while secrecy only makes us less secure.

Unfortunately, secrecy sounds like a good idea. Keeping software vulnerabilities secret, the argument goes, keeps them out of the hands of the hackers (See The Vulnerability Disclosure Game: Are We More Secure?). The problem, according to this position, is less the vulnerability itself and more the information about the vulnerability…

Drugs: Sports' Prisoner's Dilemma

  • Bruce Schneier
  • Wired
  • August 10, 2006

The big news in professional bicycle racing is that Floyd Landis may be stripped of his Tour de France title because he tested positive for a banned performance-enhancing drug. Sidestepping the issues of whether professional athletes should be allowed to take performance-enhancing drugs, how dangerous those drugs are, and what constitutes a performance-enhancing drug in the first place, I’d like to talk about the security and economic issues surrounding doping in professional sports.

Drug testing is a security issue. Various sports federations around the world do their best to detect illegal doping, and players do their best to evade the tests. It’s a classic security arms race: Improvements in detection technologies lead to improvements in drug-detection evasion, which in turn spur the development of better detection capabilities. Right now, it seems that the drugs are winning; in places, these drug tests are described as “intelligence tests”: If you can’t get around them, you don’t deserve to play…
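
The prisoner’s dilemma of the title can be made concrete with a toy payoff matrix. The numbers below are invented for illustration; the point is only the structure: doping is each athlete’s best response no matter what the rival does, even though both would prefer a clean contest.

```python
# Toy prisoner's-dilemma payoffs for two rival athletes; the numbers are
# invented for illustration. Entries are (row player, column player).
payoffs = {
    ("clean", "clean"): (3, 3),  # fair contest, no health or legal risk
    ("clean", "dope"):  (0, 4),  # clean athlete loses to the doper
    ("dope",  "clean"): (4, 0),
    ("dope",  "dope"):  (1, 1),  # level field again, minus the risks
}

def best_response(opponent: str) -> str:
    # The row player's payoff-maximizing move against a fixed opponent move.
    return max(["clean", "dope"], key=lambda me: payoffs[(me, opponent)][0])

# Doping dominates either way, so (dope, dope) is the equilibrium even
# though (clean, clean) would leave both athletes better off.
assert best_response("clean") == "dope"
assert best_response("dope") == "dope"
```

Testing, in this framing, is an attempt to change the payoffs: raise the expected cost of doping enough and staying clean becomes the dominant strategy.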

Google's Click-Fraud Crackdown

  • Bruce Schneier
  • Wired
  • July 13, 2006

Google’s $6 billion-a-year advertising business is at risk because it can’t be sure that anyone is looking at its ads. The problem is called click fraud, and it comes in two basic flavors.

With network click fraud, you host Google AdSense advertisements on your own website. Google pays you every time someone clicks on its ad on your site. It’s fraud if you sit at the computer and repeatedly click on the ad or—better yet—write a computer program that repeatedly clicks on the ad. That kind of fraud is easy for Google to spot, so the clever network click fraudsters simulate different IP addresses, or install Trojan horses on other people’s computers to generate the fake clicks…
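
To illustrate why the crude version is easy for Google to spot, here is a minimal frequency-based flagging sketch. The threshold and data shape are assumptions made up for this example; a real ad network’s detection pipeline is far more sophisticated.

```python
# Naive click-fraud flagging: many clicks on one ad from one IP in a short
# window look suspicious. The threshold is invented for illustration.
from collections import Counter

def flag_suspicious(clicks, threshold=10):
    """clicks: iterable of (ip, ad_id) pairs observed in one time window."""
    counts = Counter(clicks)
    return {pair for pair, n in counts.items() if n >= threshold}

log = [("203.0.113.7", "ad42")] * 25 + [("198.51.100.2", "ad42")] * 3
print(flag_suspicious(log))  # {('203.0.113.7', 'ad42')}
```

This is exactly why the cleverer fraudsters described above simulate different IP addresses or spread the clicks across other people’s compromised machines: the per-source counts never cross the threshold.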

It's the Economy, Stupid

  • Bruce Schneier
  • Wired
  • June 29, 2006

I’m sitting in a conference room at Cambridge University, trying to simultaneously finish this article for Wired News and pay attention to the presenter onstage.

I’m in this awkward situation because 1) this article is due tomorrow, and 2) I’m attending the fifth Workshop on the Economics of Information Security, or WEIS: to my mind, the most interesting computer security conference of the year.

The idea that economics has anything to do with computer security is relatively new. Ross Anderson and I seem to have stumbled upon the idea independently. He, in his brilliant article from 2001, “…

Make Vendors Liable for Bugs

  • Bruce Schneier
  • Wired
  • June 1, 2006

Have you ever been to a retail store and seen this sign on the register: “Your purchase free if you don’t get a receipt”? You almost certainly didn’t see it in an expensive or high-end store. You saw it in a convenience store, or a fast-food restaurant. Or maybe a liquor store. That sign is a security device, and a clever one at that. And it illustrates a very important rule about security: It works best when you align interests with capability.

If you’re a store owner, one of your security worries is employee theft. Your employees handle cash all day, and dishonest ones will pocket some of it for themselves. The history of the cash register is mostly a history of preventing this kind of theft. Early cash registers were just boxes with a bell attached. The bell rang when an employee opened the box, alerting the store owner—who was presumably elsewhere in the store—that an employee was handling money…

Sue Companies, Not Coders

  • Bruce Schneier
  • Wired
  • October 20, 2005

At a security conference last week, Howard Schmidt, the former White House cybersecurity adviser, took the bold step of arguing that software developers should be held personally accountable for the security of the code they write.

He’s on the right track, but he’s made a dangerous mistake. It’s the software manufacturers that should be held liable, not the individual programmers. Getting this one right will result in more-secure software for everyone; getting it wrong will simply result in a lot of messy lawsuits.

To understand the difference, it’s necessary to understand the basic economic incentives of companies, and how businesses are affected by liabilities. In a capitalist society, businesses are profit-making ventures, and they make decisions based on both short- and long-term profitability. They try to balance the costs of more-secure software—extra developers, fewer features, longer time to market—against the costs of insecure software: expense to patch, occasional bad press, potential loss of sales…
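
That balance can be put into a toy expected-cost calculation. Every figure below is an invented assumption; the point is only that adding liability for customer losses can flip which choice is cheaper for the vendor.

```python
# Toy expected-cost comparison for a software vendor. All figures are
# invented for illustration (units: arbitrary millions of dollars).
def total_cost(dev_cost, breach_prob, vendor_breach_cost, liability=0.0):
    # Up-front development cost plus expected losses if a breach occurs;
    # liability shifts part of the customers' losses back onto the vendor.
    return dev_cost + breach_prob * (vendor_breach_cost + liability)

insecure = total_cost(dev_cost=1.0, breach_prob=0.5, vendor_breach_cost=0.2)
secure = total_cost(dev_cost=1.6, breach_prob=0.1, vendor_breach_cost=0.2)
print(insecure, secure)  # 1.1 vs. 1.62: shipping insecure is cheaper

# With liability for customer losses, the ranking flips.
insecure_l = total_cost(1.0, 0.5, 0.2, liability=2.0)  # 2.1
secure_l = total_cost(1.6, 0.1, 0.2, liability=2.0)    # 1.82
print(insecure_l, secure_l)
```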

A Real Remedy for Phishers

  • Bruce Schneier
  • Wired
  • October 6, 2005

Last week California became the first state to enact a law specifically addressing phishing. Phishing, for those of you who have been away from the internet for the past few years, is when an attacker sends you an e-mail falsely claiming to be a legitimate business in order to trick you into giving away your account info—passwords, mostly. When this is done by hacking DNS, it’s called pharming.

Financial companies have until now avoided taking on phishers in a serious way, because it’s cheaper and simpler to pay the costs of fraud. That’s unacceptable, however, because consumers who fall prey to these scams pay a price that goes beyond financial losses, in inconvenience, stress and, in some cases, blots on their credit reports that are hard to eradicate. As a result, lawmakers need to do more than create new punishments for wrongdoers—they need to create tough new incentives that will effectively force financial companies to change the status quo and improve the way they protect their customers’ assets. Unfortunately, the California …

Economics of Information Security

  • Ross Anderson and Bruce Schneier
  • IEEE Security & Privacy
  • January/February 2005

Several years ago, a number of researchers began to realize that many security systems fail not so much for technical reasons as from misplaced incentives. Often the people who could protect a system were not the ones who suffered the costs of failure. Hospital medical-records systems provided comprehensive billing-management features for the administrators who specified them, but were not so good at protecting patients’ privacy. Automatic teller machines suffered from fraud in countries like the United Kingdom and the Netherlands, where poor regulation left banks without sufficient incentive to secure their systems, and allowed them to pass the cost of fraud along to their customers. And one reason the Internet is insecure is that liability for attacks is so diffuse…
