Essays in the Category “Economics of Security”

"Stalker Economy" Here to Stay

  • Bruce Schneier
  • CNN
  • November 20, 2013

Google recently announced that it would start including individual users' names and photos in some ads. This means that if you rate some product positively, your friends may see ads for that product with your name and photo attached—without your knowledge or consent. Meanwhile, Facebook is eliminating a feature that allowed people to retain some portions of their anonymity on its website.

These changes come on the heels of Google's move to explore replacing tracking cookies with something that users have even less control over.

Read More →

A Fraying of the Public/Private Surveillance Partnership

  • Bruce Schneier
  • The Atlantic
  • November 8, 2013

The public/private surveillance partnership between the NSA and corporate data collectors is starting to fray. The reason is sunlight. The publicity resulting from the Snowden documents has made companies think twice before allowing the NSA access to their users' and customers' data.

Pre-Snowden, there was no downside to cooperating with the NSA.

Read More →

Why It's So Easy to Hack Your Home

  • Bruce Schneier
  • CNN
  • August 15, 2013

Last weekend a Texas couple apparently discovered that the electronic "baby monitor" in their children's bedroom had been hacked. According to a local TV station, the couple said they heard an unfamiliar voice coming from the room, went to investigate and found that someone had taken control of the camera monitor remotely and was shouting profanity-laden abuse. The child's father unplugged the monitor.

What does this mean for the rest of us? How secure are consumer electronic systems, now that they're all attached to the Internet?

Read More →

Take Stop-and-Scan with a Grain of Salt

Security Has Become a For-Profit Business

  • Bruce Schneier
  • New York Daily News
  • March 3, 2013

This is an edited version of a longer essay.

It's a new day for the New York Police Department, with technology increasingly informing the way cops do their jobs. With innovation come new possibilities, but also new concerns.

For one, the NYPD is testing a security apparatus that uses terahertz radiation to detect guns under clothing from a distance. As Police Commissioner Ray Kelly explained back in January, "If something is obstructing the flow of that radiation, for example a weapon, the device will highlight that object."

Ignore, for a moment, the glaring constitutional concerns, which make the stop-and-frisk debate pale in comparison: virtual strip-searching, evasion of probable cause, potential profiling.

Read More →

Why Framing Your Enemies Is Now Virtually Child's Play

In the eternal arms race between bad guys and those who police them, automated systems can have perverse effects

  • Bruce Schneier
  • The Guardian
  • October 15, 2009

A few years ago, a company began to sell a liquid with identification codes suspended in it. The idea was that you would paint it on your stuff as proof of ownership. I commented that I would paint it on someone else's stuff, then call the police.

I was reminded of this recently when a group of Israeli scientists demonstrated that it's possible to fabricate DNA evidence.

Read More →

Facebook Should Compete on Privacy, Not Hide It Away

  • Bruce Schneier
  • The Guardian
  • July 15, 2009

Reassuring people about privacy makes them more, not less, concerned. It's called "privacy salience", and Leslie John, Alessandro Acquisti, and George Loewenstein -- all at Carnegie Mellon University -- demonstrated this in a series of clever experiments. In one, subjects completed an online survey consisting of a series of questions about their academic behaviour -- "Have you ever cheated on an exam?" for example. Half of the subjects were first required to sign a consent warning -- designed to make privacy concerns more salient -- while the other half did not.

Read More →

Raising the Cost of Paperwork Errors Will Improve Accuracy

  • Bruce Schneier
  • The Guardian
  • June 24, 2009

It's a sad, horrific story. Homeowner returns to find his house demolished. The demolition company was hired legitimately, but there was a mistake and it demolished the wrong house. The company relied on GPS co-ordinates, but requiring street addresses isn't a solution.

Read More →

Do You Know Where Your Data Are?

  • Bruce Schneier
  • The Wall Street Journal
  • April 28, 2009

Do you know what your data did last night? Almost none of the more than 27 million people who took the RealAge quiz realized that their personal health data was being used by drug companies to develop targeted e-mail marketing campaigns.

There's a basic consumer protection principle at work here, and it's the concept of "unfair and deceptive" trade practices. Basically, a company shouldn't be able to say one thing and do another: sell used goods as new, lie on ingredients lists, advertise prices that aren't generally available, claim features that don't exist, and so on.

Read More →

An Enterprising Criminal Has Spotted a Gap in the Market

  • Bruce Schneier
  • The Guardian
  • April 2, 2009

Before his arrest, Tom Berge stole lead roof tiles from several buildings in south-east England, including the Honeywood Museum in Carshalton, the Croydon parish church, and the Sutton high school for girls. He then sold those tiles to scrap metal dealers.

As a security expert, I find this story interesting for two reasons. First, amid the attempts to ban, or at least censor, Google Earth, lest it help the terrorists, here is an actual crime that relied on the service: Berge needed Google Earth for reconnaissance.

Read More →

How Perverse Incentives Drive Bad Security Decisions

  • Bruce Schneier
  • Wired
  • February 26, 2009

An employee of Whole Foods in Ann Arbor, Michigan, was fired in 2007 for apprehending a shoplifter. More specifically, he was fired for touching a customer, even though that customer had a backpack filled with stolen groceries and was running away with them.

I regularly see security decisions that, like the Whole Foods incident, seem to make absolutely no sense. However, in every case, the decisions actually make perfect sense once you understand the underlying incentives driving the decision.

Read More →

Here Comes Here Comes Everybody

Book Review of Here Comes Everybody: The Power of Organizing Without Organizations
By Clay Shirky
Penguin Press: 2008. 336 pp. $25.95, ISBN: 978-1-59420-153-0

  • Bruce Schneier
  • IEEE Spectrum
  • September 2008

In 1937, Ronald Coase answered one of the most perplexing questions in economics: if markets are so great, why do organizations exist? Why don't people just buy and sell their own services in a market instead?

Read More →

Boston Court's Meddling With "Full Disclosure" Is Unwelcome

  • Bruce Schneier
  • Wired
  • August 21, 2008

In eerily similar cases in the Netherlands and the United States, courts have recently grappled with the computer-security norm of "full disclosure," asking whether researchers should be permitted to disclose details of a fare-card vulnerability that allows people to ride the subway for free.

The "Oyster card" used on the London Tube was at issue in the Dutch case, and a similar fare card used on the Boston "T" was the center of the U.S. case. The Dutch court got it right, and the American court, in Boston, got it wrong from the start -- despite facing an open-and-shut case of First Amendment prior restraint.

Read More →

Why Being Open about Security Makes Us All Safer in the Long Run

  • Bruce Schneier
  • The Guardian
  • August 7, 2008

German translation

London's Oyster card has been cracked, and the final details will become public in October. NXP Semiconductors, the Philips spin-off that makes the system, lost a court battle to prevent the researchers from publishing. People might be able to use this information to ride for free, but the sky won't be falling. And the publication of this serious vulnerability actually makes us all safer in the long run.

Read More →

Economics, Not Apathy, Exposes Chemical Plants To Danger

  • Bruce Schneier
  • Wired
  • October 18, 2007

It's not true that no one worries about terrorists attacking chemical plants; it's just that our politics seem to leave us unable to deal with the threat.

Toxins such as ammonia, chlorine, propane and flammable mixtures are constantly being produced or stored in the United States as a result of legitimate industrial processes. Chlorine gas is particularly toxic; in addition to bombing a plant, someone could hijack a chlorine truck or blow up a railcar. Phosgene is even more dangerous.

Read More →

Nonsecurity Considerations in Security Decisions

  • Bruce Schneier
  • IEEE Security & Privacy
  • May/June 2007

Security decisions are generally made for nonsecurity reasons. For security professionals and technologists, this can be a hard lesson. We like to think that security is vitally important. But anyone who has tried to convince the sales VP to give up her department's Blackberries or the CFO to stop sharing his password with his secretary knows security is often viewed as a minor consideration in a larger decision.

Read More →

How Security Companies Sucker Us With Lemons

  • Bruce Schneier
  • Wired
  • April 19, 2007

Danish translation

More than a year ago, I wrote about the increasing risks of data loss because more and more data fits in smaller and smaller packages. Today I use a 4-GB USB memory stick for backup while I am traveling. I like the convenience, but if I lose the tiny thing I risk all my data.

Encryption is the obvious solution for this problem -- I use PGPdisk -- but Secustick sounds even better: It automatically erases itself after a set number of bad password attempts.

Read More →

Schneier: Full Disclosure of Security Vulnerabilities a 'Damned Good Idea'

  • Bruce Schneier
  • CSO Online
  • January 2007

Full disclosure -- the practice of making the details of security vulnerabilities public -- is a damned good idea. Public scrutiny is the only reliable way to improve security, while secrecy only makes us less secure.

Unfortunately, secrecy sounds like a good idea. Keeping software vulnerabilities secret, the argument goes, keeps them out of the hands of the hackers (see "The Vulnerability Disclosure Game: Are We More Secure?").

Read More →

Does Secrecy Help Protect Personal Information?

  • Bruce Schneier
  • Information Security
  • January 2007

This essay appeared as the second half of a point-counterpoint with Marcus Ranum. Marcus's side can be found on his website.

Personal information protection is an economic problem, not a security problem. And the problem can be easily explained: The organizations we trust to protect our personal information do not suffer when information gets exposed. On the other hand, individuals who suffer when personal information is exposed don't have the capability to protect that information.

Read More →

Drugs: Sports' Prisoner's Dilemma

  • Bruce Schneier
  • Wired
  • August 10, 2006

The big news in professional bicycle racing is that Floyd Landis may be stripped of his Tour de France title because he tested positive for a banned performance-enhancing drug. Sidestepping the issues of whether professional athletes should be allowed to take performance-enhancing drugs, how dangerous those drugs are, and what constitutes a performance-enhancing drug in the first place, I'd like to talk about the security and economic issues surrounding doping in professional sports.

Drug testing is a security issue. Various sports federations around the world do their best to detect illegal doping, and players do their best to evade the tests.

Read More →

Google's Click-Fraud Crackdown

  • Bruce Schneier
  • Wired
  • July 13, 2006

Google's $6 billion-a-year advertising business is at risk because it can't be sure that anyone is looking at its ads. The problem is called click fraud, and it comes in two basic flavors.

With network click fraud, you host Google AdSense advertisements on your own website. Google pays you every time someone clicks on its ad on your site.

Read More →

It's the Economy, Stupid

  • Bruce Schneier
  • Wired
  • June 29, 2006

Italian translation

I'm sitting in a conference room at Cambridge University, trying to simultaneously finish this article for Wired News and pay attention to the presenter onstage.

I'm in this awkward situation because 1) this article is due tomorrow, and 2) I'm attending the fifth Workshop on the Economics of Information Security, or WEIS: to my mind, the most interesting computer security conference of the year.

The idea that economics has anything to do with computer security is relatively new. Ross Anderson and I seem to have stumbled upon the idea independently.

Read More →

Make Vendors Liable for Bugs

  • Bruce Schneier
  • Wired
  • June 1, 2006

Have you ever been to a retail store and seen this sign on the register: "Your purchase free if you don't get a receipt"? You almost certainly didn't see it in an expensive or high-end store. You saw it in a convenience store, or a fast-food restaurant. Or maybe a liquor store.

Read More →

Sue Companies, Not Coders

  • Bruce Schneier
  • Wired
  • October 20, 2005

At a security conference last week, Howard Schmidt, the former White House cybersecurity adviser, took the bold step of arguing that software developers should be held personally accountable for the security of the code they write.

He's on the right track, but he's made a dangerous mistake. It's the software manufacturers that should be held liable, not the individual programmers. Getting this one right will result in more-secure software for everyone; getting it wrong will simply result in a lot of messy lawsuits.

Read More →

A Real Remedy for Phishers

  • Bruce Schneier
  • Wired
  • October 6, 2005

Last week California became the first state to enact a law specifically addressing phishing. Phishing, for those of you who have been away from the internet for the past few years, is when an attacker sends you an e-mail falsely claiming to be a legitimate business in order to trick you into giving away your account info -- passwords, mostly. When this is done by hacking DNS, it's called pharming.

Financial companies have until now avoided taking on phishers in a serious way, because it's cheaper and simpler to pay the costs of fraud.

Read More →

Information Security: How Liable Should Vendors Be?

  • Bruce Schneier
  • Computerworld
  • October 28, 2004

An update to this essay was published in ENISA Quarterly in January 2007.

Information insecurity is costing us billions. We pay for it in theft: information theft, financial theft. We pay for it in productivity loss, both when networks stop working and in the dozens of minor security inconveniences we all have to endure. We pay for it when we have to buy security products and services to reduce those other two losses.

Read More →

Security and Compliance

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2004

It's been said that all business-to-business sales are motivated by either fear or greed. Traditionally, security products and services have been a fear sell: fear of burglars, murderers, kidnappers, and -- more recently -- hackers. Despite repeated attempts by the computer security industry to position itself as a greed sell -- "better Internet security will make your company more profitable because you can better manage your risks" -- fear remains the primary motivator for the purchase of network security products and services.

The problem is that many security risks are not borne by the organization making the purchasing decision.

Read More →

Hacking the Business Climate for Network Security

  • Bruce Schneier
  • IEEE Computer
  • April 2004

Computer security is at a crossroads. It's failing, regularly, and with increasingly serious results. CEOs are starting to notice. When they finally get fed up, they'll demand improvements.

Read More →

Liability Changes Everything

  • Bruce Schneier
  • Heise Security
  • November 2003

German translation

Computer security is not a problem that technology can solve. Security solutions have a technological component, but security is fundamentally a people problem. Businesses approach security as they do any other business uncertainty: in terms of risk management. Organizations optimize their activities to minimize their cost-risk product, and understanding those motivations is key to understanding computer security today.

Read More →

Should Vendors be Liable for Their Software's Security Flaws?

  • Bruce Schneier
  • Network World
  • April 22, 2002

Network security is not a technological problem; it's a business problem. The only way to address it is to focus on business motivations. To improve the security of their products, companies -- both vendors and users -- must care; for companies to care, the problem must affect stock price. The way to make this happen is to start enforcing liabilities.

Read More →

Cyber Underwriters Lab?

  • Bruce Schneier
  • Communications of the ACM
  • April 2001

Underwriters Laboratories (UL) is an independent testing organization created in 1893, when William Henry Merrill was called in to find out why the Palace of Electricity at the Columbian Exposition in Chicago kept catching on fire (which is not the best way to tout the wonders of electricity). After making the exhibit safe, he realized he had a business model on his hands. Eventually, if your electrical equipment wasn't UL certified, you couldn't get insurance.

Today, UL rates all kinds of equipment, not just electrical.

Read More →
