Essays: 2012 Archives

Militarizing Cyberspace Will Do More Harm Than Good

  • Bruce Schneier
  • The Irish Times
  • November 29, 2012

We’re in the early years of a cyberwar arms race. It’s expensive, it’s destabilising and it threatens the very fabric of the internet we use every day. Cyberwar treaties, as imperfect as they might be, are the only way to contain the threat.

If you read the press and listen to government leaders, we’re already in the middle of a cyberwar. By any normal definition of the word ‘war’, this is ridiculous. But the definition of cyberwar has been expanded to include government-sponsored espionage, potential terrorist attacks in cyberspace, large-scale criminal fraud and even hacker kids attacking government networks and critical infrastructure. This definition is being pushed by the military and government contractors, both of which are gaining power and making money from cyberwar fears…

When It Comes to Security, We're Back to Feudalism

  • Bruce Schneier
  • Wired
  • November 26, 2012

Some of us have pledged our allegiance to Google: We have Gmail accounts, we use Google Calendar and Google Docs, and we have Android phones. Others have pledged allegiance to Apple: We have Macintosh laptops, iPhones, and iPads; and we let iCloud automatically synchronize and back up everything. Still others of us let Microsoft do it all. Or we buy our music and e-books from Amazon, which keeps records of what we own and allows downloading to a Kindle, computer, or phone. Some of us have pretty much abandoned e-mail altogether … for Facebook…

Lance Armstrong and the Prisoners' Dilemma of Doping in Professional Sports

  • Bruce Schneier
  • Wired
  • October 26, 2012

Doping in professional sports is back in the news, as the overwhelming evidence against Lance Armstrong led to his being stripped of his seven Tour de France titles and more. But instead of focusing on the issues of performance-enhancing drugs and whether professional athletes should be allowed to take them, I’d like to talk about the security and economic aspects of the issue.

Because drug testing is a security issue. Various sports federations around the world do their best to detect illegal doping, and players do their best to evade the tests. It’s a classic security arms race: Improvements in detection technologies lead to improvements in drug detection evasion, which in turn spur the development of better detection capabilities. Right now, it seems drugs are winning; in some places, these drug tests are described as “intelligence tests”—if you can’t get around them, you don’t deserve to play…
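A minimal sketch, in Python, of the prisoners’ dilemma the title refers to. The payoff values (BASELINE, EDGE, COST) are hypothetical numbers chosen only to illustrate the structure of the argument; they are not from the essay.

    # Illustrative sketch only: hypothetical payoffs for the doping dilemma.
    # Each athlete chooses to stay "clean" or to "dope"; doping gives a
    # competitive edge but carries a cost (health risk, chance of being caught).

    BASELINE = 5  # payoff when both athletes compete clean
    EDGE = 3      # advantage gained by doping
    COST = 2      # expected cost of doping (health, detection risk)

    def payoff(my_choice: str, rival_choice: str) -> int:
        """Return my payoff given both athletes' choices ('clean' or 'dope')."""
        score = BASELINE
        if my_choice == "dope":
            score += EDGE - COST   # I gain an edge but pay the cost
        if rival_choice == "dope":
            score -= EDGE          # the rival's edge comes at my expense
        return score

    for mine in ("clean", "dope"):
        for theirs in ("clean", "dope"):
            print(f"I {mine:5s} / rival {theirs:5s}: my payoff = {payoff(mine, theirs)}")

    # Whatever the rival does, doping raises my payoff by EDGE - COST = 1,
    # so each athlete dopes -- yet both end up worse off (3 each) than if
    # both had stayed clean (5 each). That is the dilemma.

With these illustrative numbers, doping is the dominant strategy for each athlete individually, even though both would be better off if neither doped, which is why testing regimes (and the arms race around them) exist at all.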

Fear Pays the Bills, but Accounts Must Be Settled

  • Bruce Schneier
  • New York Times Room for Debate
  • October 19, 2012

A lot of the debate around President Obama’s cybersecurity initiative centers on how much of a burden it would be on industry, and how that should be financed. As important as that debate is, it obscures some of the larger issues surrounding cyberwar, cyberterrorism, and cybersecurity in general.

It’s difficult to have any serious policy discussion amid the fearmongering. Secretary Panetta’s recent comments are just the latest; search the Internet for “cyber 9/11,” “cyber Pearl Harbor,” “cyber Katrina,” or—my favorite—“cyber Armageddon.”

There’s an enormous amount of money and power that results from pushing cyberwar and cyberterrorism: power within the military, the Department of Homeland Security, and the Justice Department; and lucrative government contracts supporting those organizations. As long as cyber remains a prefix that scares, it’ll continue to be used as a bugaboo…

The Importance of Security Engineering

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2012

View or Download in PDF Format

In May, neuroscientist and popular author Sam Harris and I debated the issue of profiling Muslims at airport security. We each wrote essays, then went back and forth on the issue. I don’t recommend reading the entire discussion; we spent 14,000 words talking past each other. But what’s interesting is how our debate illustrates the differences between a security engineer and an intelligent layman. Harris was uninterested in the detailed analysis required to understand a security system and unwilling to accept that security engineering is a specialized discipline with a body of knowledge and relevant expertise. He trusted his intuition…

Drawing the Wrong Lessons from Horrific Events

  • Bruce Schneier
  • CNN
  • July 31, 2012

Horrific events, such as the massacre in Aurora, can be catalysts for social and political change. Sometimes it seems that they’re the only catalyst; recall how drastically our policies toward terrorism changed after 9/11 despite how moribund they were before.

The problem is that fear can cloud our reasoning, causing us to overreact and to overly focus on the specifics. The key is to steer our desire for change in the right direction, even in that time of fear.

Our brains aren’t very good at probability and risk analysis. We tend to exaggerate spectacular, strange and rare events, and downplay ordinary, familiar and common ones. We think rare risks are more common than they are. We fear them more than probability indicates we should…

So You Want to Be a Security Expert

  • Bruce Schneier
  • Krebs on Security
  • July 12, 2012

This essay originally appeared as part of a series of advice columns on how to break into the field of security.

I regularly receive e-mail from people who want advice on how to learn more about computer security, either as a course of study in college or as an IT person considering it as a career choice.

First, know that there are many subspecialties in computer security. You can be an expert in keeping systems from being hacked, or in creating unhackable software. You can be an expert in finding security problems in software, or in networks. You can be an expert in viruses, or policies, or cryptography. There are many, many opportunities for many different skill sets. You don’t have to be a coder to be a security expert…

Securing Medical Research: A Cybersecurity Point of View

  • Bruce Schneier
  • Science
  • June 22, 2012

ABSTRACT: The problem of securing biological research data is a difficult and complicated one. Our ability to secure data on computers is not robust enough to ensure the security of existing data sets. Lessons from cryptography illustrate that neither secrecy measures, such as deleting technical details, nor national solutions, such as export controls, will work.


Science and Nature have each published papers on the H5N1 virus in humans after considerable debate about whether the research results in those papers could help terrorists create a bioweapon (…

Debate Club: An International Cyberwar Treaty Is the Only Way to Stem the Threat

  • Bruce Schneier
  • U.S. News
  • June 8, 2012

We’re in the early years of a cyberwar arms race. It’s expensive, it’s destabilizing, and it threatens the very fabric of the Internet we use every day. Cyberwar treaties, as imperfect as they might be, are the only way to contain the threat.

If you read the press and listen to government leaders, we’re already in the middle of a cyberwar. By any normal definition of the word “war,” this is ridiculous. But the definition of cyberwar has been expanded to include government-sponsored espionage, potential terrorist attacks in cyberspace, large-scale criminal fraud, and even hacker kids attacking government networks and critical infrastructure. This definition is being pushed both by the military and by government contractors, who are gaining power and making money on cyberwar fear…

The Vulnerabilities Market and the Future of Security

  • Bruce Schneier
  • Forbes
  • May 30, 2012

Brazilian Portuguese translation

Recently, there have been several articles about the new market in zero-day exploits: new and unpatched computer vulnerabilities. It’s not just software companies, who sometimes pay bounties to researchers who alert them to security vulnerabilities so they can fix them. And it’s not only criminal organizations, who pay for vulnerabilities they can exploit. Now there are governments, and companies who sell to governments, who buy vulnerabilities with the intent of keeping them secret so they can exploit them.

To Profile or Not to Profile? (Part 2)

A Debate between Sam Harris and Bruce Schneier

  • Sam Harris and Bruce Schneier
  • Sam Harris's Blog
  • May 25, 2012

Return to Part 1

A profile that encompasses “anyone who could conceivably be Muslim” needs to include almost everyone. Anything less and you’re missing known Muslim airplane terrorist wannabes.

SH: It includes a lot of people, but I wouldn’t say almost everyone. In fact, I just flew out of San Jose this morning and witnessed a performance of security theater so masochistic and absurd that, given our ongoing discussion, it seemed too good to be true. If I hadn’t known better, I would have thought I was being punked by the TSA.

The line at the back-scatter X-ray machines was moving so slowly that I opted for a full-body pat down. What was the hang-up? There were …

To Profile or Not to Profile? (Part 1)

A Debate between Sam Harris and Bruce Schneier

  • Sam Harris and Bruce Schneier
  • Sam Harris's Blog
  • May 25, 2012

Introduction by Sam Harris

I recently wrote two articles in defense of “profiling” in the context of airline security (1 & 2), arguing that the TSA should stop doing secondary screenings of people who stand no reasonable chance of being Muslim jihadists. I knew this proposal would be controversial, but I seriously underestimated how inflamed the response would be. Had I worked for a newspaper or a university, I could well have lost my job over it.

One thing that united many of my critics was their admiration for Bruce Schneier. Bruce is an expert on security who has written for …

The Trouble with Airport Profiling

  • Bruce Schneier
  • Forbes
  • May 9, 2012

Why do otherwise rational people think it’s a good idea to profile people at airports? Recently, neuroscientist and best-selling author Sam Harris related a story of an elderly couple being given the twice-over by the TSA, pointed out how these two were obviously not a threat, and recommended that the TSA focus on the actual threat: “Muslims, or anyone who looks like he or she could conceivably be Muslim.”

This is a bad idea. It doesn’t make us any safer—and it actually puts us all at risk.

The right way to look at security is in terms of cost-benefit trade-offs. If adding profiling to airport checkpoints allowed us to detect more threats at a lower cost, then we should implement it. If it didn’t, we’d be foolish to do so. Sometimes profiling works. Consider a sheep in a meadow, happily munching on grass. When he spies a wolf, he’s going to judge that individual wolf based on a bunch of assumptions related to the past behavior of its species. In short, that sheep is going to profile…and then run away. This makes perfect sense, and is why evolution produced sheep—and other animals—that …

Economist Debates: Airport Security

  • Bruce Schneier
  • The Economist
  • March 20, 2012

These essays are part of a debate with Kip Hawley, the former Administrator of the TSA. For the full debate, see The Economist’s website.

German translation

Opening Remarks

Let us start with the obvious: in the entire decade or so of airport security since the attacks on America on September 11th 2001, the Transportation Security Administration (TSA) has not foiled a single terrorist plot or caught a single terrorist. Its own “Top 10 Good Catches of 2011” does not have a single terrorist on the list. The “good catches” are forbidden items carried by mostly forgetful, and entirely innocent, people—the sorts of guns and knives that would have been just as easily caught by pre-9/11 screening procedures. Not that the TSA is expert at that; it regularly …

How Changing Technology Affects Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2012

View or Download in PDF Format

This essay was republished in Wired on February 24, 2014.

Security is a tradeoff, a balancing act between attacker and defender. Unfortunately, that balance is never static. Changes in technology affect both sides. Society uses new technologies to decrease what I call the scope of defection—what attackers can get away with—and attackers use new technologies to increase it. What’s interesting is the difference between how the two groups incorporate new technologies.

Changes in security systems can be slow. Society has to implement any new security technology as a group, which implies agreement and coordination and—in some instances—a lengthy bureaucratic procurement process. Meanwhile, an attacker can just use the new technology. For example, at the end of the horse-and-buggy era, it was easier for a bank robber to use his new motorcar as a getaway vehicle than it was for a town’s police department to decide it needed a police car, get the budget to buy one, choose which one to buy, buy it, and then develop training and policies for it. And if only one police department did this, the bank robber could just move to another town. Defectors are more agile and adaptable, making them much better at being early adopters of new technology…

High-Tech Cheats in a World of Trust

  • Bruce Schneier
  • New Scientist
  • February 27, 2012

I can put my cash card into an ATM anywhere in the world and take out a fistful of local currency, while the corresponding amount is debited from my bank account at home. I don’t even think twice: regardless of the country, I trust that the system will work.

The whole world runs on trust. We trust that people on the street won’t rob us, that the bank we deposited money in last month returns it this month, that the justice system punishes the guilty and exonerates the innocent. We trust the food we buy won’t poison us, and the people we let in to fix our boiler won’t murder us…

The Big Idea: Bruce Schneier

  • Bruce Schneier
  • Whatever
  • February 16, 2012

My big idea is a big question. Every cooperative system contains parasites. How do we ensure that society’s parasites don’t destroy society’s systems?

It’s all about trust, really. Not the intimate trust we have in our close friends and relatives, but the more impersonal trust we have in the various people and systems we interact with in society. I trust airline pilots, hotel clerks, ATMs, restaurant kitchens, and the company that built the computer I’m writing this short essay on. I trust that they have acted and will act in the ways I expect them to. This type of trust is more a matter of consistency or predictability than of intimacy…
