Essays: 2009 Archives

Is Aviation Security Mostly for Show?

  • Bruce Schneier
  • CNN
  • December 29, 2009

Last week’s attempted terror attack on an airplane heading from Amsterdam to Detroit has given rise to a bunch of familiar questions.

How did the explosives get past security screening? What steps could be taken to avert similar attacks? Why wasn’t there an air marshal on the flight? And, predictably, government officials have rushed to institute new safety measures to close holes in the system exposed by the incident.

Reviewing what happened is important, but a lot of the discussion is off-base, a reflection of the fundamentally wrong conception most people have of terrorism and how to combat it…

Cold War Encryption is Unrealistic in Today's Trenches

  • Bruce Schneier
  • The Japan Times
  • December 23, 2009

Sometimes mediocre encryption is better than strong encryption, and sometimes no encryption is better still.

The Wall Street Journal reported this week that Iraqi, and possibly also Afghan, militants are using commercial software to eavesdrop on U.S. Predators, other unmanned aerial vehicles, or UAVs, and even piloted planes. The systems weren’t “hacked”—the insurgents can’t control them—but because the downlink is unencrypted, they can watch the same video stream as the coalition troops on the ground.

The naive reaction is to ridicule the military. Encryption is …

Virus and Protocol Scares Happen Every Day—But Don't Let Them Worry You

An SSL security flaw got bloggers hot and bothered, but it's the vendors who need to take action

  • Bruce Schneier
  • The Guardian
  • December 9, 2009

Last month, researchers found a security flaw in the SSL protocol, which is used to protect sensitive web data. The protocol is used for online commerce, webmail, and social networking sites. Basically, hackers could hijack an SSL session and execute commands without the knowledge of either the client or the server. The list of affected products is enormous.

If this sounds serious to you, you’re right. It is serious. Given that, what should you do now? Should you not use SSL until it’s fixed, and only pay for internet purchases over the phone? Should you download some kind of protection? Should you take some other remedial action? What?…

News Media Strategies for Survival for Journalists

  • Bruce Schneier
  • Twin Cities Daily Planet
  • November 14, 2009

Those of us living through the Internet-caused revolution in journalism can’t see what’s going to come out the other side: how readers will interact with journalism, what the sources of journalism will be, how journalists will make money.  All we do know is that mass-market journalism is hurting, badly, and may not survive.  And that we have no idea how to thrive in this new world of digital media.

I have five pieces of advice for those trying to survive and wanting to thrive, based both on my experiences as a successful Internet pundit and blogger and on my observations of others, successful and unsuccessful. I’ll talk about writing, but everything I say applies to audio and video as well…

Reputation is Everything in IT Security

  • Bruce Schneier
  • The Guardian
  • November 11, 2009

In the past, our relationship with our computers was technical. We cared what CPU they had and what software they ran. We understood our networks and how they worked. We were experts, or we depended on someone else for expertise. And security was part of that expertise.

This is changing. We access our email via the web, from any computer or from our phones. We use Facebook, Google Docs, even our corporate networks, regardless of hardware or network. We, especially the younger of us, no longer care about the technical details. Computing is infrastructure; it’s a commodity. It’s less about products and more about services; we simply expect it to work, like telephone service or electricity or a transportation network…

"Zero Tolerance" Really Means Zero Discretion

  • Bruce Schneier
  • MPR NewsQ
  • November 4, 2009

Recent stories have documented the ridiculous effects of zero-tolerance weapons policies in a Delaware school district: a first-grader expelled for taking a camping utensil to school, a 13-year-old expelled after another student dropped a pocketknife in his lap, and a seventh-grader expelled for cutting paper with a utility knife for a class project. Where’s the common sense? the editorials cry.

These so-called zero-tolerance policies are actually zero-discretion policies. They’re policies that must be followed, no situational discretion allowed. We encounter them whenever we go through airport security: no liquids, gels or aerosols. Some workplaces have them for sexual harassment incidents; in some sports a banned substance found in a urine sample means suspension, even if it’s for a real medical condition. Judges have zero discretion when faced with mandatory sentencing laws: three strikes for drug offences and you go to jail, mandatory sentencing for statutory rape (underage sex), etc. A national restaurant chain won’t serve hamburgers rare, even if you offer to sign a waiver. Whenever you hear “that’s the rule, and I can’t do anything about it”—and they’re not lying to get rid of you—you’re butting against a zero discretion policy…

Nature's Fears Extend to Online Behavior

  • Bruce Schneier
  • The Japan Times
  • November 3, 2009

It’s hard work being prey. Watch the birds at a feeder. They’re constantly on alert, and will fly away from food—from easy nutrition—at the slightest movement or sound. Given that I’ve never, ever seen a bird plucked from a feeder by a predator, it seems like a whole lot of wasted effort against a small threat.

Assessing and reacting to risk is one of the most important things a living creature has to deal with. The amygdala, an ancient part of the brain that first evolved in primitive fishes, has that job. It’s what’s responsible for the fight-or-flight reflex. Adrenaline in the bloodstream, increased heart rate, increased muscle tension, sweaty palms; that’s the amygdala in action. You notice it when you fear a dark alley, have vague fears of terrorism, or worry about predators stalking your children on the Internet. And it works fast, faster than consciousness: show someone a snake and their amygdala will react before their conscious brain registers that they’re looking at a snake…

Is Antivirus Dead?

  • Bruce Schneier
  • Information Security
  • November 2009

This essay appeared as the second half of a point/counterpoint with Marcus Ranum. Marcus’s half is here.

Security is never black and white. If someone asks, “for best security, should I do A or B?” the answer almost invariably is both. But security is always a trade-off. Often it’s impossible to do both A and B—there’s no time to do both, it’s too expensive to do both, or whatever—and you have to choose. In that case, you look at A and B and you make your best choice. But it’s almost always more secure to do both.

Yes, antivirus programs have been getting less effective…

Beyond Security Theater

We need to move beyond security measures that look good on television to those that actually work, argues Bruce Schneier.

  • Bruce Schneier
  • New Internationalist
  • November 2009

Terrorism is rare, far rarer than many people think. It’s rare because very few people want to commit acts of terrorism, and executing a terrorist plot is much harder than television makes it appear. The best defences against terrorism are largely invisible: investigation, intelligence, and emergency response. But even these are less effective at keeping us safe than our social and political policies, both at home and abroad. However, our elected leaders don’t think this way: they are far more likely to implement security theater against movie-plot threats…

Why Framing Your Enemies Is Now Virtually Child's Play

In the eternal arms race between bad guys and those who police them, automated systems can have perverse effects

  • Bruce Schneier
  • The Guardian
  • October 15, 2009

A few years ago, a company began to sell a liquid with identification codes suspended in it. The idea was that you would paint it on your stuff as proof of ownership. I commented that I would paint it on someone else’s stuff, then call the police.

I was reminded of this recently when a group of Israeli scientists demonstrated that it’s possible to fabricate DNA evidence. So now, instead of leaving your own DNA at a crime scene, you can leave fabricated DNA. And it isn’t even necessary to fabricate. In Charlie Stross’s novel Halting State, the bad guys foul a crime scene by blowing around the contents of a vacuum cleaner bag, containing the DNA of dozens, if not hundreds, of people…

The Difficulty of Un-Authentication

  • Bruce Schneier
  • Threatpost
  • September 28, 2009

In computer security, a lot of effort is spent on the authentication problem. Whether it’s passwords, secure tokens, secret questions, image mnemonics, or something else, engineers are continually coming up with more complicated—and hopefully more secure—ways for you to prove you are who you say you are over the Internet.

This is important stuff, as anyone with an online bank account or remote corporate network knows. But a lot less thought and work have gone into the other end of the problem: how do you tell the system on the other end of the line that you’re no longer there? How do you unauthenticate yourself?…
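
One partial answer, common in practice, is an idle timeout: if the system has not seen you for a while, it assumes you have walked away and signs you out. A minimal Python sketch of that idea follows; the class and the timeout value are illustrative, not from the essay.

    import time

    # Toy sketch of idle-timeout un-authentication: expire a session after a
    # period of inactivity. The 15-minute value is arbitrary.
    IDLE_TIMEOUT_SECONDS = 15 * 60

    class Session:
        def __init__(self):
            self.last_activity = time.monotonic()

        def touch(self):
            """Call this on every authenticated request."""
            self.last_activity = time.monotonic()

        def is_expired(self):
            """True once the user has been silent longer than the timeout."""
            return time.monotonic() - self.last_activity > IDLE_TIMEOUT_SECONDS

Too short a timeout and users are constantly re-authenticating; too long and an abandoned terminal stays logged in.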

The Battle Is On Against Facebook and Co to Regain Control of Our Files

Our use of social networking, as well as iPhones and Kindles, relinquishes control of how we delete files -- we need that back

  • Bruce Schneier
  • The Guardian
  • September 9, 2009

File deletion is all about control. This used to not be an issue. Your data was on your computer, and you decided when and how to delete a file. You could use the delete function if you didn’t care about whether the file could be recovered or not, and a file erase program—I use BCWipe for Windows—if you wanted to ensure no one could ever recover the file.

As we move more of our data onto cloud computing platforms such as Gmail and Facebook, and closed proprietary platforms such as the Kindle and the iPhone, deleting data is much harder.

You have to trust that these companies will delete your data when you ask them to, but they’re …
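
For readers who have not used a file-erase tool, the core idea is simple: overwrite the file’s bytes on disk before removing its name. A simplified Python sketch follows; it is illustrative only, and real wipers such as BCWipe do considerably more (overwriting is also unreliable on SSDs and on journaling or copy-on-write filesystems).

    import os

    def wipe(path, passes=1):
        """Overwrite a file's contents with random bytes, then delete it.
        Simplified sketch; not a substitute for a real secure-erase tool."""
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(os.urandom(size))   # replace the data in place
                f.flush()
                os.fsync(f.fileno())        # force the overwrite onto the disk
        os.remove(path)                      # only then drop the directory entry

The cloud version of this problem is the essay’s point: when the bytes live on someone else’s servers, there is nothing comparable you can run yourself.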

Is Perfect Access Control Possible?

  • Bruce Schneier
  • Information Security
  • September 2009

This essay appeared as the second half of a point/counterpoint with Marcus Ranum. Marcus’s half is here.

Access control is difficult in an organizational setting. On one hand, every employee needs enough access to do his job. On the other hand, every time you give an employee more access, there’s more risk: he could abuse that access, or lose information he has access to, or be socially engineered into giving that access to a malfeasant. So a smart, risk-conscious organization will give each employee the exact level of access he needs to do his job, and no more…
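
A minimal sketch of the least-privilege idea described above, with hypothetical roles and permissions: each role carries an explicit allow-list, and anything not granted is denied by default.

    # Hypothetical roles and permissions, purely to illustrate least privilege:
    # grant each role exactly what the job needs, deny everything else.
    ROLE_PERMISSIONS = {
        "payroll_clerk": {"read_timesheets", "run_payroll"},
        "hr_manager":    {"read_timesheets", "read_personnel_files"},
    }

    def is_allowed(role, action):
        """Default-deny: an action is permitted only if the role explicitly has it."""
        return action in ROLE_PERMISSIONS.get(role, set())

    print(is_allowed("payroll_clerk", "run_payroll"))           # True
    print(is_allowed("payroll_clerk", "read_personnel_files"))  # False: not needed for the job

Deciding what that exact level of access is, of course, is the hard part the essay’s title alludes to.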

Offhand but On Record

More and more people are using computers to chat with each other, but there's no such thing as a passing conversation on the Web

  • Bruce Schneier
  • The Japan Times
  • August 19, 2009

Facebook recently made changes to its service agreement in order to make members’ data more accessible to other computer users. Amuse, Inc. announced last week that hackers stole credit-card information from about 150,000 clients. Hackers broke into the social network Twitter’s system and stole documents.

Your online data is not private. It may seem private, but it’s not. Take e-mail, for example. You might be the only person who knows your e-mail password, but you’re not the only person who can read your e-mail. Your e-mail provider can read it too—along with anyone he gives access to. That can include any backbone provider who happened to route that mail from the sender to you. In addition, if you read your e-mail from work, various people at your company have access to it, too. And, if they have taps at the correct points, so can the police, the U.S. National Security Agency, and any other well-funded national intelligence organization—along with any hackers or criminals sufficiently skilled to break into one of these sites…

Lockpicking and the Internet

  • Bruce Schneier
  • Dark Reading
  • August 10, 2009

Physical locks aren’t very good. They keep the honest out, but any burglar worth his salt can pick the common door lock pretty quickly.

It used to be that most people didn’t know this. Sure, we all watched television criminals and private detectives pick locks with an ease only found on television and thought it realistic, but somehow we still held onto the belief that our own locks kept us safe from intruders.

The Internet changed that.

First was the MIT Guide to Lockpicking (PDF), written by the late Bob (“Ted the Tool”) Baldwin. Then came Matt Blaze’s 2003 …

The Value of Self-Enforcing Protocols

  • Bruce Schneier
  • Threatpost
  • August 10, 2009

There are several ways two people can divide a piece of cake in half. One way is to find someone impartial to do it for them. This works, but it requires another person. Another way is for one person to divide the piece, and the other person to complain (to the police, a judge, or his parents) if he doesn’t think it’s fair. This also works, but still requires another person – at least to resolve disputes. A third way is for one person to do the dividing, and for the other person to choose the half he wants.

That third way, known by kids, pot smokers, and everyone else who needs to divide something up quickly and fairly, is called cut-and-choose. People use it because it’s a self-enforcing protocol: a protocol designed so that neither party can cheat…
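
The self-enforcing property is easy to see in a toy model. In the sketch below (illustrative values only), an honest cut guarantees the cutter half, while an uneven cut only hands the chooser the larger piece.

    # Toy model of cut-and-choose: one party cuts, the other picks.
    # Cheating on the cut only hurts the cutter, so no referee is needed.
    def choose(pieces):
        """The chooser always takes the piece they value most."""
        return max(pieces)

    def cutters_share(pieces):
        """The cutter is left with whatever the chooser did not take."""
        return min(pieces)

    fair_cut = (0.5, 0.5)     # honest cutter keeps 0.5
    unfair_cut = (0.7, 0.3)   # dishonest cutter keeps only 0.3

    for cut in (fair_cut, unfair_cut):
        print(f"cut={cut}: chooser gets {choose(cut)}, cutter keeps {cutters_share(cut)}")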

People Understand Risks—But Do Security Staff Understand People?

Natural human risk intuition deserves respect -- even when it doesn't help the security team

  • Bruce Schneier
  • The Guardian
  • August 5, 2009

This essay also appeared in The Sydney Morning Herald and The Age.

People have a natural intuition about risk, and in many ways it’s very good. It fails at times due to a variety of cognitive biases, but for normal risks that people regularly encounter, it works surprisingly well: often better than we give it credit for.

This struck me as I listened to yet another conference presenter complaining about security awareness training. He was talking about the difficulty of getting employees at his company to actually follow his security policies: encrypting data on memory sticks, not sharing passwords, not logging in from untrusted wireless networks. “We have to make people understand the risks,” he said…

Technology Shouldn't Give Big Brother a Head Start

  • Bruce Schneier
  • MPR NewsQ
  • July 31, 2009

China is the world’s most successful Internet censor. While the Great Firewall of China isn’t perfect, it effectively limits information flowing in and out of the country. But now the Chinese government is taking things one step further.

Under a requirement taking effect soon, every computer sold in China will have to contain the Green Dam Youth Escort software package. Ostensibly a pornography filter, it is government spyware that will watch every citizen on the Internet.

Green Dam has many uses. It can police a list of forbidden Web sites. It can monitor a user’s reading habits. It can even enlist the computer in some massive botnet attack, as part of a hypothetical future cyberwar…

Protect Your Laptop Data From Everyone, Even Yourself

  • Bruce Schneier
  • Wired
  • July 15, 2009

Last year, I wrote about the increasing propensity for governments, including the U.S. and Great Britain, to search the contents of people’s laptops at customs. What we know is still based on anecdote, as no country has clarified the rules about what their customs officers are and are not allowed to do, and what rights people have.

Companies and individuals have dealt with this problem in several ways, from keeping sensitive data off laptops traveling internationally, to storing the data—encrypted, of course—on websites and then downloading it at the destination. I have never liked either solution. I do a lot of work on the road, and need to carry all sorts of data with me all the time. It’s a lot of data, and downloading it can take a long time. Also, I like to work on long international flights…

Facebook Should Compete on Privacy, Not Hide It Away

  • Bruce Schneier
  • The Guardian
  • July 15, 2009

Reassuring people about privacy makes them more, not less, concerned. It’s called “privacy salience”, and Leslie John, Alessandro Acquisti, and George Loewenstein—all at Carnegie Mellon University—demonstrated this in a series of clever experiments. In one, subjects completed an online survey consisting of a series of questions about their academic behaviour—“Have you ever cheated on an exam?” for example. Half of the subjects were first required to sign a consent warning—designed to make privacy concerns more salient—while the other half did not. Also, subjects were randomly assigned to receive either a privacy confidentiality assurance, or no such assurance. When the privacy concern was made salient (through the consent warning), people reacted negatively to the subsequent confidentiality assurance and were less likely to reveal personal information…

So-called Cyberattack Was Overblown

  • Bruce Schneier
  • MPR NewsQ
  • July 13, 2009

To hear the media tell it, the United States suffered a major cyberattack last week. Stories were everywhere. “Cyber Blitz hits U.S., Korea” was the headline in Thursday’s Wall Street Journal. North Korea was blamed.

Where were you when North Korea attacked America? Did you feel the fury of North Korea’s armies? Were you fearful for your country? Or did your resolve strengthen, knowing that we would defend our homeland bravely and valiantly?

My guess is that you didn’t even notice, that – if you didn’t open a newspaper or read a news website – you had no idea anything was happening. Sure, a few government websites were knocked out, but that’s not alarming or even uncommon. Other government websites were attacked but defended themselves, the sort of thing that happens all the time. If this is what an international cyberattack looks like, it hardly seems worth worrying about at all…

Security, Group Size, and the Human Brain

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2009

If the size of your company grows past 150 people, it’s time to get name badges. It’s not that larger groups are somehow less secure; it’s just that 150 is the cognitive limit to the number of people a human brain can maintain a coherent social relationship with.

Primatologist Robin Dunbar derived this number by comparing the volume of the neocortex—the “thinking” part of the mammalian brain—with the size of primate social groups. By analyzing data from 38 primate genera and extrapolating to the human neocortex size, he predicted a human “mean group size” of roughly 150…
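
The extrapolation itself is a one-line calculation. The sketch below uses the commonly cited coefficients from Dunbar’s 1992 regression and an approximate human neocortex ratio of 4.1; the exact figures should be treated as approximate, but they show where the number near 150 comes from.

    import math

    # Commonly cited regression from Dunbar (1992):
    #   log10(group size) = 0.093 + 3.389 * log10(neocortex ratio)
    INTERCEPT, SLOPE = 0.093, 3.389

    def predicted_group_size(neocortex_ratio):
        return 10 ** (INTERCEPT + SLOPE * math.log10(neocortex_ratio))

    print(round(predicted_group_size(4.1)))   # roughly 148 for humans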

Clear Common Sense for Takeoff: How the TSA Can Make Airport Security Work for Passengers Again

  • Bruce Schneier
  • New York Daily News
  • June 24, 2009

It’s been months since the Transportation Security Administration has had a permanent director. If, during the job interview (no, I didn’t get one), President Obama asked me how I’d fix airport security in one sentence, I would reply: “Get rid of the photo ID check, and return passenger screening to pre-9/11 levels.”

Okay, that’s a joke. While showing ID, taking your shoes off and throwing away your water bottles isn’t making us much safer, I don’t expect the Obama administration to roll back those security measures anytime soon. Airport security is more about CYA than anything else: defending against what the terrorists did last time…

Raising the Cost of Paperwork Errors Will Improve Accuracy

  • Bruce Schneier
  • The Guardian
  • June 24, 2009

It’s a sad, horrific story. Homeowner returns to find his house demolished. The demolition company was hired legitimately but there was a mistake and it demolished the wrong house. The demolition company relied on GPS co-ordinates, but requiring street addresses isn’t a solution. A typo in the address is just as likely, and it would have demolished the house just as quickly. The problem is less how the demolishers knew which house to knock down, and more how they confirmed that knowledge. They trusted the paperwork, and the paperwork was wrong. Informality works when everybody knows everybody else. When merchants and customers know each other, government officials and citizens know each other, and people know their neighbours, people know what’s going on. In that sort of milieu, if something goes wrong, people notice…

How Science Fiction Writers Can Help, or Hurt, Homeland Security

  • Bruce Schneier
  • Wired
  • June 18, 2009

A couple of years ago, the Department of Homeland Security hired a bunch of science fiction writers to come in for a day and think of ways terrorists could attack America. If our inability to prevent 9/11 marked a failure of imagination, as some said at the time, then who better than science fiction writers to inject a little imagination into counterterrorism planning?

I discounted the exercise at the time, calling it “embarrassing.” I never thought that 9/11 was a failure of imagination. I thought, and still think, that 9/11 was primarily a confluence of three things: the dual failure of centralized coordination and local control within the FBI, and some lucky breaks on the part of the attackers. More imagination leads to more …

Be Careful When You Come to Put Your Trust in the Clouds

Cloud computing may represent the future of computing, but users still need to be careful about who is looking after their data

  • Bruce Schneier
  • The Guardian
  • June 4, 2009

This year’s overhyped IT concept is cloud computing. Also called software as a service (SaaS), cloud computing is when you run software over the internet and access it via a browser. The salesforce.com customer management software is an example of this. So is Google Docs. If you believe the hype, cloud computing is the future.

But, hype aside, cloud computing is nothing new. It’s the modern version of the timesharing model from the 1960s, which was eventually killed by the rise of the personal computer. It’s what Hotmail and Gmail have been doing all these years, and it’s social networking sites, remote backup companies, and remote email filtering companies such as MessageLabs. Any IT outsourcing – network infrastructure, security monitoring, remote hosting – is a form of cloud computing…

Coordinate, But Distribute Responsibility

  • Bruce Schneier
  • NYTimes.com
  • May 29, 2009

This essay appeared as part of a round table about Obama’s cybersecurity speech on The New York Times’s Room for Debate blog.

I am optimistic about President Obama’s new cybersecurity policy and the appointment of a new “cybersecurity coordinator,” though much depends on the details. What we do know is that the threats are real, from identity theft to Chinese hacking to cyberwar.

His principles were all welcome—securing government networks, coordinating responses, working to secure the infrastructure in private hands (the power grid, the communications networks, and so on), although I think he’s …

We Shouldn't Poison Our Minds with Fear of Bioterrorism

  • Bruce Schneier
  • The Guardian
  • May 14, 2009

Terrorists attacking our food supply is a nightmare scenario that has been given new life during the recent swine flu outbreak. Although it seems easy to do, understanding why it hasn’t happened is important. GR Dalziel, at the Nanyang Technological University in Singapore, has written a report chronicling every confirmed case of malicious food contamination in the world since 1950: 365 cases in all, plus 126 additional unconfirmed cases. What he found demonstrates the reality of terrorist food attacks.

It turns out 72% of the food poisonings occurred at the end of the food supply chain – at home – typically by a friend, relative, neighbour, or co-worker trying to kill or injure a specific person. A characteristic example is Heather Mook of York, who in 2007 tried to kill her husband by putting rat poison in his spaghetti…

Should We Have an Expectation of Online Privacy?

  • Bruce Schneier
  • Information Security
  • May 2009

This essay appeared as the second half of a point/counterpoint with Marcus Ranum. Marcus’s half is here.

If your data is online, it is not private. Oh, maybe it seems private. Certainly, only you have access to your e-mail. Well, you and your ISP. And the sender’s ISP. And any backbone provider who happens to route that mail from the sender to you. And, if you read your personal mail from work, your company. And, if they have taps at the correct points, the NSA and any other sufficiently well-funded government intelligence organization—domestic and international…

Do You Know Where Your Data Are?

  • Bruce Schneier
  • The Wall Street Journal
  • April 28, 2009

Do you know what your data did last night? Almost none of the more than 27 million people who took the RealAge quiz realized that their personal health data was being used by drug companies to develop targeted e-mail marketing campaigns.

There’s a basic consumer protection principle at work here, and it’s the concept of “unfair and deceptive” trade practices. Basically, a company shouldn’t be able to say one thing and do another: sell used goods as new, lie on ingredients lists, advertise prices that aren’t generally available, claim features that don’t exist, and so on…

How the Great Conficker Panic Hacked into Human Credulity

  • Bruce Schneier
  • The Guardian
  • April 23, 2009

This essay also appeared in the Gulf Times.

Conficker’s April Fool’s joke—the huge, menacing build-up and then nothing—is a good case study on how we think about risks, one whose lessons are applicable far outside computer security. Generally, our brains aren’t very good at probability and risk analysis. We tend to use cognitive shortcuts instead of thoughtful analysis. This worked fine for the simple risks we encountered for most of our species’s existence, but it’s less effective against the complex risks society forces us to face today.

We tend to judge the probability of something happening on how easily we can bring examples to mind. It’s why people tend to buy earthquake insurance after an earthquake, when the risk is lowest. It’s why those of us who have been the victims of a crime tend to fear crime more than those who haven’t. And it’s why we fear a repeat of 9/11 more than other types of terrorism…

An Enterprising Criminal Has Spotted a Gap in the Market

  • Bruce Schneier
  • The Guardian
  • April 2, 2009

Before his arrest, Tom Berge stole lead roof tiles from several buildings in south-east England, including the Honeywood Museum in Carshalton, the Croydon parish church, and the Sutton high school for girls. He then sold those tiles to scrap metal dealers.

As a security expert, I find this story interesting for two reasons. First, amid attempts to ban, or at least censor, Google Earth, lest it help the terrorists, here is an actual crime that relied on the service: Berge needed Google Earth for reconnaissance.

But more interesting is the discrepancy between the value of the lead tiles to the original owner and to the thief. The Sutton school had to spend £10,000 to buy new lead tiles; the Croydon Church had to repair extensive water damage after the theft. But Berge only received £700 a tonne from London scrap metal dealers…

Who Should Be in Charge of Cybersecurity?

  • Bruce Schneier
  • The Wall Street Journal
  • March 31, 2009

U.S. government cybersecurity is an insecure mess, and fixing it is going to take considerable attention and resources. Trying to make sense of this, President Barack Obama ordered a 60-day review of government cybersecurity initiatives. Meanwhile, the U.S. House Subcommittee on Emerging Threats, Cybersecurity, Science and Technology is holding hearings on the same topic.

One of the areas of contention is who should be in charge. The FBI, DHS and DoD—specifically, the NSA—all have interests here. Earlier this month, Rod Beckström resigned from his position…

It’s Time to Drop the "Expectation of Privacy" Test

  • Bruce Schneier
  • Wired
  • March 26, 2009

In the United States, the concept of “expectation of privacy” matters because it’s the constitutional test, based on the Fourth Amendment, that governs when and how the government can invade your privacy.

Based on the 1967 Katz v. United States Supreme Court decision, this test actually has two parts. First, the government’s action can’t contravene an individual’s subjective expectation of privacy; and second, that expectation of privacy must be one that society in general recognizes as reasonable. That second part isn’t based on anything like polling data; it is more of a normative idea of what level of privacy people should be allowed to expect, given the competing importance of personal privacy on one hand and the government’s interest in public safety on the other…

Blaming The User Is Easy—But It's Better to Bypass Them Altogether

  • Bruce Schneier
  • The Guardian
  • March 12, 2009

Blaming the victim is common in IT: users are to blame because they don’t patch their systems, choose lousy passwords, fall for phishing attacks, and so on. But, while users are, and will continue to be, a major source of security problems, focusing on them is an unhelpful way to think.

People regularly don’t do things they are supposed to: changing the oil in their cars, going to the dentist, replacing the batteries in their smoke detectors. Why? Because people learn from experience. If something is immediately harmful, eg, touching a hot stove or petting a live tiger, they quickly learn not to do it. But if someone skips an oil change, ignores a computer patch, or …

The Kindness of Strangers

  • Bruce Schneier
  • The Wall Street Journal
  • March 12, 2009

When I was growing up, children were commonly taught: “don’t talk to strangers.” Strangers might be bad, we were told, so it’s prudent to steer clear of them.

And yet most people are honest, kind, and generous, especially when someone asks them for help. If a small child is in trouble, the smartest thing he can do is find a nice-looking stranger and talk to him.

These two pieces of advice may seem to contradict each other, but they don’t. The difference is that in the second instance, the child is choosing which stranger to talk to. Given that the overwhelming majority of people will help, the child is likely to get help if he chooses a random stranger. But if a stranger comes up to a child and talks to him or her, it’s not a random choice. It’s more likely, although still unlikely, that the stranger is up to no good…

Privacy in the Age of Persistence

  • Bruce Schneier
  • BBC News
  • February 26, 2009

Welcome to the future, where everything about you is saved. A future where your actions are recorded, your movements are tracked, and your conversations are no longer ephemeral. A future brought to you not by some 1984-like dystopia, but by the natural tendencies of computers to produce data.

Data is the pollution of the information age. It’s a natural by-product of every computer-mediated interaction. It stays around forever, unless it’s disposed of. It is valuable when reused, but reuse must be done carefully. Otherwise, its after-effects are toxic…

How Perverse Incentives Drive Bad Security Decisions

  • Bruce Schneier
  • Wired
  • February 26, 2009

An employee of Whole Foods in Ann Arbor, Michigan, was fired in 2007 for apprehending a shoplifter. More specifically, he was fired for touching a customer, even though that customer had a backpack filled with stolen groceries and was running away with them.

I regularly see security decisions that, like the Whole Foods incident, seem to make absolutely no sense. However, in every case, the decisions actually make perfect sense once you understand the underlying incentives driving the decision. All security decisions are trade-offs, but the motivations behind them are not always obvious: They’re often subjective, and driven by external incentives. And often security trade-offs are made for nonsecurity reasons…

The Secret Question Is: Why Do IT Systems Use Insecure Passwords?

  • Bruce Schneier
  • The Guardian
  • February 19, 2009

Since January, the Conficker.B worm has been spreading like wildfire across the internet, infecting the French navy, hospitals in Sheffield, the court system in Houston, Texas, and millions of computers worldwide. One of the ways it spreads is by cracking administrator passwords on networks. Which leads to the important question: why are IT administrators still using easy-to-guess passwords?

Computer authentication systems have two basic requirements. They need to keep the bad guys from accessing your account, and they need to allow you to access your account. Both are important, and every system is a balancing act between the two. Too little security, and the bad guys will get in too easily. But if the authentication system is too complicated, restrictive, or hard to use, you won’t be able, or won’t bother, to use it…
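
Conficker’s password cracking reportedly relies on a short built-in list of common passwords that it simply tries in turn. A minimal sketch of the defensive mirror image, auditing administrator passwords against such a list, follows; the word list here is hypothetical and far shorter than a real one.

    # Flag administrator passwords that a simple dictionary attack would find.
    # The list is a tiny hypothetical sample; real attack dictionaries are far larger.
    COMMON_PASSWORDS = {"admin", "password", "123456", "letmein", "changeme", "admin123"}

    def is_easily_guessed(password):
        return password.lower() in COMMON_PASSWORDS or len(password) < 8

    for pw in ("admin123", "a-long-and-unusual-passphrase"):
        print(pw, "->", "easily guessed" if is_easily_guessed(pw) else "not on the common list")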

Thwarting an Internal Hacker

  • Bruce Schneier
  • The Wall Street Journal
  • February 16, 2009

Rajendrasinh Makwana was a UNIX contractor for Fannie Mae. On October 24, he was fired. Before he left, he slipped a logic bomb into the organization’s network. The bomb would have “detonated” on January 31. It was programmed to disable access to the server on which it was running, block any network monitoring software, systematically and irretrievably erase everything—and then replicate itself on all 4,000 Fannie Mae servers. Court papers claim the damage would have been in the millions of dollars, a number that seems low. Fannie Mae would have been shut down for at least a week…

Social Networking Risks

  • Bruce Schneier
  • Information Security
  • February 2009

This essay appeared as the first half of a point/counterpoint with Marcus Ranum.

Are employees blogging corporate secrets? It’s not an unreasonable fear, actually. People have always talked about work to their friends. It’s human nature for people to talk about what’s going on in their lives, and work is a lot of most people’s lives. Historically, organizations generally didn’t care very much. The conversations were intimate and ephemeral, so the risk was small. Unless you worked for the military with actual national secrets, no one worried about it very much…

Terrorists May Use Google Earth, But Fear Is No Reason to Ban It

  • Bruce Schneier
  • The Guardian
  • January 29, 2009

This essay also appeared in The Hindu, Brisbane Times, and The Sydney Morning Herald.

It regularly comes as a surprise to people that our own infrastructure can be used against us. And in the wake of terrorist attacks or plots, there are fear-induced calls to ban, disrupt or control that infrastructure. According to officials investigating the Mumbai attacks, the terrorists used images from Google Earth to help learn their way around. This isn’t the first time Google Earth has been charged with helping terrorists: in 2007, Google Earth images of British military bases were found in the homes of …

How to Ensure Police Database Accuracy

  • Bruce Schneier
  • The Wall Street Journal
  • January 27, 2009

Earlier this month, the Supreme Court ruled that evidence gathered as a result of errors in a police database is admissible in court. Their narrow decision is wrong, and will only ensure that police databases remain error-filled in the future.

The specifics of the case are simple. A computer database said there was a felony arrest warrant pending for Bennie Herring when there actually wasn’t. When the police came to arrest him, they searched his home and found illegal drugs and a gun. The Supreme Court was asked to rule whether the police had the right to arrest him for possessing those items, even though there was no legal basis for the search and arrest in the first place…

Why Technology Won't Prevent Identity Theft

  • Bruce Schneier
  • The Wall Street Journal
  • January 9, 2009

Impersonation isn’t new. In 1556, a Frenchman was executed for impersonating Martin Guerre, and this week hackers impersonated Barack Obama on Twitter. It’s not even unique to humans: mockingbirds, Viceroy butterflies, and the brown octopus all use impersonation as a survival strategy. For people, detecting impersonation is a hard problem for three reasons: we need to verify the identity of people we don’t know, we interact with people through “narrow” communications channels like the telephone and Internet, and we want computerized systems to do the verification for us…

Tigers Use Scent, Birds Use Calls—Biometrics Are Just Animal Instinct

  • Bruce Schneier
  • The Guardian
  • January 8, 2009

Biometrics may seem new, but they’re the oldest form of identification. Tigers recognise each other’s scent; penguins recognise calls. Humans recognise each other by sight from across the room, voices on the phone, signatures on contracts and photographs on drivers’ licences. Fingerprints have been used to identify people at crime scenes for more than 100 years.

What is new about biometrics is that computers are now doing the recognising: thumbprints, retinal scans, voiceprints, and typing patterns. There’s a lot of technology involved here, in trying to both limit the number of false positives (someone else being mistakenly recognised as you) and false negatives (you being mistakenly not recognised). Generally, a system can choose to have less of one or the other; less of both is very hard…
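
The trade-off between the two error rates comes down to where the matcher sets its acceptance threshold. The sketch below uses made-up score distributions purely to illustrate the effect: raising the threshold cuts false positives but raises false negatives.

    import random

    # Made-up score distributions for a toy biometric matcher: genuine users
    # score high on average, impostors score lower, and the two overlap.
    random.seed(0)
    genuine  = [random.gauss(0.75, 0.10) for _ in range(10000)]
    impostor = [random.gauss(0.45, 0.10) for _ in range(10000)]

    for threshold in (0.50, 0.60, 0.70):
        false_reject = sum(s < threshold for s in genuine) / len(genuine)
        false_accept = sum(s >= threshold for s in impostor) / len(impostor)
        print(f"threshold {threshold:.2f}: "
              f"false negatives {false_reject:.1%}, false positives {false_accept:.1%}")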

State Data Breach Notification Laws: Have They Helped?

  • Bruce Schneier
  • Information Security
  • January 2009

This essay appeared as the second half of a point/counterpoint with Marcus Ranum. Marcus’s half is here.

There are three reasons for breach notification laws. One, it’s common politeness that when you lose something of someone else’s, you tell him. The prevailing corporate attitude before the law—“They won’t notice, and if they do notice they won’t know it’s us, so we are better off keeping quiet about the whole thing”—is just wrong. Two, it provides statistics to security researchers as to how pervasive the problem really is. And three, it forces companies to improve their security…

Architecture of Privacy

  • Bruce Schneier
  • IEEE Security & Privacy
  • January/February 2009

The Internet isn’t really for us. We’re here at the beginning, stumbling around, just figuring out what it’s good for and how to use it. The Internet is for those born into it, those who have woven it into their lives from the beginning. The Internet is the greatest generation gap since rock and roll, and only our children can hope to understand it.

Larry Lessig famously said that, on the Internet, code is law. Facebook’s architecture limits what we can do there, just as gravity limits what we can do on Earth. The 140-character limit on SMSs is as effective as a legal ban on grammar, spelling, and long-winded sentences: KTHXBYE…
