Essays: 2005 Archives

Unchecked Presidential Power

In the weeks after 9/11, while America and the world were grieving, President Bush built a legal rationale for a dictatorship. Then he started using it to avoid the law.

  • Bruce Schneier
  • Minneapolis Star Tribune
  • December 20, 2005

This past Thursday, the New York Times exposed the most significant violation of federal surveillance law in the post-Watergate era. President Bush secretly authorized the National Security Agency to engage in domestic spying, wiretapping thousands of Americans and bypassing the legal procedures regulating this activity.

This isn’t about the spying, although that’s a major issue in itself. This is about the Fourth Amendment protections against illegal search. This is about circumventing a teeny tiny check by the judicial branch, placed there by the legislative branch 27 years ago—on the last occasion that the executive branch abused its power so broadly…

Uncle Sam is Listening

Bush may have bypassed federal wiretap law to deploy more high-tech methods of surveillance.

  • Bruce Schneier
  • Salon
  • December 20, 2005

When President Bush directed the National Security Agency to secretly eavesdrop on American citizens, he transferred an authority previously under the purview of the Justice Department to the Defense Department and bypassed the very laws put in place to protect Americans against widespread government eavesdropping. The reason may have been to tap the NSA’s capability for data mining and widespread surveillance.

Illegal wiretapping of Americans is nothing new. In the 1950s and ’60s, in a program called “Project Shamrock,” the NSA intercepted every single telegram coming in or going out of the United States. It conducted eavesdropping without a warrant on behalf of the CIA and other agencies. Much of this became public during the 1975 Church Committee hearings and resulted in the now famous Foreign Intelligence Surveillance Act …

Hold the Photons!

  • Bruce Schneier
  • Wired
  • December 15, 2005

How would you feel if you invested millions of dollars in quantum cryptography, and then learned that you could do the same thing with a few 25-cent Radio Shack components?

I’m exaggerating a little here, but if a new idea out of Texas A&M University turns out to be secure, we’ve come close.

Earlier this month, Laszlo Kish proposed securing a communications link, like a phone or computer line, with a pair of resistors. By adding electronic noise, or using the natural thermal noise of the resistors—called “Johnson noise”—Kish can prevent eavesdroppers from listening in…
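
For readers who want the physics behind the proposal: the thermal noise Kish relies on is the textbook Johnson–Nyquist noise of a resistor. The formula below is standard background, not something taken from Kish’s paper:

    V_{\mathrm{rms}} = \sqrt{4 k_B T R \,\Delta f}

where k_B is Boltzmann’s constant, T the absolute temperature, R the resistance, and Δf the measurement bandwidth. Because the noise power scales with the resistance, the resistor values chosen at each end directly set the noise an eavesdropper can measure on the line.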

The Hackers are Coming!

  • Bruce Schneier
  • Utility Automation & Engineering T&D
  • December 13, 2005

Over the past few years, we have seen hacking transform from a hobbyist activity to a criminal one. Hobbyist threats included defacing web pages, releasing worms that did damage, and running denial-of-service attacks against major networks. The goal was fun, notoriety, or just plain malice.

The new criminal attacks have a more focused goal: profit. This difference makes the new attackers more dangerous and potentially more damaging.

Criminals differ from hobbyists in several respects. One, they care less about finesse. Hobbyist hackers looked for new and clever attacks, while criminals will use whatever works. Hobbyists regularly advertised their presence, while criminals are more likely to be stealthy. Hobbyists generally didn’t care who they attacked, while criminals are more likely to target individual organizations. Criminal attackers are less risk-averse; they’re willing to risk jail, which hobbyists are largely not. As such, criminal attackers will engage in behavior that hobbyists avoid…

Airline Security a Waste of Cash

  • Bruce Schneier
  • Wired
  • December 1, 2005

Since 9/11, our nation has been obsessed with air-travel security. Terrorist attacks from the air have been the threat that looms largest in Americans’ minds. As a result, we’ve wasted millions on misguided programs to separate the regular travelers from the suspected terrorists—money that could have been spent to actually make us safer.

Consider CAPPS and its replacement, Secure Flight. These are programs to check travelers against the 30,000 to 40,000 names on the government’s No-Fly list, and another 30,000 to 40,000 on its Selectee list…

Airplane Security and Metal Knives

  • Bruce Schneier
  • The Sydney Morning Herald
  • November 30, 2005

This essay also appeared in The Age.

Two weeks ago, Immigration Minister Amanda Vanstone caused a stir by ridiculing airplane security in a public speech. She derided much of post-9/11 airline security, especially the use of plastic knives instead of metal ones, and said “a lot of what we do is to make people feel better as opposed to actually achieve an outcome.”

As a foreigner, I know very little about Australian politics. I don’t know anything about Senator Vanstone, her politics, her policies, or her party. I have no idea what she stands for. But as a security technologist, I agree 100% with her comments. Most airplane security is what I call “security theater”: ineffective measures designed to make people feel better about flying…

The Erosion of Freedom

Spying tools are now routinely used against ordinary, law-abiding Americans who have no connection to terrorism.

  • Bruce Schneier
  • Minneapolis Star Tribune
  • November 21, 2005

Christmas 2003, Las Vegas. Intelligence hinted at a terrorist attack on New Year’s Eve. In the absence of any real evidence, the FBI tried to compile a real-time database of everyone who was visiting the city. It collected customer data from airlines, hotels, casinos, rental car companies, even storage locker rental companies. All this information went into a massive database—probably close to a million people overall—that the FBI’s computers analyzed, looking for links to known terrorists. Of course, no terrorist attack occurred and no plot was discovered: The intelligence was wrong…

Real Story of the Rogue Rootkit

  • Bruce Schneier
  • Wired
  • November 17, 2005

A Spanish translation of this essay is also available.

It’s a David and Goliath story of the tech blogs defeating a mega-corporation.

On Oct. 31, Mark Russinovich broke the story in his blog: Sony BMG Music Entertainment distributed a copy-protection scheme with music CDs that secretly installed a rootkit on computers. This software tool is run without your knowledge or consent—if it’s loaded on your computer with a CD, a hacker can gain and maintain access to your system and you wouldn’t know it.

The Sony code modifies Windows so you can’t tell it’s there, a process called “cloaking” in the hacker world. It acts as spyware, surreptitiously sending information about you to Sony. And it can’t be removed; trying to get rid of it …
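
To make “cloaking” concrete, here is a toy sketch in Python of the kind of filtered view a cloaked system presents. It is an illustration only; the real XCP rootkit did its filtering by hooking Windows kernel calls, which is far beyond this sketch. The “$sys$” prefix is the marker Russinovich reported the rootkit used to decide what to hide:

    import os

    HIDDEN_PREFIX = "$sys$"  # prefix the rootkit reportedly used to mark cloaked items

    def cloaked_listing(path="."):
        # Return a directory listing with "cloaked" entries silently removed.
        # Toy model: the real rootkit filtered results inside the kernel, so
        # every program on the machine received a censored view like this one.
        return [name for name in os.listdir(path)
                if not name.lower().startswith(HIDDEN_PREFIX)]

    print(cloaked_listing("."))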

Fatal Flaw Weakens RFID Passports

  • Bruce Schneier
  • Wired
  • November 3, 2005

In 2004, when the U.S. State Department first started talking about embedding RFID chips in passports, the outcry from privacy advocates was huge. When the State Department issued its draft regulation in February, it got 2,335 comments, 98.5 percent negative. In response, the final State Department regulations, issued last week, contain two features that attempt to address security and privacy concerns. But one serious problem remains.

Before I describe the problem, some context on the surrounding controversy may be helpful. RFID chips are passive, and broadcast information to any reader that queries the chip. So critics, myself …

The Zotob Storm

  • Bruce Schneier
  • IEEE Security & Privacy
  • November/December 2005

If you’ll forgive the possible comparison to hurricanes, Internet epidemics are much like severe weather: they happen randomly, they affect some segments of the population more than others, and your previous preparation determines how effective your defense is.

Zotob was the first major worm outbreak since MyDoom in January 2004. It happened quickly—less than five days after Microsoft published a critical security bulletin (its 39th of the year). Zotob’s effects varied greatly from organization to organization: some networks were brought to their knees, while others didn’t even notice…

Sue Companies, Not Coders

  • Bruce Schneier
  • Wired
  • October 20, 2005

At a security conference last week, Howard Schmidt, the former White House cybersecurity adviser, took the bold step of arguing that software developers should be held personally accountable for the security of the code they write.

He’s on the right track, but he’s made a dangerous mistake. It’s the software manufacturers that should be held liable, not the individual programmers. Getting this one right will result in more-secure software for everyone; getting it wrong will simply result in a lot of messy lawsuits.

To understand the difference, it’s necessary to understand the basic economic incentives of companies, and how businesses are affected by liabilities. In a capitalist society, businesses are profit-making ventures, and they make decisions based on both short- and long-term profitability. They try to balance the costs of more-secure software—extra developers, fewer features, longer time to market—against the costs of insecure software: expense to patch, occasional bad press, potential loss of sales…

A Real Remedy for Phishers

  • Bruce Schneier
  • Wired
  • October 6, 2005

Last week California became the first state to enact a law specifically addressing phishing. Phishing, for those of you who have been away from the internet for the past few years, is when an attacker sends you an e-mail falsely claiming to be a legitimate business in order to trick you into giving away your account info—passwords, mostly. When this is done by hacking DNS, it’s called pharming.

Financial companies have until now avoided taking on phishers in a serious way, because it’s cheaper and simpler to pay the costs of fraud. That’s unacceptable, however, because consumers who fall prey to these scams pay a price that goes beyond financial losses, in inconvenience, stress and, in some cases, blots on their credit reports that are hard to eradicate. As a result, lawmakers need to do more than create new punishments for wrongdoers—they need to create tough new incentives that will effectively force financial companies to change the status quo and improve the way they protect their customers’ assets. Unfortunately, the California …
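
To see how the trick works mechanically, here is a minimal heuristic sketch (my own illustration, not anything from the essay or the California law): it flags a link whose visible text names one site while the underlying URL points somewhere else, the classic phishing e-mail pattern. The domains in the example are hypothetical.

    from urllib.parse import urlparse

    def looks_like_phish(display_text, href):
        # Flag a link whose visible text claims one domain but whose actual
        # target is a different one. Illustrative heuristic only; real
        # filters combine many more signals than this.
        shown = urlparse(display_text if "://" in display_text
                         else "http://" + display_text).hostname or ""
        actual = urlparse(href).hostname or ""
        if shown.startswith("www."):
            shown = shown[4:]
        return not (actual == shown or actual.endswith("." + shown))

    # Hypothetical example: the text shows the bank, the link goes elsewhere.
    print(looks_like_phish("www.examplebank.com", "http://login.examp1e-bank.biz/"))  # True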

A Sci-Fi Future Awaits the Court

  • Bruce Schneier
  • Wired
  • September 22, 2005

At John Roberts’ confirmation hearings last week, there weren’t enough discussions about science fiction. Technologies that are science fiction today will become constitutional questions before Roberts retires from the bench. The same goes for technologies that cannot even be conceived of now. And many of these questions involve privacy.

According to Roberts, there is a “right to privacy” in the Constitution. At least, that’s what he said during his Senate hearings last week. It’s a politically charged question, because the two decisions that established the right to contraceptives and abortion—Griswold v. Connecticut (1965) and Roe v. Wade (1973)—are based in part on a right to privacy. “Where do you stand on privacy?” can be code for “Where do you stand on abortion?”…

Toward a Truly Safer Nation

  • Bruce Schneier
  • Minneapolis Star Tribune
  • September 11, 2005

Leaving aside the political posturing and the finger-pointing, how did our nation mishandle Katrina so badly? After spending tens of billions of dollars on homeland security (hundreds of billions, if you include the war in Iraq) in the four years after 9/11, what did we do wrong? Why were there so many failures at the local, state and federal levels?

These are reasonable questions. Katrina was a natural disaster and not a terrorist attack, but that only matters before the event. Large-scale terrorist attacks and natural disasters differ in cause, but they’re very similar in aftermath. And one can easily imagine a Katrina-like aftermath to a terrorist attack, especially one involving nuclear, biological or chemical weapons…

Terrorists Don't Do Movie Plots

  • Bruce Schneier
  • Wired
  • September 8, 2005

Sometimes it seems like the people in charge of homeland security spend too much time watching action movies. They defend against specific movie plots instead of against the broad threats of terrorism.

We all do it. Our imaginations run wild with detailed and specific threats. We imagine anthrax spread from crop dusters. Or a contaminated milk supply. Or terrorist scuba divers armed with almanacs. Before long, we’re envisioning an entire movie plot, without Bruce Willis saving the day. And we’re scared.

Psychologically, this all makes sense. Humans have good imaginations. Box cutters and shoe bombs conjure vivid mental images. “We must protect the Super Bowl” packs more emotional punch than the vague “we should defend ourselves against terrorism.”…

University Networks and Data Security

  • Bruce Schneier
  • IEEE Security & Privacy
  • September/October 2005

In general, the problems of securing a university network are no different than those of securing any other large corporate network. But when it comes to data security, universities have their own unique problems. It’s easy to point fingers at students—a large number of potentially adversarial transient insiders. Yet that’s really no different from a corporation dealing with an assortment of employees and contractors—the difference is the culture.

Universities are edge-focused; central policies tend to be weak, by design, with maximum autonomy for the edges. This means they have natural tendencies against centralization of services. Departments and individual professors are used to being semiautonomous. Because these institutions were established long before the advent of computers, when networking did begin to infuse universities, it developed within existing administrative divisions. Some universities have academic departments with separate IT departments, budgets, and staff, with a central IT group providing bandwidth but little or no oversight. Unfortunately, these smaller IT groups don’t generally count policy development and enforcement as part of their core competencies…

Make Businesses Pay in Credit Card Scam

  • Bruce Schneier
  • New York Daily News
  • June 23, 2005

The epidemic of personal data thefts and losses – most recently affecting 40 million Visa and MasterCard customers – should concern us for two reasons: personal privacy and identity theft.

Real reform is required to solve these problems. We need to reduce the amount of personal information collected, limit how it can be used and resold, and require companies that mishandle our data to be liable for that mishandling. And, most importantly, we need to make financial institutions liable for fraudulent transactions.

Whether it is the books we take out of the library, the Web sites we visit, our medical information or the contents of our E-mails and text messages, most of us have personal data that we don’t want made public. Legislation that securely keeps this data out of the hands of criminals won’t affect the privacy invasions committed by reputable companies in the name of price discrimination, marketing or customer service…

Attack Trends: 2004 and 2005

  • Bruce Schneier
  • Queue
  • June 2, 2005

Counterpane Internet Security Inc. monitors more than 450 networks in 35 countries, in every time zone. In 2004 we saw 523 billion network events, and our analysts investigated 648,000 security “tickets.” What follows is an overview of what’s happening on the Internet right now, and what we expect to happen in the coming months.

In 2004, 41 percent of the attacks we saw were unauthorized activity of some kind, 21 percent were scanning, 26 percent were unauthorized access, 9 percent were DoS (denial of service), and 3 percent were misuse of applications…

Risks of Third-Party Data

  • Bruce Schneier
  • Communications of the ACM
  • May 2005

Reports are coming in torrents. Criminals are known to have downloaded personal credit information of over 145,000 Americans from ChoicePoint’s network. Hackers took over one of LexisNexis’ databases, gaining access to personal files of 32,000 people. Bank of America Corp. lost computer data tapes that contained personal information on 1.2 million federal employees, including members of the U.S. Senate. A hacker downloaded the names, Social Security numbers, voicemail and SMS messages, and photos of 400 T-Mobile customers, and probably had access to all of its 16.3 million U.S. customers. In a separate incident, Paris Hilton’s phone book and SMS messages were hacked and distributed on the Internet…

Is Two-Factor Authentication Too Little, Too Late?

  • Bruce Schneier
  • Network World
  • April 4, 2005

Recently I published an essay arguing that two-factor authentication is an ineffective defense against identity theft (see www.schneier.com/essay-083.html). For example, issuing tokens to online banking customers won’t reduce fraud, because new attack techniques simply ignore the countermeasure. Unfortunately, some took my essay as a condemnation of two-factor authentication in general. This is not true. It’s simply a matter of understanding the threats and the attacks.

Passwords just don’t work anymore. As computers have gotten faster, password guessing has gotten easier. Ever-more-complicated passwords are required to evade password-guessing software. At the same time, there’s an upper limit to how complex a password users can be expected to remember. About five years ago, these two lines crossed: It is no longer reasonable to expect users to have passwords that can’t be guessed. For anything that requires reasonable security, the era of passwords is over…
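
The “two lines crossed” claim is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes an offline guessing rate of a billion tries per second, an illustrative figure rather than one from the essay:

    GUESSES_PER_SECOND = 1e9  # assumed offline guessing rate; substitute your own estimate

    def years_to_exhaust(alphabet_size, length):
        # Time to try every password of the given length at the assumed rate.
        keyspace = alphabet_size ** length
        return keyspace / GUESSES_PER_SECOND / (3600 * 24 * 365)

    for length in (6, 8, 10, 12):
        # 62-character alphabet: a-z, A-Z, 0-9
        print(f"{length} chars: {years_to_exhaust(62, length):.1e} years to exhaust")

The lengths that survive this arithmetic are already longer than what most people will reliably memorize, which is the essay’s point.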

Two-Factor Authentication: Too Little, Too Late

  • Bruce Schneier
  • Communications of the ACM
  • April 2005

Two-factor authentication isn’t our savior. It won’t defend against phishing. It’s not going to prevent identity theft. It’s not going to secure online accounts from fraudulent transactions. It solves the security problems we had 10 years ago, not the security problems we have today.

The problem with passwords is that it is too easy to lose control of them. People give their passwords to other people. People write them down, and other people read them. People send them in email, and that email is intercepted. People use them to log into remote servers, and their communications are eavesdropped on. Passwords are also easy to guess. And once any of that happens, the password no longer works as an authentication token because you can never be sure who is typing in that password…

Why Data Mining Won't Stop Terror

  • Bruce Schneier
  • Wired
  • March 9, 2005

In the post-9/11 world, there’s much focus on connecting the dots. Many believe data mining is the crystal ball that will enable us to uncover future terrorist plots. But even in the most wildly optimistic projections, data mining isn’t tenable for that purpose. We’re not trading privacy for security; we’re giving up privacy and getting no security in return.

Most people first learned about data mining in November 2002, when news broke about a massive government data mining program called Total Information Awareness. The basic idea was as audacious as it was repellent: suck up as much data as possible about everyone, sift through it with massive computers, and investigate patterns that might indicate terrorist plots…
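
The reason even optimistic projections fail is the base-rate problem, and a few lines of arithmetic make it vivid. Every number below is an assumption chosen for illustration, not a figure from the essay:

    # Even a very accurate detector drowns investigators in false alarms when
    # real plots are vanishingly rare relative to the data being scanned.
    events_per_year = 1e12        # assumed events (messages, transactions) scanned
    real_plot_events = 10         # assumed genuinely terrorist-related events
    false_positive_rate = 1e-4    # assumed: flags 0.01% of innocent events
    detection_rate = 0.99         # assumed: catches 99% of the real ones

    false_alarms = (events_per_year - real_plot_events) * false_positive_rate
    true_hits = real_plot_events * detection_rate
    print(f"false alarms per year: {false_alarms:,.0f}")   # on the order of 100 million
    print(f"true hits per year:    {true_hits:,.1f}")      # about ten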

Digital Information Rights Need Tech-Savvy Courts

  • Bruce Schneier
  • eWeek
  • February 14, 2005

Opinion: The courts need to recognize that in the information age, virtual privacy and physical privacy don’t have the same boundaries.

For at least seven months last year, a hacker had access to T-Mobile’s customer network. He is known to have accessed information belonging to 400 customers—names, Social Security numbers, voice mail messages, SMS messages, photos—and probably had the ability to access data belonging to any of T-Mobile’s 16.3 million U.S. customers. But in its fervor to report on the security of cell phones, and T-Mobile in particular, the media missed the most important point of the story: The security of much of our data is not under our control…

The Curse of the Secret Question

  • Bruce Schneier
  • Computerworld
  • February 9, 2005

It’s happened to all of us: We sign up for some online account, choose a difficult-to-remember and hard-to-guess password, and are then presented with a “secret question” to answer. Twenty years ago, there was just one secret question: “What’s your mother’s maiden name?” Today, there are more: “What street did you grow up on?” “What’s the name of your first pet?” “What’s your favorite color?” And so on.

The point of all these questions is the same: a backup password. If you forget your password, the secret question can verify your identity so you can choose another password or have the site e-mail your current password to you. It’s a great idea from a customer service perspective—a user is less likely to forget his first pet’s name than some random password—but terrible for security. The answer to the secret question is much easier to guess than a good password, and the information is much more public. (I’ll bet the name of my family’s first pet is in some database somewhere.) And even worse, everybody seems to use the same series of secret questions…
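
A quick way to quantify how much weaker the secret question is: compare the size of its answer pool to a random password’s keyspace. The pool size below is an assumption for illustration:

    import math

    COMMON_ANSWERS = 30          # assumed pool of plausible "favorite color"-style answers
    PASSWORD_SPACE = 62 ** 8     # eight random characters drawn from a-z, A-Z, 0-9

    print("secret question:", round(math.log2(COMMON_ANSWERS), 1), "bits of guessing work")
    print("random password:", round(math.log2(PASSWORD_SPACE), 1), "bits of guessing work")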

Economics of Information Security

  • Ross Anderson and Bruce Schneier
  • IEEE Security & Privacy
  • January/February 2005

Several years ago, a number of researchers began to realize that many security systems fail not so much for technical reasons as from misplaced incentives. Often the people who could protect a system were not the ones who suffered the costs of failure. Hospital medical-records systems provided comprehensive billing-management features for the administrators who specified them, but were not so good at protecting patients’ privacy. Automatic teller machines suffered from fraud in countries like the United Kingdom and the Netherlands, where poor regulation left banks without sufficient incentive to secure their systems, and allowed them to pass the cost of fraud along to their customers. And one reason the Internet is insecure is that liability for attacks is so diffuse…

Authentication and Expiration

  • Bruce Schneier
  • IEEE Security & Privacy
  • January/February 2005

There’s a security problem with many Internet authentication systems that’s never talked about: there’s no way to terminate the authentication.

A couple of months ago, I bought something from an e-commerce site. At the checkout page, I wasn’t able to just type in my credit-card number and make my purchase. Instead, I had to choose a username and password. Usually I don’t like doing that, but in this case I wanted to be able to access my account at a later date. In fact, the password was useful because I needed to return an item I purchased…
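
One way to picture the missing piece is a stored credential that carries an explicit expiry and must be re-earned after it lapses. This is a minimal sketch of that idea under an assumed 90-day policy; the record and the policy are hypothetical, not anything prescribed in the essay:

    import time

    MAX_AGE_SECONDS = 90 * 24 * 3600   # assumed policy: force re-authentication after 90 days

    class StoredCredential:
        # Hypothetical account record: the stored authentication carries a
        # timestamp so it can expire instead of remaining valid forever.
        def __init__(self):
            self.issued_at = time.time()

        def still_valid(self):
            return time.time() - self.issued_at < MAX_AGE_SECONDS

    cred = StoredCredential()
    print("usable without re-authenticating?", cred.still_valid())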
