Essays: 2006 Archives
How good are the passwords people are choosing to protect their computers and online accounts?
It's a hard question to answer because data is scarce. But recently, a colleague sent me some spoils from a MySpace phishing attack: 34,000 actual user names and passwords.
The attack was pretty basic.
Spam is filling up the Internet, and it's not going away anytime soon.
It's not just e-mail. We have voice-over-IP spam, instant message spam, cellphone text message spam, blog comment spam and Usenet newsgroup spam. And, if you think broadly enough, these computer-network spam delivery mechanisms join the ranks of computer telemarketing (phone spam), junk mail (paper spam), billboards (visual space spam) and cars driving through town with megaphones (audio spam).
In the world of voting, automatic recount laws are not uncommon. Virginia, where George Allen lost to James Webb in the Senate race by 7,800 out of over 2.3 million votes, or 0.33 percent, is an example. If the margin of victory is 1 percent or less, the loser is allowed to ask for a recount. If the margin is 0.5 percent or less, the government pays for it.
This essay also appeared in the Pittsburgh Post-Gazette.
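The recount thresholds above reduce to a simple margin calculation. Here is a minimal sketch of Virginia-style rules; it is a simplification (it computes the margin over the top two candidates' votes, whereas real statutes typically use all ballots cast), and the function name and return values are invented for illustration.

```python
# Toy sketch of Virginia-style automatic-recount thresholds.
# Simplification: margin is computed over the top two candidates only;
# real statutes use the total number of ballots cast.
def recount_status(winner_votes, loser_votes):
    total = winner_votes + loser_votes
    margin_pct = (winner_votes - loser_votes) / total * 100
    if margin_pct <= 0.5:
        return margin_pct, "recount at government expense"
    if margin_pct <= 1.0:
        return margin_pct, "loser may request a recount"
    return margin_pct, "no recount provision applies"
```

With hypothetical totals of 100,000 to 99,600, the margin is about 0.2 percent, which would put the recount on the government's tab.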
Last week in Florida's 13th Congressional district, the victory margin was only 386 votes out of 153,000. There'll be a mandatory lawyered-up recount, but it won't include the almost 18,000 votes that seem to have disappeared. The electronic voting machines didn't include them in their final tallies, and there's no backup to use for the recount.
Last week Christopher Soghoian created a Fake Boarding Pass Generator website, allowing anyone to create a fake Northwest Airlines boarding pass: any name, airport, date, flight.
This action got him visited by the FBI, who later came back, smashed open his front door, and seized his computers and other belongings. It resulted in calls for his arrest -- the most visible by Rep. Edward Markey (D-Massachusetts) -- who has since recanted. And it's gotten him more publicity than he ever dreamed of.
This essay appeared as part of a point-counterpoint with Marcus Ranum.
Regulation is all about economics. Here's the theory. In a capitalist system, companies make decisions based on their own self-interest. This isn't a bad thing; it's actually a very good thing.
You've seen them: those large concrete blocks in front of skyscrapers, monuments and government buildings, designed to protect against car and truck bombs. They sprang up like weeds in the months after 9/11, but the idea is much older. The prettier ones doubled as planters; the uglier ones just stood there.
Form follows function.
The political firestorm over former U.S. Rep. Mark Foley's salacious instant messages hides another issue, one about privacy. We are rapidly turning into a society where our intimate conversations can be saved and made public later. This represents an enormous loss of freedom and liberty, and the only way to solve the problem is through legislation.
Why should we waste time at airport security, screening people with U.S. government security clearances? This perfectly reasonable question was asked recently by Robert Poole, director of transportation studies at The Reason Foundation, as he and I were interviewed by WOSU Radio in Ohio.
Poole argued that people with government security clearances, people who are entrusted with U.S.
Earlier this month, the popular social networking site Facebook learned a hard lesson in privacy. It introduced a new feature called "News Feeds" that shows an aggregation of everything members do on the site, such as added and deleted friends, a change in relationship status, a new favorite song, a new interest. Instead of a member's friends having to go to his page to view any changes, these changes are all presented to them automatically.
The outrage was enormous.
This essay also appeared in San Jose Mercury News, Sacramento Bee, Concord Monitor, Fort Worth Star Telegram, Dallas Morning News, Contra Costa Times, Statesman Journal, and The Clarion-Ledger.
If you have a passport, now is the time to renew it -- even if it's not set to expire anytime soon. If you don't have a passport and think you might need one, now is the time to get it. In many countries, including the United States, passports will soon be equipped with RFID chips.
If you really want to see Microsoft scramble to patch a hole in its software, don't look to vulnerabilities that impact countless Internet Explorer users or give intruders control of thousands of Windows machines. Just crack Redmond's DRM.
Security patches used to be rare. Software vendors were happy to pretend that vulnerabilities in their products were illusory -- and then quietly fix the problem in the next software release.
This essay appeared as part of a point-counterpoint with Marcus Ranum. Marcus's side can be found on his website.
If you define "critical infrastructure" as "things essential for the functioning of a society and economy," then software is critical infrastructure. For many companies and individuals, if their computers stop working then they stop working.
It's a situation that sneaked up on us.
On Aug. 16, two men were escorted off a plane headed for Manchester, England, because some passengers thought they looked either Asian or Middle Eastern, might have been speaking Arabic, wore leather jackets, and looked at their watches -- and the passengers refused to fly with them on board.
The men were questioned for several hours and then released.
On Aug. 15, an entire airport terminal was evacuated because someone's cosmetics triggered a false positive for explosives. The same day, a Muslim man was removed from an airplane in Denver for reciting prayers.
It's easy to defend against what they planned last time, but it's shortsighted.
Hours-long waits in the security line. Ridiculous prohibitions on what you can carry onboard. Last week's foiling of a major terrorist plot and the airport security measures that followed graphically illustrate the difference between effective security and security theater.
None of the airplane security measures implemented because of 9/11 -- no-fly lists, secondary screening, prohibitions against pocket knives and corkscrews -- had anything to do with last week's arrests.
The big news in professional bicycle racing is that Floyd Landis may be stripped of his Tour de France title because he tested positive for a banned performance-enhancing drug. Sidestepping the issues of whether professional athletes should be allowed to take performance-enhancing drugs, how dangerous those drugs are, and what constitutes a performance-enhancing drug in the first place, I'd like to talk about the security and economic issues surrounding doping in professional sports.
Drug testing is a security issue. Various sports federations around the world do their best to detect illegal doping, and players do their best to evade the tests.
What could you do if you controlled a network of thousands of computers -- or, at least, could use the spare processor cycles on those machines? You could perform massively parallel computations: model nuclear explosions or global weather patterns, factor large numbers or find Mersenne primes, or crack cryptographic keys.
All of these are legitimate applications. And you can visit distributed.net and download software that allows you to donate your spare computer cycles to some of these projects.
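To make the Mersenne-prime example concrete: the distributed search for Mersenne primes rests on the Lucas-Lehmer test, which decides whether 2**p - 1 is prime for a prime exponent p. Here is a minimal single-machine sketch; real projects parcel out candidate exponents across thousands of volunteered machines.

```python
def is_mersenne_prime(p):
    """Lucas-Lehmer test for a prime exponent p: M = 2**p - 1 is prime
    iff s(p-2) == 0, where s(0) = 4 and s(i) = s(i-1)**2 - 2 (mod M)."""
    if p == 2:
        return True  # M = 3 is prime
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0
```

For example, exponents 2, 3, 5, 7 and 13 yield Mersenne primes, while 11 does not (2047 = 23 x 89). Each exponent can be tested independently, which is exactly why the problem splits so cleanly across spare cycles.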
Google's $6 billion-a-year advertising business is at risk because it can't be sure that anyone is looking at its ads. The problem is called click fraud, and it comes in two basic flavors.
With network click fraud, you host Google AdSense advertisements on your own website. Google pays you every time someone clicks on its ad on your site.
This essay appeared as part of a point-counterpoint with Marcus Ranum.
I've long been hostile to certifications -- I've met too many bad security professionals with certifications and know many excellent security professionals without certifications. But I've come to believe that, while certifications aren't perfect, they're a decent way for a security professional to learn some of the things he's going to need to know, and for a potential employer to assess whether a job candidate has the security expertise the job requires.
What's changed? Both the job requirements and the certification programs.
I'm sitting in a conference room at Cambridge University, trying to simultaneously finish this article for Wired News and pay attention to the presenter onstage.
I'm in this awkward situation because 1) this article is due tomorrow, and 2) I'm attending the fifth Workshop on the Economics of Information Security, or WEIS: to my mind, the most interesting computer security conference of the year.
The idea that economics has anything to do with computer security is relatively new. Ross Anderson and I seem to have stumbled upon the idea independently.
For a while now, I have been writing about our penchant for "movie-plot threats" -- terrorist fears based on very specific attack scenarios.
Terrorists with crop-dusters, terrorists exploding baby carriages in subways, terrorists filling school buses with explosives -- these are all movie-plot threats. They're good for scaring people, but it's just silly to build national security policy around them.
But if we're going to worry about unlikely attacks, why can't they be exciting and innovative ones?
Have you ever been to a retail store and seen this sign on the register: "Your purchase free if you don't get a receipt"? You almost certainly didn't see it in an expensive or high-end store. You saw it in a convenience store, or a fast-food restaurant. Or maybe a liquor store.
Better to Put People, Not Computers, in Charge of Investigating Potential Plots
Collecting information about every American's phone calls is an example of data mining. The basic idea is to collect as much information as possible on everyone, sift through it with massive computers, and uncover terrorist plots. It's a compelling idea, and convinces many. But it's wrong.
The most common retort against privacy advocates -- by those in favor of ID checks, cameras, databases, data mining and other wholesale surveillance measures -- is this line: "If you aren't doing anything wrong, what do you have to hide?"
Some clever answers: "If I'm not doing anything wrong, then you have no cause to watch me." "Because the government gets to define what's wrong, and they keep changing the definition." "Because you might do something wrong with my information." My problem with quips like these -- as right as they are -- is that they accept the premise that privacy is about hiding a wrong. It's not. Privacy is an inherent human right, and a requirement for maintaining the human condition with dignity and respect.
Two proverbs say it best: Quis custodiet ipsos custodes? ("Who watches the watchers?") and "Absolute power corrupts absolutely."
Cardinal Richelieu understood the value of surveillance when he famously said, "If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged." Watch someone long enough, and you'll find something to arrest him for -- or just blackmail him with.
When technology serves its owners, it is liberating. When it is designed to serve others, over the owner's objection, it is oppressive. There's a battle raging on your computer right now -- one that pits you against worms and viruses, Trojans, spyware, automatic update features and digital rights management technologies. It's the battle to determine who owns your computer.
California was the first state to pass a law requiring companies that keep personal data to disclose when that data is lost or stolen. Since then, many states have followed suit. Now Congress is debating federal legislation that would do the same thing nationwide.
Except that it won't do the same thing: The federal bill has become so watered down that it won't be very effective.
It seems like every time someone tests airport security, airport security fails. In tests between November 2001 and February 2002, screeners missed 70 percent of knives, 30 percent of guns and 60 percent of (fake) bombs. And recently, testers were able to smuggle bomb-making parts through airport security in 21 of 21 attempts. It makes you wonder why we're all putting our laptops in a separate bin and taking off our shoes.
Over the past 20 years, there's been a sea change in the battle for personal privacy.
The pervasiveness of computers has resulted in the almost constant surveillance of everyone, with profound implications for our society and our freedoms. Corporations and the police are both using this new trove of surveillance data. We as a society need to understand the technological trends and discuss their implications.
Does it make sense to surrender management, including security, of six U.S. ports to a Dubai-based company? This question has set off a heated debate between the administration and Congress, as members of both parties condemned the deal.
Most of the rhetoric is political posturing, but there's an interesting security issue embedded in the controversy.
One of the basic philosophies of security is defense in depth: overlapping systems designed to provide security even if one of them fails. An example is a firewall coupled with an intrusion-detection system (IDS). Defense in depth provides security, because there's no single point of failure and no assumed single vector for attacks.
It is for this reason that a choice between implementing network security in the middle of the network -- in the cloud -- or at the endpoints is a false dichotomy.
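A back-of-envelope model shows why overlapping layers beat a single point of failure: if an attack must get past every layer, and the layers fail independently, the failure probabilities multiply. This is a rough sketch under an independence assumption that real deployments (a firewall plus an IDS, say, which often share blind spots) only approximate, so treat the result as an upper bound on the benefit.

```python
# Rough model of defense in depth: an attack succeeds only if every
# layer fails, so with independent layers the per-layer failure
# probabilities multiply.  Independence is an idealization.
def p_attack_succeeds(per_layer_failure_probs):
    p = 1.0
    for p_fail in per_layer_failure_probs:
        p *= p_fail
    return p
```

A lone firewall that misses 10 percent of attacks lets through 0.1 of them; pair it with an IDS that misses 20 percent and, under the independence assumption, only 0.02 get through.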
I don't know about your wallet, but mine contains a driver's license, three credit cards, two bank ATM cards, frequent-flier cards for three airlines and frequent-guest cards for three hotel chains, membership cards to two airline clubs, a library card, a AAA card, a Costco membership, and a bunch of other ID-type cards.
Any technologist who looks at the pile would reasonably ask: why all those cards? Most of them are not intended to be hard-to-forge identification cards; they're simply ways of carrying around unique numbers that are pointers into a database. Why does Visa bother issuing credit cards in the first place?
Some years ago, I left my laptop computer on a train from Washington to New York. Replacing the computer was expensive, but at the time I was more worried about the data.
Of course I had good backups, but now a copy of all my e-mail, client files, personal writings and book manuscripts was ... well, somewhere. Probably the drive would be erased by the computer's new owner, but maybe my personal and professional life would end up in places I didn't want them to be.
In a recent essay, Kevin Kelly warns of the dangers of anonymity. It's OK in small doses, he maintains, but too much of it is a problem: "(I)n every system that I have seen where anonymity becomes common, the system fails. The recent taint in the honor of Wikipedia stems from the extreme ease with which anonymous declarations can be put into a very visible public record. Communities infected with anonymity will either collapse, or shift the anonymous to pseudo-anonymous, as in eBay, where you have a traceable identity behind an invented nickname."
Kelly has a point, but it comes out all wrong.
Photo of Bruce Schneier by Per Ervland.
Schneier on Security is a personal website. Opinions expressed are not necessarily those of Resilient, an IBM Company.