Blog: December 2004 Archives

Wi-Fi Shielding Paint

I have no idea how well this works, but it’s a clever idea. From Information Week:

Force Field Wireless makes three products that it says can dramatically reduce the leakage of wireless signals from a room or building.

One odd side-point from the article:

Force Field has been trying to interest the Department of Homeland Security, but discussions are ongoing, Wray says. “Ironically, we have had foreign governments contact us—from the Middle East. Kind of scary.” Wray says he won’t sell to them.

I wonder what’s so scary about selling metal paint to a Middle Eastern government. Maybe the company thinks they will use the paint to “cover up” their misdeeds or poison political prisoners?

Posted on December 30, 2004 at 5:52 PM • 23 Comments

Canadian Airport Security Loses Uniforms

From CBC News:

1,127 uniform items belonging to Canadian airport screeners were lost or stolen in a nine-month period.

I’m not sure if this is an interesting story or not. We know that a uniform isn’t necessarily a reliable authentication tool, yet we use them anyway.

Losing 1,127 uniforms is bad, because they can be used to impersonate officials. But even if all 1,127 uniforms were recovered, uniforms can still be faked. Can you tell the difference between a legitimate uniform and a decent fake? I can’t.

The real story is the informal nature of most of our real-world authentication systems, and how they can be exploited.

I wrote about this in Beyond Fear (page 199):

Many authentication systems are even more informal. When someone knocks on your door wearing an electric company uniform, you assume she’s there to read the meter. Similarly with deliverymen, service workers, and parking lot attendants. When I return my rental car, I don’t think twice about giving the keys to someone wearing the correct color uniform. And how often do people inspect a police officer’s badge? The potential for intimidation makes this security system even less effective.

Uniforms are easy to fake. In the wee hours of the morning on 18 March 1990, two men entered the Isabella Stewart Gardner Museum in Boston disguised as policemen. They duped the guards, tied them up, and proceeded to steal a dozen paintings by Rembrandt, Vermeer, Manet, and Degas, valued at $300 million. (Thirteen years later, the crime is still unsolved and the art is still missing.) During the Battle of the Bulge in World War II, groups of German commandos operated behind American lines. Dressed as American troops, they tried to deliver false orders to units in an effort to disrupt American plans. Hannibal used the same trick—to greater success—dressing up soldiers who were fluent in Latin in the uniforms of Roman officials and using them to open city gates.

Spies actually take advantage of this authentication problem when recruiting agents. They sometimes recruit a spy by pretending to be working for some third country. For example, a Russian agent working in the U.S. might not be able to convince an American to spy for Russia, but he can pretend to be working for France and might be able to convince the person to spy for that country. This is called “false flag recruitment.” How’s the recruit going to authenticate the nationality of the person he’s spying for?

There’s some fascinating psychology involved in this story. We all authenticate using visual cues, and official uniforms are a big part of that. (When a policeman, or an employee from the local electric company, comes to your door and asks to come in, how do you authenticate him? His uniform and his badge or ID.)

Posted on December 29, 2004 at 8:37 AM • 17 Comments

Bad Quote

In an AP story about the computer glitch that forced Comair to cancel 1,100 flights on Christmas Day, I was quoted as saying:

“If this kind of thing could happen by accident, what would happen if the bad guys did this on purpose?” he said.

I’m sure I said that, but I wish the reporter hadn’t used it. It’s just the sort of fear-mongering that I object to when others do it.

Posted on December 28, 2004 at 8:58 AM • 1 Comment

Physical Access Control

In Los Angeles, the “HOLLYWOOD” sign is protected by a fence and a locked gate. Because several different agencies need access to the sign for various purposes, the chain locking the gate is formed by several locks linked together. Each of the agencies has the key to its own lock, and not the key to any of the others. Of course, anyone who can open one of the locks can open the gate.

This is a nice example of a multiple-user access-control system. It’s simple, and it works. You can also make it as complicated as you want, with different locks in parallel and in series.
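
The composition is easy to model. Below is a minimal sketch (my illustration, not from the original post; the agency names are made up) of the two basic combinations: locks in parallel, where any one key opens the gate, and locks in series, where every lock must be opened.

```python
# A toy model of the gate: agencies' locks linked into the chain.
# "Parallel" composition (like the Hollywood sign chain) opens if ANY
# one lock is opened; "series" composition requires ALL locks to open.

def parallel(locks: set[str], keys: set[str]) -> bool:
    """Gate opens if any held key matches any lock in the chain (OR)."""
    return bool(locks & keys)

def series(locks: set[str], keys: set[str]) -> bool:
    """Gate opens only if every lock has a matching held key (AND)."""
    return locks <= keys

gate_locks = {"parks_dept", "fire_dept", "radio_crew"}   # hypothetical agencies
print(parallel(gate_locks, {"fire_dept"}))   # True: one agency's key is enough
print(series(gate_locks, {"fire_dept"}))     # False: series would need all three keys
```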

Posted on December 23, 2004 at 8:36 AM • 22 Comments

Airline Passenger Profiling

From an anonymous reader who works for the airline industry in the United States:

There are two initiatives in the works, neither of which leaves me feeling very good about privacy rights.

The first is being put together by the TSA and is called the “Secure Flight Initiative.” An initial test of this program was performed recently and involved each airline listed in the document having to send in passenger information (aka PNR data) for every passenger that “completed a successful domestic trip” during June 2004. A sample of some of the fields that were required to be sent: name, address, phone (if available), itinerary, any comments in the PNR record made by airline personnel, credit card number and expiration date, and any changes made to the booking before the actual flight.

This test data was transmitted to the TSA via physical CD. The requirement was that we “encrypt” it using pkzip (or equivalent) before putting it on the CD. We were to then e-mail the password to the Secure Flight Initiative e-mail address. Although this is far from ideal, it is in fact a big step up. The original process was going to have people simply e-mail the above data to the TSA. They claim to have a secure facility where the data is stored.

As far as the TSA’s retention of the data, the only information we have been given is that as soon as the test phase is over, they will securely delete the data. We were given no choice but had to simply take their word for it.

Rollout of the Secure Flight initiative is scheduled for “next year” sometime. They’re going to start with larger carriers and work their way down to the smaller carriers. It hasn’t been formalized (as far as I know) yet as to what data will be required to be transmitted when. My suspicion is that upon flight takeoff, all PNR data for all passengers on board will be required to be sent. At this point, I still have not heard as to what method will be used for data transmission.

There is another initiative being implemented by Customs and Border Protection (CBP), which is part of the Department of Homeland Security. This (unnamed) initiative is essentially the same thing as the Secure Flight program. That’s right—two government agencies are requiring us to transmit the same information separately to each of them. So much for information sharing within the government.

Most larger carriers are complying with this directive by simply allowing the CBP access to their records directly within their reservation systems (often hosted by folks like Sabre, Worldspan, Galileo, etc.). Others (such as the airline I work for) are opting to transmit only the bare requirements without giving direct access to our system. The data is transmitted over a proprietary data network that is used by the airline industry.

There are a couple of differences between the Secure Flight program and the one being instituted by the CBP. The CBP’s program requires that PNR data for all booked passengers be transmitted:

  • 72 hours before flight time
  • 24 hours before flight time
  • 8 hours before flight time
  • and then again immediately after flight departure

The other major difference is that it looks as though we will be required to operate in a way that allows them to request data for any flight at any time, and we must return that data in an automated fashion.

Oh, and just as a kick in the pants, the airlines are expected to pay the costs for all these data transmissions (to the tune of several thousand dollars a month).
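
For concreteness, here is a minimal sketch of the CBP transmission schedule described above (my illustration, not the reader's; the "immediately after departure" offset is an assumption):

```python
# A sketch of the CBP schedule above: PNR data for a flight is transmitted
# 72, 24, and 8 hours before departure, and once more right after departure.
from datetime import datetime, timedelta

def pnr_transmission_times(departure: datetime) -> list[tuple[str, datetime]]:
    offsets = [
        ("72 hours before departure", timedelta(hours=-72)),
        ("24 hours before departure", timedelta(hours=-24)),
        ("8 hours before departure", timedelta(hours=-8)),
        ("after departure", timedelta(minutes=5)),  # "immediately after"; 5 minutes is an assumption
    ]
    return [(label, departure + delta) for label, delta in offsets]

for label, when in pnr_transmission_times(datetime(2004, 12, 22, 10, 0)):
    print(f"{label:>28}: {when:%Y-%m-%d %H:%M}")
```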

Posted on December 22, 2004 at 10:06 AM • 10 Comments

How Not to Test Airport Security

If this were fiction, no one would believe it. From MSNBC:

Four days after police at Charles de Gaulle Airport slipped some plastic explosives into a random passenger’s bag as part of an exercise for sniffer dogs, it is still missing—and authorities are stumped and embarrassed.

It’s perfectly reasonable to plant an explosive-filled suitcase in an airport in order to test security. It is not okay to plant it in someone’s bag without his knowledge and permission. (The explosive residue could remain on the suitcase long after the test, and might be picked up by one of those trace mass spectrometers that detect the chemical residue associated with bombs.) But if you are going to plant plastic explosives in the suitcase of some innocent passenger, shouldn’t you at least write down which suitcase it was?

Posted on December 20, 2004 at 9:13 AM • 19 Comments

Burglars and "Feeling Secure"

From Confessions of a Master Jewel Thief by Bill Mason (Villard, 2003):

Nothing works more in a thief’s favor than people feeling secure. That’s why places that are heavily alarmed and guarded can sometimes be the easiest targets. The single most important factor in security—more than locks, alarms, sensors, or armed guards—is attitude. A building protected by nothing more than a cheap combination lock but inhabited by people who are alert and risk-aware is much safer than one with the world’s most sophisticated alarm system whose tenants assume they’re living in an impregnable fortress.

The author, a burglar, found that luxury condos were an excellent target. Although they had much more security technology than other buildings, they were vulnerable because no one believed a thief could get through the lobby.

Posted on December 17, 2004 at 9:21 AM • 4 Comments

The Electronic Privacy Information Center (EPIC)

For many Americans, the end of the year is charitable contribution time. (The reasons are tax-related.) While there is no shortage of worthy causes around the world, I would like to suggest contributing at least something to EPIC.

Since its founding ten years ago, EPIC has worked to protect privacy, freedom of expression, and democratic values, and to promote the Public Voice in decisions concerning the future of the Internet. They maintain one of the most extensive websites on privacy and free speech issues on the Internet. They litigate Freedom of Information Act, First Amendment, and privacy cases. They publish books on open government and privacy. They train law school students about the Internet and the public interest. They testify frequently before Congress about emerging civil liberties issues. They provide an extensive listing of privacy resources as well as a guide to practical privacy tools.

Remember when it became public that JetBlue (and other airlines) provided passenger information to the U.S. government in violation of their own privacy policies? Or when it was revealed that the CAPPS-II airline passenger profiling system would be used for other, non-terrorism, purposes? EPIC’s FOIA work uncovered those stories.

December 15th is the 213th anniversary of the ratification of the Bill of Rights. Read through it again today, and notice how the different laws protect the security of Americans. I’m proud to be a member of EPIC’s Advisory Board. They do good work, and we’re all a lot more secure because of it.

EPIC’s website

U.S. Bill of Rights

Posted on December 15, 2004 at 9:10 AM • 1 Comment

Security Notes from All Over: Israeli Airport Security Questioning

In both Secrets and Lies and Beyond Fear, I discuss a key difference between attackers and defenders: the ability to concentrate resources. The defender must defend against all possible attacks, while the attacker can concentrate his forces on one particular avenue of attack. This precept is fundamental to a lot of security, and can be seen very clearly in counterterrorism. A country is in the position of the interior; it must defend itself against all possible terrorist attacks: airplane terrorism, chemical bombs, threats at the ports, threats through the mails, lone lunatics with automatic weapons, assassinations, etc., etc., etc. The terrorist just needs to find one weak spot in the defenses, and exploit that. This concentration versus diffusion of resources is one reason why the defender’s job is so much harder than the attacker’s.

This same principle guides security questioning at the Ben Gurion Airport in Israel. In this example, the attacker is the security screener and the defender is the terrorist. (It’s important to remember that “attacker” and “defender” are not moral labels, but tactical ones. Sometimes the defenders are the good guys and the attackers are the bad guys. In this case, the bad guy is trying to defend his cover story against the good guy who is attacking it.)

Security is impressively tight at the airport, and includes a potentially lengthy interview by a trained security screener. The screener asks each passenger questions, trying to determine if he’s a security risk. But instead of asking different questions—where do you live, what do you do for a living, where were you born—the screener asks questions that follow a storyline: “Where are you going? Who do you know there? How did you meet him? What were you doing there?” And so on.

See the ability to concentrate resources? The defender—the terrorist trying to sneak aboard the airplane—needs a cover story sufficiently broad to be able to respond to any line of questioning. So he might memorize the answers to several hundred questions. The attacker—the security screener—could ask questions scattershot, but instead concentrates his questioning along one particular line. The theory is that eventually the defender will reach the end of his memorized story, and that the attacker will then notice the subtle changes in the defender as he starts to make up answers.

Posted on December 14, 2004 at 9:26 AM • 19 Comments

The Doghouse: Internet Security Foundation

This organization wants to sell their tool to view passwords in textboxes “hidden” by asterisks on Windows. They claim it’s “a glaring security hole in Microsoft Windows” and a “grave security risk.” Their webpage is thick with FUD, and warns that criminals and terrorists can easily clean out your bank accounts because of this problem.

Of course the problem isn’t that users type passwords into their computers. The problem is that programs don’t store passwords securely. The problem is that programs pass passwords around in plaintext. The problem is that users choose lousy passwords, and then store them insecurely. The problem is that financial applications are still relying on passwords for security, rather than two-factor authentication.

But the “Internet Security Foundation” is trying to make as much noise as possible. They even have this nasty letter to Bill Gates that you can sign (36 people had signed, the last time I looked). I’m not sure what their angle is, but I don’t like it.

Posted on December 13, 2004 at 1:32 PM • 12 Comments

Safe Personal Computing

I am regularly asked what average Internet users can do to ensure their security. My first answer is usually, “Nothing—you’re screwed.”

But that’s not true, and the reality is more complicated. You’re screwed if you do nothing to protect yourself, but there are many things you can do to increase your security on the Internet.

Two years ago, I published a list of PC security recommendations. The idea was to give home users concrete actions they could take to improve security. This is an update of that list: a dozen things you can do to improve your security.

General: Turn off the computer when you’re not using it, especially if you have an “always on” Internet connection.

Laptop security: Keep your laptop with you at all times when not at home; treat it as you would a wallet or purse. Regularly purge unneeded data files from your laptop. The same goes for PDAs. People tend to store more personal data—including passwords and PINs—on PDAs than they do on laptops.

Backups: Back up regularly. Back up to disk, tape or CD-ROM. There’s a lot you can’t defend against; a recent backup will at least let you recover from an attack. Store at least one set of backups off-site (a safe-deposit box is a good place) and at least one set on-site. Remember to destroy old backups. The best way to destroy CD-Rs is to microwave them on high for five seconds. You can also break them in half or run them through better shredders.
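
As a rough illustration of the advice above (my sketch, not part of the original essay; the paths are hypothetical), a dated archive of a folder can be made in a few lines:

```python
# A minimal sketch of a regular backup: zip a folder into a dated archive.
# Point "source" at your data and "dest_dir" at an external disk or other
# location you can rotate off-site; both paths here are hypothetical.
import os
import shutil
from datetime import date
from pathlib import Path

def backup(source: str, dest_dir: str) -> Path:
    Path(dest_dir).mkdir(parents=True, exist_ok=True)
    archive_base = Path(dest_dir) / f"backup-{date.today():%Y%m%d}"
    # make_archive appends ".zip" and returns the path of the archive it wrote
    return Path(shutil.make_archive(str(archive_base), "zip", root_dir=source))

if __name__ == "__main__":
    print(backup(os.path.expanduser("~/Documents"), "/tmp/backups"))
```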

Operating systems: If possible, don’t use Microsoft Windows. Buy a Macintosh or use Linux. If you must use Windows, set up Automatic Update so that you automatically receive security patches. And delete the files “command.com” and “cmd.exe.”

Applications: Limit the number of applications on your machine. If you don’t need it, don’t install it. If you no longer need it, uninstall it. Look into one of the free office suites as an alternative to Microsoft Office. Regularly check for updates to the applications you use and install them. Keeping your applications patched is important, but don’t lose sleep over it.

Browsing: Don’t use Microsoft Internet Explorer, period. Limit use of cookies and applets to those few sites that provide services you need. Set your browser to regularly delete cookies. Don’t assume a Web site is what it claims to be, unless you’ve typed in the URL yourself. Make sure the address bar shows the exact address, not a near-miss.

Web sites: Secure Sockets Layer (SSL) encryption does not provide any assurance that the vendor is trustworthy or that its database of customer information is secure.

Think before you do business with a Web site. Limit the financial and personal data you send to Web sites—don’t give out information unless you see a value to you. If you don’t want to give out personal information, lie. Opt out of marketing notices. If the Web site gives you the option of not storing your information for later use, take it. Use a credit card for online purchases, not a debit card.

Passwords: You can’t memorize good enough passwords any more, so don’t bother. For high-security Web sites such as banks, create long random passwords and write them down. Guard them as you would your cash: i.e., store them in your wallet, etc.
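
One way to create such a password is to generate it rather than invent it. A minimal sketch, using Python's standard secrets module (my addition, not part of the original essay):

```python
# A sketch of "create long random passwords and write them down":
# draw each character from a cryptographically secure random source.
import secrets
import string

def random_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(random_password())   # write it down and guard it like cash
```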

Never reuse a password for something you care about. (It’s fine to have a single password for low-security sites, such as for newspaper archive access.) Assume that all PINs can be easily broken and plan accordingly.

Never type a password you care about, such as for a bank account, into a non-SSL encrypted page. If your bank makes it possible to do that, complain to them. When they tell you that it is OK, don’t believe them; they’re wrong.

E-mail: Turn off HTML e-mail. Don’t automatically assume that any e-mail is from the “From” address.

Delete spam without reading it. Don’t open messages with file attachments, unless you know what they contain; immediately delete them. Don’t open cartoons, videos and similar “good for a laugh” files forwarded by your well-meaning friends; again, immediately delete them.

Never click links in e-mail unless you’re sure about the e-mail; copy and paste the link into your browser instead. Don’t use Outlook or Outlook Express. If you must use Microsoft Office, enable macro virus protection; in Office 2000, turn the security level to “high” and don’t trust any received files unless you have to. If you’re using Windows, turn off the “hide file extensions for known file types” option; it lets Trojan horses masquerade as other types of files. Uninstall the Windows Scripting Host if you can get along without it. If you can’t, at least change your file associations, so that script files aren’t automatically sent to the Scripting Host if you double-click them.

Antivirus and anti-spyware software: Use it—either a combined program or two separate programs. Download and install the updates, at least weekly and whenever you read about a new virus in the news. Some antivirus products automatically check for updates. Enable that feature and set it to “daily.”

Firewall: Spend $50 for a Network Address Translator firewall device; it’s likely to be good enough in default mode. On your laptop, use personal firewall software. If you can, hide your IP address. There’s no reason to allow any incoming connections from anybody.

Encryption: Install an e-mail and file encryptor (like PGP). Encrypting all your e-mail or your entire hard drive is unrealistic, but some mail is too sensitive to send in the clear. Similarly, some files on your hard drive are too sensitive to leave unencrypted.
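
For the file side of this advice, here is a minimal sketch of encrypting a sensitive file at rest. It uses the third-party cryptography package rather than PGP, purely as an illustration; the filename and key handling are stand-ins (my addition, not part of the original essay).

```python
# A sketch of keeping a sensitive file encrypted at rest, using the
# third-party "cryptography" package (pip install cryptography). The essay
# recommends PGP; this just illustrates the same idea with another tool.
from cryptography.fernet import Fernet

def encrypt_file(path: str, key: bytes) -> None:
    with open(path, "rb") as f:
        plaintext = f.read()
    with open(path + ".enc", "wb") as f:
        f.write(Fernet(key).encrypt(plaintext))

def decrypt_file(enc_path: str, key: bytes) -> bytes:
    with open(enc_path, "rb") as f:
        return Fernet(key).decrypt(f.read())

key = Fernet.generate_key()                  # keep the key somewhere safer than the file
encrypt_file("tax-records.txt", key)         # hypothetical filename
print(decrypt_file("tax-records.txt.enc", key)[:40])
```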

None of the measures I’ve described are foolproof. If the secret police wants to target your data or your communications, no countermeasure on this list will stop them. But these precautions are all good network-hygiene measures, and they’ll make you a more difficult target than the computer next door. And even if you only follow a few basic measures, you’re unlikely to have any problems.

I’m stuck using Microsoft Windows and Office, but I use Opera for Web browsing and Eudora for e-mail. I use Windows Update to automatically get patches and install other patches when I hear about them. My antivirus software updates itself regularly. I keep my computer relatively clean and delete applications that I don’t need. I’m diligent about backing up my data and about storing data files that are no longer needed offline.

I’m suspicious to the point of near-paranoia about e-mail attachments and Web sites. I delete cookies and spyware. I watch URLs to make sure I know where I am, and I don’t trust unsolicited e-mails. I don’t care about low-security passwords, but try to have good passwords for accounts that involve money. I still don’t do Internet banking. I have my firewall set to deny all incoming connections. And I turn my computer off when I’m not using it.

That’s basically it. Really, it’s not that hard. The hardest part is developing an intuition about e-mail and Web sites. But that just takes experience.

This essay previously appeared on CNet

Posted on December 13, 2004 at 9:59 AM • 45 Comments

Canada and the USA PATRIOT Act

The Information & Privacy Commissioner for the Province of British Columbia, Canada, has just published an extensive report titled “Privacy and the USA Patriot Act – Implications for British Columbia Public Sector Outsourcing.”

It’s an interesting trend. It’s one thing for countries to complain about U.S. counterterrorism policies, but it’s quite another for countries to reduce their commerce with the U.S. The latter will get noticed in Washington far quicker than the former.

Posted on December 10, 2004 at 8:48 AM • 2 Comments

The Digital Person

Last week, I stayed at the St. Regis hotel in Washington, DC. It was my first visit, and the management gave me a questionnaire, asking me things like my birthday, my spouse’s name and birthday, my anniversary, and my favorite fruits, drinks, and sweets. The purpose was clear; the hotel wanted to be able to offer me a more personalized service the next time I visited. And it was a purpose I agreed with; I wanted more personalized service. But I was very uneasy about filling out the form.

It wasn’t that the information was particularly private. I make no secret of my birthday, or anniversary, or food preferences. Much of that information is even floating around the Web somewhere. Secrecy wasn’t the issue.

The issue was control. In the United States, information about a person is owned by the person who collects it, not by the person it is about. There are specific exceptions in the law, but they’re few and far between. There are no broad data protection laws, as you find in the European Union. There are no Privacy Commissioners, as you find in Canada. Privacy law in the United States is largely about secrecy: if the information is not secret, there’s little you can do to control its dissemination.

As a result, enormous databases exist that are filled with personal information. These databases are owned by marketing firms, credit bureaus, and the government. Amazon knows what books we buy. Our supermarket knows what foods we eat. Credit card companies know quite a lot about our purchasing habits. Credit bureaus know about our financial history, and what they don’t know is contained in bank records. Health insurance records contain details about our health and well-being. Government records contain our Social Security numbers, birthdates, addresses, mother’s maiden names, and a host of other things. Many driver’s license records contain digital pictures.

All of this data is being combined, indexed, and correlated. And it’s being used for all sorts of things. Targeted marketing campaigns are just the tip of the iceberg. This information is used by potential employers to judge our suitability as employees, by potential landlords to determine our suitability as renters, and by the government to determine our likelihood of being a terrorist.

Some stores are beginning to use our data to determine whether we are desirable customers or not. If customers take advantage of too many discount offers or make too many returns, they may be profiled as “bad” customers and be treated differently from the “good” customers.

And with alarming frequency, our data is being abused by identity thieves. The businesses that gather our data don’t care much about keeping it secure. So identity theft is a problem where those who suffer from it—the individuals—are not in a position to improve security, and those who are in a position to improve security don’t suffer from the problem.

The issue here is not about secrecy, it’s about control. The issue is that both government and commercial organizations are building “digital dossiers” about us, and that these dossiers are being used to judge and categorize us through some secret process.

A new book by George Washington University Law Professor Daniel Solove examines the problem of the growing accumulation of personal information in enormous databases. The book is called The Digital Person: Technology and Privacy in the Information Age, and it is a fascinating read.

Solove’s book explores this problem from a legal perspective, explaining what the problem is, how current U.S. law fails to deal with it, and what we should do to protect privacy today. It’s an unusually perceptive discussion of one of the most vexing problems of the digital age—our loss of control over our personal information. It’s a fascinating journey into the almost surreal ways personal information is hoarded, used, and abused in the digital age.

Solove argues that our common conceptualization of the privacy problem as Big Brother—some faceless organization knowing our most intimate secrets—is only one facet of the issue. A better metaphor can be found in Franz Kafka’s The Trial. In the book, a vast faceless bureaucracy constructs a huge dossier about a person, who can’t find out what information exists about him in the dossier, why the information has been gathered, or what it will be used for. Privacy is not about intimate secrets; it’s about who has control of the millions of pieces of personal data that we leave like droppings as we go through our daily life. And until the U.S. legal system recognizes this fact, Americans will continue to live in a world where they have little control over their digital person.

In the end, I didn’t complete the questionnaire from the St. Regis Hotel. While I was fine with the St. Regis in Washington, DC, having that information to make my subsequent stays a little more personal, and was probably fine with that information being shared among other St. Regis hotels, I wasn’t comfortable with the St. Regis doing whatever they wanted with that information. I wasn’t comfortable with them selling the information to a marketing database. I wasn’t comfortable with anyone being able to buy that information. I wasn’t comfortable with that information ending up in a database of my habits, my preferences, my proclivities. It wasn’t the primary use of that information that bothered me, it was the secondary uses.

Solove has done much more thinking about this issue than I have. His book provides a clear account of the social problems involving information privacy, and haunting predictions about where current U.S. legal policies are taking us. Even more importantly, the legal solutions he provides are compelling and worth serious consideration. I recommend his book highly.

The book’s website

Order the book on Amazon

Posted on December 9, 2004 at 9:18 AM • 12 Comments

Phishing by Cell Phone

From an alert reader:

I don’t know whether to tell you, or RISKS, or the cops, but I just received an automated call on my cellphone that asked for the last four digits of my Social Security number. The script went:

Hello! This is not a solicitation! We have an important message for J-O-H-N DOE (my first name was spelled out, but the last name was pronounced). If this is J-O-H-N Doe, Press 1 now!

(after pressing 1:)

For your security, please enter the last four digits of your Social Security Number!

I have no idea who it was, because I’ll be—damned—if I’d give out ANY digits of my SSN to an unidentified party. My cell’s display is broken so I’m not sure whether there was any caller ID information on it, but I also know that can be forged. What company expects its customers to give up critical data like that during an unidentified, unsolicited call?

Sadly, there probably are well-meaning people writing automatic telephone scripts that ask this sort of question. But this could very well be a phishing scheme: someone trying to trick the listener into divulging personal information.

In general, my advice is to not divulge this sort of information when you are called. There’s simply no way to verify who the caller is. Far safer is for you to make the call.

For example, I regularly receive calls from the anti-fraud division of my credit card company checking up on particular charges. I always hang up on them and call them back, using the phone number on the back of my card. That gives me more confidence that I’m speaking to a legitimate representative of my credit card company.

Posted on December 7, 2004 at 1:58 PM • 48 Comments

Airline Security and the TSA

Recently I received this e-mail from an anonymous Transportation Security Administration employee—those are the guys who screen you at airports—in response to something I wrote about airline security:

I was going through my email archives and found a link to a story. Apparently you enjoy attacking TSA, and relish in stories where others will do it for you. I work for TSA, and understand that a lot of what they do is little more than “window dressing” (your words). However, very few can argue that they are a lot more effective than the rent-a-cop agencies that were supposed to be securing the airports pre-9/11.

Specifically to the story, it has all the overtones of Urban Legend: overly emotional, details about the event but only giving names of self and “pet,” overly verbose, etc. Bottom line, that the TSA screener and supervisor told our storyteller that the fish was “in no way… allowed to pass through security” is in direct violation of publicly accessible TSA policy. Fish may be unusual, but they’re certainly not forbidden.

I’m disappointed, Bruce. Usually you’re well researched. Your articles and books are very well documented and cross-referenced. However, when it comes to attacking TSA, you seem to take some stories at face value without verifying the facts and TSA policies. I’m also disappointed that you would popularize a story that implicitly tells people to hide their “prohibited items” from security. I have personally witnessed people get arrested for thinking they were clever in hiding something they shouldn’t be carrying anyway.

For those who don’t want to follow the story, it’s about a college student who was told by TSA employees that she could not take her fish on the airplane for security reasons. She then smuggled the fish aboard by hiding it in her carry-on luggage. Final score: fish 1, TSA 0.

To the points in the letter:

  1. You may be right that the story is an urban legend. But it did appear in a respectable newspaper, and I hope the newspaper did at least some fact-checking. I may have been overly optimistic.

  2. You are certainly right that pets are allowed on board airplanes. But just because something is official TSA policy doesn’t mean it’s necessarily followed in the field. There have been many instances of TSA employees inventing rules. It doesn’t surprise me in the least that one of them refused to allow a fish on an airplane.

  3. I am happy to popularize a story that implicitly tells people to hide prohibited items from airline security. I’m even happy to explicitly tell people to hide prohibited items from airline security. A friend of mine recently figured out how to reliably sneak her needlepoint scissors through security—they’re the foldable kind, and she slips them against a loose leaf binder—and I am pleased to publicize that. Hell, I’ve even explained how to fly on someone else’s airline ticket and make your own knife on board an airplane [Beyond Fear, page 85].

  4. I think airline passenger screening is inane. It’s invasive, expensive, time-consuming, and doesn’t make us safer. I think that civil disobedience is a perfectly reasonable reaction.

  5. Honestly, you won’t get arrested if you simply play dumb when caught. Unless, that is, you’re smuggling an actual gun or bomb aboard an aircraft, in which case you probably deserve to get arrested.

Posted on December 6, 2004 at 9:15 AM • 28 Comments

Sensible Security from New Zealand

I like the way this guy thinks about security as a trade-off:

In the week United States-led forces invaded Iraq, the service was receiving a hoax bomb call every two or three hours, but not one aircraft was delayed. Security experts decided the cost of halting flights far outweighed the actual risk to those on board.

It’s a short article, and in it Mark Everitt, General Manager of the New Zealand Aviation Security Service, says that small knives should be allowed on flights, and that sky marshals should not.

Before 9/11, New Zealand domestic flights had no security at all, because there simply wasn’t anywhere to hijack a flight to.

Posted on December 3, 2004 at 10:00 AM • 8 Comments

Striking Back Against Spammers

From The Register:

Lycos Europe has started to distribute a special screensaver (http://makelovenotspam.com/intl) in a controversial bid to battle spam. The program—titled Make Love Not Spam, and available for Windows and the Mac OS—sends a request to view a spam source site. When a large number of screensavers send their requests at the same time the spam web page becomes overloaded and slow.

I don’t like spam either, but this is not how to go about defeating it. It’s vigilante justice, and it’s morally and ethically wrong.

I’ve written about it before:

…vigilantism: citizens and companies taking the law into their own hands and going after their assailants. Viscerally, it’s an appealing idea. But it’s a horrible one, and one that society after society has eschewed.

Our society does not give us the right of revenge, and wouldn’t work very well if it did. Our laws give us the right to justice, in either the criminal or civil context. Justice is all we can expect if we want to enjoy our constitutional freedoms, personal safety, and an orderly society.

Anyone accused of a crime deserves a fair trial. He deserves the right to defend himself, the right to face his accuser, the right to an attorney, and the right to be held innocent until proven guilty.

Vigilantism flies in the face of these rights. It punishes people before they have been found guilty. Angry mobs lynching someone suspected of murder is wrong, even if that person is actually guilty.

As emotionally satisfying as it might be to get back at the spammers, as much as the spammers deserve it, please think twice before downloading and using this tool.

UPDATE: Another danger—this kind of thing easily escalates as those counterattacking are, in turn, attacked back.

Posted on December 2, 2004 at 9:38 AM • 18 Comments

An Impressive Car Theft

The armored Mercedes belonging to the CEO of DaimlerChrysler has been stolen:

The black company car, which is worth about 800,000 euros ($1 million), disappeared on the night of Oct. 26, police spokesman Klaus-Peter Arand said in a telephone interview. The limousine, which sports a 12-cylinder engine and is equipped with a broadcasting device to help retrieve the car, hasn’t yet been found, the police said.

There are two types of thieves, whether they be car thieves or otherwise. First, there are the thieves that want a car, any car. And second, there are the thieves that want one particular car. Against the first type, any security measure that makes your car harder to steal than the car next to it is good enough. Against the second type, even a sophisticated GPS tracking system might not be enough.

Posted on December 1, 2004 at 11:01 AM • 17 Comments
