Blog: 2004 Archives

Wi-Fi Shielding Paint

I have no idea how well this works, but it’s a clever idea. From Information Week:

Force Field Wireless makes three products that it says can dramatically reduce the leakage of wireless signals from a room or building.

One odd side-point from the article:

Force Field has been trying to interest the Department of Homeland Security, but discussions are ongoing, Wray says. “Ironically, we have had foreign governments contact us—from the Middle East. Kind of scary.” Wray says he won’t sell to them.

I wonder what’s so scary about selling metal paint to a Middle Eastern government. Maybe the company thinks they will use the paint to “cover up” their misdeeds or poison political prisoners?

Posted on December 30, 2004 at 5:52 PM

Canadian Airport Security Loses Uniforms

From CBC News:

1,127 uniform items belonging to Canadian airport screeners were lost or stolen in a nine-month period.

I’m not sure if this is an interesting story or not. We know that a uniform isn’t necessarily a reliable authentication tool, yet we use them anyway.

Losing 1,127 uniforms is bad, because they can be used to impersonate officials. But even if the 1,127 uniforms are found, they can be faked. Can you tell the difference between a legitimate uniform and a decent fake? I can’t.

The real story is the informal nature of most of our real-world authentication systems, and how they can be exploited.

I wrote about this in Beyond Fear (page 199):

Many authentication systems are even more informal. When someone knocks on your door wearing an electric company uniform, you assume she’s there to read the meter. Similarly with deliverymen, service workers, and parking lot attendants. When I return my rental car, I don’t think twice about giving the keys to someone wearing the correct color uniform. And how often do people inspect a police officer’s badge? The potential for intimidation makes this security system even less effective.

Uniforms are easy to fake. In the wee hours of the morning on 18 March 1990, two men entered the Isabella Stewart Gardner Museum in Boston disguised as policemen. They duped the guards, tied them up, and proceeded to steal a dozen paintings by Rembrandt, Vermeer, Manet, and Degas, valued at $300 million. (Thirteen years later, the crime is still unsolved and the art is still missing.) During the Battle of the Bulge in World War II, groups of German commandos operated behind American lines. Dressed as American troops, they tried to deliver false orders to units in an effort to disrupt American plans. Hannibal used the same trick—to greater success—dressing up soldiers who were fluent in Latin in the uniforms of Roman officials and using them to open city gates.

Spies actually take advantage of this authentication problem when recruiting agents. They sometimes recruit a spy by pretending to be working for some third country. For example, a Russian agent working in the U.S. might not be able to convince an American to spy for Russia, but he can pretend to be working for France and might be able to convince the person to spy for that country. This is called “false flag recruitment.” How’s the recruit going to authenticate the nationality of the person he’s spying for?

There’s some fascinating psychology involved in this story. We all authenticate using visual cues, and official uniforms are a big part of that. (When a policeman, or an employee from the local electric company, comes to your door and asks to come in, how do you authenticate him? His uniform and his badge or ID.)

Posted on December 29, 2004 at 8:37 AM

Bad Quote

In an AP story on the computer glitch that forced Comair to cancel 1,100 flights on Christmas Day, I was quoted as saying:

“If this kind of thing could happen by accident, what would happen if the bad guys did this on purpose?” he said.

I’m sure I said that, but I wish the reporter hadn’t used it. It’s just the sort of fear-mongering that I object to when others do it.

Posted on December 28, 2004 at 8:58 AM

Physical Access Control

In Los Angeles, the “HOLLYWOOD” sign is protected by a fence and a locked gate. Because several different agencies need access to the sign for various purposes, the chain locking the gate is formed by several locks linked together. Each of the agencies has the key to its own lock, and not the key to any of the others. Of course, anyone who can open one of the locks can open the gate.

This is a nice example of a multiple-user access-control system. It’s simple, and it works. You can also make it as complicated as you want, with different locks in parallel and in series.
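The parallel-versus-series distinction can be sketched in a few lines of code. This is a minimal illustration, not any agency's real system; the lock and agency names are made up. Parallel locks act as a boolean OR (any one key opens the gate), while locks in series act as an AND (every key is needed).

```python
# Model each lock as a predicate over the set of keys presented.

def lock(key_id):
    """A single padlock that opens only for its own key."""
    return lambda keys: key_id in keys

def parallel(*locks):
    """Locks interlinked in a chain: opening any one opens the gate."""
    return lambda keys: any(check(keys) for check in locks)

def series(*locks):
    """Locks in series: every one must be opened to get through."""
    return lambda keys: all(check(keys) for check in locks)

# The Hollywood-sign arrangement: one lock per agency, linked in parallel.
gate = parallel(lock("city"), lock("parks"), lock("utility"))
print(gate({"parks"}))       # True: one agency's own key suffices
print(gate({"contractor"}))  # False: no recognized key

# More complicated policies compose: require the city's key AND
# any one of the other two agencies' keys.
stricter = series(lock("city"), parallel(lock("parks"), lock("utility")))
print(stricter({"city", "utility"}))  # True
```

Composing `parallel` and `series` this way is exactly the "as complicated as you want" point: arbitrary AND/OR access policies fall out of two combinators.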

Posted on December 23, 2004 at 8:36 AM

Airline Passenger Profiling

From an anonymous reader who works for the airline industry in the United States:

There are two initiatives in the works, neither of which leaves me feeling very good about privacy rights.

The first is being put together by the TSA and is called the “Secure Flight Initiative.” An initial test of this program was performed recently and involved each airline listed in the document having to send in passenger information (aka PNR data) for every passenger that “completed a successful domestic trip” during June 2004. A sample of some of the fields that were required to be sent: name, address, phone (if available), itinerary, any comments in the PNR record made by airline personnel, credit card number and expiration date, and any changes made to the booking before the actual flight.

This test data was transmitted to the TSA via physical CD. The requirement was that we “encrypt” it using pkzip (or equivalent) before putting it on the CD. We were to then e-mail the password to the Secure Flight Initiative e-mail address. Although this is far from ideal, it is in fact a big step up. The original process was going to have people simply e-mail the above data to the TSA. They claim to have a secure facility where the data is stored.

As far as the TSA’s retention of the data, the only information we have been given is that as soon as the test phase is over, they will securely delete the data. We were given no choice but had to simply take their word for it.

Rollout of the Secure Flight initiative is scheduled for “next year” sometime. They’re going to start with larger carriers and work their way down to the smaller carriers. It hasn’t been formalized (as far as I know) yet as to what data will be required to be transmitted when. My suspicion is that upon flight takeoff, all PNR data for all passengers on board will be required to be sent. At this point, I still have not heard as to what method will be used for data transmission.

There is another initiative being implemented by the Customs and Border Protection, which is part of the Department of Homeland Security. This (unnamed) initiative is essentially the same thing as the Secure Flight program. That’s right—two government agencies are requiring us to transmit the information separately to each of them. So much for information sharing within the government.

Most larger carriers are complying with this directive by simply allowing the CBP access to their records directly within their reservation systems (often hosted by folks like Sabre, Worldspan, Galileo, etc.). Others (such as the airline I work for) are opting to only transmit the bare requirements without giving direct access to our system. The data is transmitted over a proprietary data network that is used by the airline industry.

There are a couple of differences between the Secure Flight program and the one being instituted by the CBP. The CBP’s program requires that PNR data for all booked passengers be transmitted:

  • 72 hours before flight time
  • 24 hours before flight time
  • 8 hours before flight time
  • and then again immediately after flight departure
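The schedule above is mechanical enough to express as code. A small sketch, assuming the directive works off scheduled departure time (the actual CBP timing rules may differ in detail):

```python
from datetime import datetime, timedelta

def pnr_transmission_times(departure):
    """Given a flight's departure time, return the four times at which
    PNR data for all booked passengers must be transmitted to CBP."""
    return [
        departure - timedelta(hours=72),
        departure - timedelta(hours=24),
        departure - timedelta(hours=8),
        departure,  # "immediately after flight departure"
    ]

# Hypothetical flight departing at noon:
for t in pnr_transmission_times(datetime(2004, 12, 22, 12, 0)):
    print(t.isoformat(" "))
```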

The other major difference is that it looks as though there will be a requirement that we operate in a way that allows them to send a request for data for any flight at any time which we must send back in an automated fashion.

Oh, and just as a kick in the pants, the airlines are expected to pay the costs for all these data transmissions (to the tune of several thousand dollars a month).

Posted on December 22, 2004 at 10:06 AM

How Not to Test Airport Security

If this were fiction, no one would believe it. From MSNBC:

Four days after police at Charles de Gaulle Airport slipped some plastic explosives into a random passenger’s bag as part of an exercise for sniffer dogs, it is still missing—and authorities are stumped and embarrassed.

It’s perfectly reasonable to plant an explosive-filled suitcase in an airport in order to test security. It is not okay to plant it in someone’s bag without his knowledge and permission. (The explosive residue could remain on the suitcase long after the test, and might be picked up by one of those trace mass spectrometers that detect the chemical residue associated with bombs.) But if you are going to plant plastic explosives in the suitcase of some innocent passenger, shouldn’t you at least write down which suitcase it was?

Posted on December 20, 2004 at 9:13 AM

Burglars and “Feeling Secure”

From Confessions of a Master Jewel Thief by Bill Mason (Villard, 2003):

Nothing works more in a thief’s favor than people feeling secure. That’s why places that are heavily alarmed and guarded can sometimes be the easiest targets. The single most important factor in security—more than locks, alarms, sensors, or armed guards—is attitude. A building protected by nothing more than a cheap combination lock but inhabited by people who are alert and risk-aware is much safer than one with the world’s most sophisticated alarm system whose tenants assume they’re living in an impregnable fortress.

The author, a burglar, found that luxury condos were an excellent target. Although they had much more security technology than other buildings, they were vulnerable because no one believed a thief could get through the lobby.

Posted on December 17, 2004 at 9:21 AM
