Entries Tagged "DHS"

Page 32 of 39

Smart Profiling from the DHS

About time:

Here’s how it works: Select TSA employees will be trained to identify suspicious individuals who raise red flags by exhibiting unusual or anxious behavior, which can be as simple as changes in mannerisms, excessive sweating on a cool day, or changes in the pitch of a person’s voice. Racial or ethnic factors are not a criterion for singling out people, TSA officials say. Those who are identified as suspicious will be examined more thoroughly; for some, the agency will bring in local police to conduct face-to-face interviews and perhaps run the person’s name against national criminal databases and determine whether any threat exists. If such inquiries turn up other issues, such as ties to countries with terrorist connections, police officers can pursue the questioning or alert federal counterterrorism agents. And of course the full retinue of baggage x-rays, magnetometers and other checks for weapons will continue.

Posted on May 23, 2006 at 6:20 AM

"The TSA's Constitution-Free Zone"

Interesting first-person account of someone on the U.S. Terrorist Watch List:

To sum up, if you run afoul of the nation’s “national security” apparatus, you’re completely on your own. There are no firm rules, no case law, no real appeals processes, no normal array of Constitutional rights, no lawyers to help, and generally none of the other things that we as American citizens expect to be able to fall back on when we’ve been (justly or unjustly) identified by the government as wrong-doers.

Posted on May 12, 2006 at 1:38 PM

The DHS Secretly Shares European Passenger Data in Violation of Agreement

From the ACLU:

In 2003, the United States and the European Union reached an agreement under which the EU would share Passenger Name Record (PNR) data with the U.S., despite the lack of privacy laws in the United States adequate to ensure Europeans’ privacy. In return, DHS agreed that the passenger data would not be used for any purpose other than preventing acts of terrorism or other serious crimes. It is now clear that DHS did not abide by that agreement.

Posted on May 8, 2006 at 6:34 AM

Software Failure Causes Airport Evacuation

Last month I wrote about airport passenger screening, and mentioned that the X-ray equipment inserts “test” bags into the stream in order to keep screeners more alert. That system failed pretty badly earlier this week at Atlanta’s Hartsfield-Jackson Airport, when a false alarm resulted in a two-hour evacuation of the entire airport.

The screening system injects test images onto the screen. Normally the software flashes the words “This is a test” on the screen after a brief delay, but this time the software failed to indicate that. The screener noticed the image (of a “suspicious device,” according to CNN) and, per procedure, screeners manually checked the bags on the conveyor belt for it. They couldn’t find it, of course, but they evacuated the airport and spent two hours vainly searching for it.

Hartsfield-Jackson is the country’s busiest passenger airport and Delta’s primary hub. The delays were felt across the country for the rest of the day.

Okay, so what went wrong here? Clearly the software failed. Just as clearly the screener procedures didn’t fail—everyone did what they were supposed to do.

What is less obvious is that the system failed. It failed, because it was not designed to fail well. A small failure—in this case, a software glitch in a single X-ray machine—cascaded in such a way as to shut down the entire airport. This kind of failure magnification is common in poorly designed security systems. Better would be for there to be individual X-ray machines at the gates—I’ve seen this design at several European airports—so that when there’s a problem the effects are restricted to that gate.

Of course, this distributed security solution would be more expensive. But I’m willing to bet it would be cheaper overall, taking into account the cost of occasionally clearing out an airport.
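The trade-off above can be made concrete with a back-of-the-envelope model. All the numbers here are assumptions for illustration (the post gives only the two-hour outage); the point is the hundredfold difference in disruption footprint, not the specific figures.

```python
# Illustrative model (assumed numbers, not from the post): compare the
# disruption footprint of a single X-ray glitch under one centralized
# checkpoint vs. independent per-gate screening.

GATES = 100                 # assumed number of gates at a large hub
PASSENGERS_PER_GATE = 200   # assumed passengers affected per gate
OUTAGE_HOURS = 2            # the two-hour evacuation from the incident

def passenger_hours_lost(failed_units: int, passengers_per_unit: int,
                         hours: int) -> int:
    """Total passenger-hours of delay caused by the failure."""
    return failed_units * passengers_per_unit * hours

# Centralized: one glitch takes down screening for the whole airport.
centralized = passenger_hours_lost(GATES, PASSENGERS_PER_GATE, OUTAGE_HOURS)

# Distributed: the same glitch affects only the gate where it occurred.
distributed = passenger_hours_lost(1, PASSENGERS_PER_GATE, OUTAGE_HOURS)

print(centralized)  # 40000 passenger-hours
print(distributed)  # 400 passenger-hours
```

Under these assumptions, the distributed design contains the failure to 1% of the passengers, which is the sense in which it "fails well."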

Posted on April 21, 2006 at 12:49 PM

DHS Releases RFP for Secure Border Initiative

The Department of Homeland Security has released a Request for Proposal—that’s the document asking industry if anyone can do what it wants—for the Secure Border Initiative. Washington Technology has the story:

The long-awaited request for proposals for Secure Border Initiative-Net was released today by the Homeland Security Department, which is calling the project the “most comprehensive effort in the nation’s history” to gain control of the borders.

The 144-page document outlines the purpose and scope of the border surveillance technology program, which supplements other efforts to control the border and enforce immigration laws.

Posted on April 19, 2006 at 7:12 AM

Security Screening for New York Helicopters

There’s a helicopter shuttle that runs from Lower Manhattan to Kennedy Airport. It’s basically a luxury item: for $139 you can avoid the drive to the airport. But, of course, security screeners are required for passengers, and that’s causing some concern:

At the request of U.S. Helicopter’s executives, the federal Transportation Security Administration set up a checkpoint, with X-ray and bomb-detection machines, to screen passengers and their luggage at the heliport.

The security agency is spending $560,000 this year to operate the checkpoint with a staff of eight screeners and is considering adding a checkpoint at the heliport at the east end of 34th Street. The agency’s involvement has drawn criticism from some elected officials.

“The bottom line here is that there are not enough screeners to go around,” said Senator Charles E. Schumer, Democrat of New York. “The fact that we are taking screeners that are needed at airports to satisfy a luxury market on the government’s dime is a problem.”

This is not a security problem; it’s an economics problem. And it’s a good illustration of the concept of “externalities.” An externality is an effect of a decision not borne by the decision-maker. In this example, U.S. Helicopter made a business decision to offer this service at a certain price. And customers will make a decision about whether or not the service is worth the money. But there is more to the cost than the $139. The cost of that checkpoint is an externality to both U.S. Helicopter and its customers, because the $560,000 spent on the security checkpoint is paid for by taxpayers. Taxpayers are effectively subsidizing the true cost of the helicopter trip.

The only way to solve this is for the government to bill the airline passengers for the cost of security screening. It wouldn’t be much per ticket, maybe $15. And it would be much less at major airports, because the economies of scale are so much greater.

The article even points out that customers would gladly pay the extra $15 because of another externality: the people who decide whether or not to take the helicopter trip are not the people actually paying for it.

Bobby Weiss, a self-employed stock trader and real estate broker who was U.S. Helicopter’s first paying customer yesterday, said he would pay $300 for a round trip to Kennedy, and he expected most corporate executives would, too.

“It’s $300, but so what? It goes on the expense account,” said Mr. Weiss, adding that he had no qualms about the diversion of federal resources to smooth the path of highfliers. “Maybe a richer guy may save a little time at the expense of a poorer guy who spends a little more time in line.”

What Mr. Weiss is saying is that the costs—both the direct cost and the cost of the security checkpoint—are externalities to him, so he really doesn’t care. Exactly.

Posted on April 4, 2006 at 7:51 AM

Airport Passenger Screening

It seems like every time someone tests airport security, airport security fails. In tests between November 2001 and February 2002, screeners missed 70 percent of knives, 30 percent of guns and 60 percent of (fake) bombs. And recently (see also this), testers were able to smuggle bomb-making parts through airport security in 21 of 21 attempts. It makes you wonder why we’re all putting our laptops in a separate bin and taking off our shoes. (Although we should all be glad that Richard Reid wasn’t the “underwear bomber.”)

The failure to detect bomb-making parts is easier to understand. Break up something into small enough parts, and it’s going to slip past the screeners pretty easily. The explosive material won’t show up on the metal detector, and the associated electronics can look benign when disassembled. This isn’t even a new problem. It’s widely believed that the Chechen women who blew up the two Russian planes in August 2004 probably smuggled their bombs aboard the planes in pieces.

But guns and knives? That surprises most people.

Airport screeners have a difficult job, primarily because the human brain isn’t naturally adapted to the task. We’re wired for visual pattern matching, and are great at picking out something we know to look for—for example, a lion in a sea of tall grass.

But we’re much less adept at detecting random exceptions in uniform data. Faced with an endless stream of identical objects, the brain quickly concludes that everything is identical and there’s no point in paying attention. By the time the exception comes around, the brain simply doesn’t notice it. This psychological phenomenon isn’t just a problem in airport screening: It’s been identified in inspections of all kinds, and is why casinos move their dealers around so often. The tasks are simply mind-numbing.

To make matters worse, the smuggler can try to exploit the system. He can position the weapons in his baggage just so. He can try to disguise them by adding other metal items to distract the screeners. He can disassemble bomb parts so they look nothing like bombs. Against a bored screener, he has the upper hand.

And, as has been pointed out again and again in essays on the ludicrousness of post-9/11 airport security, improvised weapons are a huge problem. A rock, a battery for a laptop, a belt, the extension handle off a wheeled suitcase, fishing line, the bare hands of someone who knows karate … the list goes on and on.

Technology can help. X-ray machines already randomly insert “test” bags into the stream—keeping screeners more alert. Computer-enhanced displays are making it easier for screeners to find contraband items in luggage, and eventually the computers will be able to do most of the work. It makes sense: Computers excel at boring repetitive tasks. They should do the quick sort, and let the screeners deal with the exceptions.

Sure, there’ll be a lot of false alarms, and some bad things will still get through. But it’s better than the alternative.
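Why there will be "a lot of false alarms" follows from base rates: real threats are so rare relative to harmless bags that even an accurate detector's alarms are overwhelmingly false. A quick Bayes' rule sketch makes this concrete; all three rates below are assumptions chosen only to show the shape of the problem.

```python
# Base-rate sketch of why automated screening generates mostly false
# alarms. All rates are assumptions for illustration.

P_THREAT = 1e-7        # assumed prevalence: one real threat per 10M bags
SENSITIVITY = 0.99     # assumed: detector flags 99% of real threats
FALSE_POSITIVE = 0.01  # assumed: flags 1% of harmless bags

# P(bag is flagged), over both threat and non-threat bags
p_flag = SENSITIVITY * P_THREAT + FALSE_POSITIVE * (1 - P_THREAT)

# Bayes' rule: P(real threat | bag was flagged)
p_threat_given_flag = SENSITIVITY * P_THREAT / p_flag

print(p_threat_given_flag)  # ~1e-5: nearly every alarm is a false one
```

This is exactly why the division of labor makes sense: the computer does the cheap first pass and the human screeners spend their attention on the small flagged fraction, rather than on the entire stream.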

And it’s likely good enough. Remember the point of passenger screening. We’re not trying to catch the clever, organized, well-funded terrorists. We’re trying to catch the amateurs and the incompetent. We’re trying to catch the unstable. We’re trying to catch the copycats. These are all legitimate threats, and we’re smart to defend against them. Against the professionals, we’re just trying to add enough uncertainty into the system that they’ll choose other targets instead.

The terrorists’ goals have nothing to do with airplanes; their goals are to cause terror. Blowing up an airplane is just a particular attack designed to achieve that goal. Airplanes deserve some additional security because they have catastrophic failure properties: If there’s even a small explosion, everyone on the plane dies. But there’s a diminishing return on investments in airplane security. If the terrorists switch targets from airplanes to shopping malls, we haven’t really solved the problem.

What that means is that a basic cursory screening is good enough. If I were investing in security, I would fund significant research into computer-assisted screening equipment for both checked and carry-on bags, but wouldn’t spend a lot of money on invasive screening procedures and secondary screening. I would much rather have well-trained security personnel wandering around the airport, both in and out of uniform, looking for suspicious actions.

When I travel in Europe, I never have to take my laptop out of its case or my shoes off my feet. Those governments have had far more experience with terrorism than the U.S. government, and they know when passenger screening has reached the point of diminishing returns. (They also implemented checked-baggage security measures decades before the United States did—again recognizing the real threat.)

And if I were investing in security, I would invest in intelligence and investigation. The best time to combat terrorism is before the terrorist tries to get on an airplane. The best countermeasures have value regardless of the nature of the terrorist plot or the particular terrorist target.

In some ways, if we’re relying on airport screeners to prevent terrorism, it’s already too late. After all, we can’t keep weapons out of prisons. How can we ever hope to keep them out of airports?

A version of this essay originally appeared on Wired.com.

Posted on March 23, 2006 at 7:03 AM

DHS Privacy and Integrity Report

Last year, the Department of Homeland Security finally got around to appointing its DHS Data Privacy and Integrity Advisory Committee. It was mostly made up of industry insiders instead of anyone with any real privacy experience. (Lance Hoffman from George Washington University was the most notable exception.)

And now, we have something from that committee. On March 7th they published their “Framework for Privacy Analysis of Programs, Technologies, and Applications.”

This document sets forth a recommended framework for analyzing programs, technologies, and applications in light of their effects on privacy and related interests. It is intended as guidance for the Data Privacy and Integrity Advisory Committee (the Committee) to the U.S. Department of Homeland Security (DHS). It may also be useful to the DHS Privacy Office, other DHS components, and other governmental entities that are seeking to reconcile personal data-intensive programs and activities with important social and human values.

It’s surprisingly good.

I like that it is a series of questions a program manager has to answer: about the legal basis for the program, its efficacy against the threat, and its effects on privacy. I am particularly pleased that their questions on pages 3-4 are very similar to the “five steps” I wrote about in Beyond Fear. I am thrilled that the document takes a “trade-off” approach; the last question asks: “Should the program proceed? Do the benefits of the program…justify the costs to privacy interests….?”

I think this is a good starting place for any technology or program with respect to security and privacy. And I hope the DHS actually follows the recommendations in this report.

Posted on March 21, 2006 at 3:07 PM

