Entries Tagged "searches"

Searching Bags in Subways

The New York City police will begin randomly searching people’s bags on subways, buses, commuter trains, and ferries.

“The police can and should be aggressively investigating anyone they suspect is trying to bring explosives into the subway,” said Christopher Dunn, associate legal director at the New York Civil Liberties Union. “However, random police searches of people without any suspicion of wrongdoing are contrary to our most basic constitutional values. This is a very troubling announcement.”

If the choice is between random searching and profiling, then random searching is a more effective security countermeasure. But Dunn is correct above when he says that there are some enormous trade-offs in liberty. And I don’t think we’re getting very much security in return.

Especially considering this:

[Police Commissioner Raymond] Kelly stressed that officers posted at subway entrances would not engage in racial profiling, and that passengers are free to “turn around and leave.”

“Okay guys; here are your explosives. If one of you gets singled out for a search, just turn around and leave. And then go back in via another entrance, or take a taxi to the next subway stop.”

And I don’t think they’ll be truly random, either. I think the police doing the searching will profile, because that’s what happens.

It’s another “movie plot threat.” It’s another “public relations security system.” It’s a waste of money, it substantially reduces our liberties, and it won’t make us any safer.

Final note: I often get comments along the lines of “Stop criticizing stuff; tell us what we should do.” My answer is always the same. Counterterrorism is most effective when it doesn’t make arbitrary assumptions about the terrorists’ plans. Stop searching bags on the subways, and spend the money on 1) intelligence and investigation—stopping the terrorists regardless of what their plans are, and 2) emergency response—lessening the impact of a terrorist attack, regardless of what the plans are. Countermeasures that defend against particular targets, or assume particular tactics, or cause the terrorists to make insignificant modifications in their plans, or that surveil the entire population looking for the few terrorists, are largely not worth it.

EDITED TO ADD: A Citizen’s Guide to Refusing New York Subway Searches.

Posted on July 22, 2005 at 6:27 AM

DNA Identification

Here’s an interesting application of DNA identification. Instead of searching for your DNA at the crime scene, they search for the crime-scene DNA on you.

The system, called Sentry, works by fitting a box containing a powder spray above a doorway which, once primed, goes into alert mode if the door is opened.

It then sprays the powder when there is movement in the doorway again.

The aim is to catch a burglar in the act as stolen items are being removed.

The intruder is covered in the bright red powder, which glows under ultraviolet (UV) light and can only be removed with heavy scrubbing.

However, the harmless synthetic DNA contained in the powder sinks into the skin and takes several days, depending on the person’s metabolism, to work its way out.

Posted on June 22, 2005 at 8:39 AM

Backscatter X-Ray Technology

Backscatter X-ray technology is a method of using X rays to see inside objects. The science is complicated, but the upshot is that you can see people naked:

The application of this new x-ray technology to airport screening uses high energy x-rays that are more likely to scatter than penetrate materials as compared to lower-energy x-rays used in medical applications. Although this type of x-ray is said to be harmless, it can move through other materials, such as clothing.

A passenger is scanned by rastering or moving a single high energy x-ray beam rapidly over their form. The signal strength of detected backscattered x-rays from a known position then allows a highly realistic image to be reconstructed. Since only Compton scattered x-rays are used, the registered image is mainly that of the surface of the object/person being imaged. In the case of airline passenger screening it is her nude form. The image resolution of the technology is high, so details of the human form of airline passengers present privacy challenges.
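A back-of-the-envelope way to see why the registered image is mostly of the surface: a photon that scatters back from deeper inside the body has to pass through the overlying material twice, so its contribution to the detected signal falls off roughly exponentially with depth. The sketch below is purely illustrative; the single-scatter model and the attenuation coefficient are my assumptions, not figures from the article.

import math

# Assumed linear attenuation coefficient for soft tissue at screening
# x-ray energies, in 1/cm (illustrative value only).
mu = 0.25

def backscatter_fraction(depth_cm):
    """Fraction of the total backscatter signal originating above depth_cm,
    under a toy single-scatter model where a photon returning from depth x
    is attenuated by exp(-2 * mu * x) for the round trip."""
    return 1.0 - math.exp(-2.0 * mu * depth_cm)

for d in (0.5, 1.0, 2.0, 5.0):
    print(f"signal from the first {d:.1f} cm: {backscatter_fraction(d):.0%}")

Under these assumed numbers, most of the detected signal comes from the first few centimeters, which is why the reconstructed picture is essentially a surface image rather than a medical-style view of the interior.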

EPIC’s “Spotlight on Security” page is an excellent resource on this issue.

The TSA has recently announced a proposal to use these machines to screen airport passengers.

I’m not impressed with this security trade-off. Yes, backscatter X-ray machines might be able to detect things that conventional screening might miss. But I already think we’re spending too much effort screening airplane passengers at the expense of screening luggage and airport employees…to say nothing of the money we should be spending on non-airport security.

On the other side, these machines are expensive and the technology is incredibly intrusive. I don’t think that people should be subjected to strip searches before they board airplanes. And I believe that most people would be appalled by the prospect of security screeners seeing them naked.

I believe that there will be a groundswell of popular opposition to this idea. Aside from the usual list of pro-privacy and pro-liberty groups, I expect fundamentalist Christian groups to be appalled by this technology. I think we can get a bevy of supermodels to speak out against the invasiveness of the search.

News article

Posted on June 9, 2005 at 1:04 PM

TSA Abuse of Power

Woman accidentally leaves a knife in her carry-on luggage, where it’s discovered by screeners.

She says screeners refused to give her paperwork or documentation of her violation, documentation of the pending fine, or a copy of the photograph of the knife.

“They said ‘no’ and they said it’s a national security issue. And I said what about my constitutional rights? And they said ‘not at this point … you don’t have any’.”

Posted on June 7, 2005 at 4:10 PM

Detecting Nuclear Material in Transport

One of the few good things that’s coming out of the U.S. terrorism policy is some interesting scientific research. This paper discusses detecting nuclear material in transport.

The authors believe that fixed detectors—for example, at ports—simply won’t work. Terrorists are more likely to use highly enriched uranium (HEU), which is harder to detect than plutonium. The difficulty is due more to HEU’s low rate of natural radioactive emission than to any technological hurdle. “The gamma rays and neutrons useful for detecting shielded HEU permit detection only at short distances (2-4 feet or less) and require that there be sufficient time to count a sufficient number of particles (several minutes to hours).”

The authors conclude that the only way to reliably detect shielded HEU is to build detectors into the transport vehicles themselves. Because those detectors travel with the cargo, they have hours in which to accumulate enough counts to register any radioactivity.
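The “sufficient time to count a sufficient number of particles” point is ordinary Poisson counting statistics: a weak source must be integrated long enough for its excess counts to stand out above fluctuations in the background. Here is a minimal sketch; the count rates and the 5-sigma threshold are my own assumptions for illustration, not numbers from the paper.

def required_count_time(source_cps, background_cps, sigma=5.0):
    """Seconds of counting needed for a sigma-level detection, assuming the
    source rate is small compared to background: signal grows as S*t while
    the statistical noise grows roughly as sqrt(2*B*t)."""
    return sigma**2 * 2.0 * background_cps / source_cps**2

background = 10.0  # assumed background rate in counts/sec (cosmic rays, natural sources)
for source in (5.0, 0.5, 0.05):  # assumed excess rate from a shielded source, counts/sec
    minutes = required_count_time(source, background) / 60.0
    print(f"excess rate {source:5.2f} cps -> roughly {minutes:8.1f} minutes of counting")

With these illustrative numbers, a strong source is visible in seconds, but a heavily shielded one needs tens of minutes to many hours, which is why a detector riding along with the cargo has such an advantage over a portal the truck rolls past in seconds.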

Posted on May 4, 2005 at 7:48 AM

Failures of Airport Screening

According to the AP:

Security at American airports is no better under federal control than it was before the Sept. 11 attacks, a congressman says two government reports will conclude.

The Government Accountability Office, the investigative arm of Congress, and the Homeland Security Department’s inspector general are expected to release their findings soon on the performance of Transportation Security Administration screeners.

This finding will not surprise anyone who has flown recently. How does anyone expect competent security from screeners who don’t know the difference between books and books of matches? Only two books of matches are now allowed on flights; you can take as many reading books as you can carry.

The solution isn’t to privatize the screeners, just as the solution in 2001 wasn’t to make them federal employees. It’s a much more complex problem.

I wrote about it in Beyond Fear (pages 153-4):

No matter how much training they get, airport screeners routinely miss guns and knives packed in carry-on luggage. In part, that’s the result of human beings having developed the evolutionary survival skill of pattern matching: the ability to pick out patterns from masses of random visual data. Is that a ripe fruit on that tree? Is that a lion stalking quietly through the grass? We are so good at this that we see patterns in anything, even if they’re not really there: faces in inkblots, images in clouds, and trends in graphs of random data. Generating false positives helped us stay alive; maybe that wasn’t a lion that your ancestor saw, but it was better to be safe than sorry.

Unfortunately, that survival skill also has a failure mode. As talented as we are at detecting patterns in random data, we are equally terrible at detecting exceptions in uniform data. The quality-control inspector at Spacely Sprockets, staring at a production line filled with identical sprockets looking for the one that is different, can’t do it. The brain quickly concludes that all the sprockets are the same, so there’s no point paying attention. Each new sprocket confirms the pattern. By the time an anomalous sprocket rolls off the assembly line, the brain simply doesn’t notice it. This psychological problem has been identified in inspectors of all kinds; people can’t remain alert to rare events, so they slip by.

The tendency for humans to view similar items as identical makes it clear why airport X-ray screening is so difficult. Weapons in baggage are rare, and the people studying the X-rays simply lose the ability to see the gun or knife. (And, at least before 9/11, there was enormous pressure to keep the lines moving rather than double-check bags.) Steps have been put in place to try to deal with this problem: requiring the X-ray screeners to take frequent breaks, artificially imposing the image of a weapon onto a normal bag in the screening system as a test, slipping a bag with a weapon into the system so that screeners learn it can happen and must expect it. Unfortunately, the results have not been very good.

This is an area where the eventual solution will be a combination of machine and human intelligence. Machines excel at detecting exceptions in uniform data, so it makes sense to have them do the boring repetitive tasks, eliminating many, many bags while having a human sort out the final details. Think about the sprocket quality-control inspector: If he sees 10,000 negatives, he’s going to stop seeing the positives. But if an automatic system shows him only 100 negatives for every positive, there’s a greater chance he’ll see them.
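To make the “100 negatives for every positive” point concrete, here is a toy model. The bag volumes, the threat count, and the assumption that the machine never discards a real threat are all mine, chosen only to illustrate how pre-filtering changes the prevalence the human inspector actually sees.

def human_stream(bags, threats, machine_flag_rate):
    """Return (bags shown to the human, threat prevalence in that stream).

    Assumes an idealized machine that never discards a real threat and
    flags a fraction machine_flag_rate of the clean bags for human review.
    """
    clean = bags - threats
    shown = threats + int(clean * machine_flag_rate)
    return shown, threats / shown

bags, threats = 1_000_000, 10          # assumed daily volume and threat count
for flag_rate in (1.0, 0.01, 0.001):   # 1.0 means no machine filtering at all
    shown, prevalence = human_stream(bags, threats, flag_rate)
    print(f"machine flags {flag_rate:6.1%} of clean bags -> "
          f"human reviews {shown:>9,} bags, 1 threat per {1/prevalence:,.0f}")

With no filtering, the inspector sees one threat in 100,000 bags and tunes out; if the machine passes along only a thousandth of the clean bags, the inspector sees roughly one threat per hundred items, a rate at which human pattern matching can plausibly stay engaged.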

Paying the screeners more will attract a smarter class of worker, but it won’t solve the problem.

Posted on April 19, 2005 at 9:22 AM

Sneaking Items Aboard Aircraft

A Pennsylvania Supreme Court Justice faces a fine—although no criminal charges at the moment—for trying to sneak a knife aboard an aircraft.

Saylor, 58, and his wife entered a security checkpoint Feb. 4 on a trip to Philadelphia when screeners found a small Swiss Army-style knife attached to his key chain.

A police report said he was told the item could not be carried onto a plane and that he needed to place the knife into checked luggage or make other arrangements.

When Saylor returned a short time later to be screened a second time, an X-ray machine detected a knife inside his carry-on luggage, police said.

There are two points worth making here. One: ridiculous rules have a way of turning people into criminals. And two: this is an example of a security failure, not a security success.

Security systems fail in one of two ways. They can fail to stop the bad guy, and they can mistakenly stop the good guy. The TSA likes to measure its success by looking at the forbidden items they have prevented from being carried onto aircraft, but that’s wrong. Every time the TSA takes a pocketknife from an innocent person, that’s a security failure. It’s a false alarm. The system has prevented access where no prevention was required. This, coupled with the widespread belief that the bad guys will find a way around the system, demonstrates what a colossal waste of money it is.

Posted on February 28, 2005 at 8:00 AM

Airport Screeners Cheat to Pass Tests

According to the San Francisco Chronicle:

The private firm in charge of security at San Francisco International Airport cheated to pass tests aimed at ensuring it could stop terrorists from smuggling weapons onto flights, a former employee contends.

All security systems require trusted people: people who must be trusted in order for the security to work. If the trusted people turn out not to be trustworthy, security fails.

Posted on February 24, 2005 at 8:00 AM

T-Mobile Hack

For at least seven months last year, a hacker had access to T-Mobile’s customer network. He’s known to have accessed information belonging to 400 customers—names, Social Security numbers, voicemail messages, SMS messages, photos—and probably had the ability to access data belonging to any of T-Mobile’s 16.3 million U.S. customers. But in its fervor to report on the security of cell phones, and T-Mobile in particular, the media missed the most important point of the story: The security of much of our data is not under our control.

This is new. A dozen years ago, if someone wanted to look through your mail, they would have to break into your house. Now they can just break into your ISP. Ten years ago, your voicemail was on an answering machine in your house; now it’s on a computer owned by a telephone company. Your financial data is on Websites protected only by passwords. The list of books you browse, and the books you buy, is stored in the computers of some online bookseller. Your affinity card allows your supermarket to know what food you like. Data that used to be under your direct control is now controlled by others.

We have no choice but to trust these companies with our privacy, even though the companies have little incentive to protect that privacy. T-Mobile suffered some bad press for its lousy security, nothing more. It’ll spend some money improving its security, but it’ll be security designed to protect its reputation from bad PR, not security designed to protect the privacy of its customers.

This loss of control over our data has other effects, too. Our protections against police abuse have been severely watered down. The courts have ruled that the police can search your data without a warrant, as long as that data is held by others. The police need a warrant to read the e-mail on your computer; but they don’t need one to read it off the backup tapes at your ISP. According to the Supreme Court, that’s not a search as defined by the Fourth Amendment.

This isn’t a technology problem, it’s a legal problem. The courts need to recognize that in the information age, virtual privacy and physical privacy don’t have the same boundaries. We should be able to control our own data, regardless of where it is stored. We should be able to make decisions about the security and privacy of that data, and have legal recourse should companies fail to honor those decisions. And just as the Supreme Court eventually ruled that tapping a telephone was a Fourth Amendment search, requiring a warrant—even though it occurred at the phone company switching office—the Supreme Court must recognize that reading e-mail at an ISP is no different.

This essay appeared in eWeek.

Posted on February 14, 2005 at 4:26 PM
