Entries Tagged "searches"


Man Sues Compaq for False Advertising

Convicted felon Michael Crooker is suing Compaq (now HP) for false advertising. He bought a computer promised to be secure, but the FBI got his data anyway:

He bought it in September 2002, expressly because it had a feature called DriveLock, which freezes up the hard drive if you don’t have the proper password.

The computer’s manual claims that “if one were to lose his Master Password and his User Password, then the hard drive is useless and the data cannot be resurrected even by Compaq’s headquarters staff,” Crooker wrote in the suit.

Crooker has a copy of an ATF search warrant for files on the computer, which includes a handwritten notation: “Computer lock not able to be broken/disabled. Computer forwarded to FBI lab.” Crooker says he refused to give investigators the password, and was told the computer would be broken into “through a backdoor provided by Compaq,” which is now part of HP.

It’s unclear what was done with the laptop, but Crooker says a subsequent search warrant for his e-mail account, issued in January 2005, showed investigators had somehow gained access to his 40 gigabyte hard drive. The FBI had broken through DriveLock and accessed his e-mails (both deleted and not) as well as lists of websites he’d visited and other information. The only files they couldn’t read were ones he’d encrypted using Wexcrypt, a software program freely available on the Internet.

I think this is great. It’s about time that computer companies were held liable for their advertising claims.

But his lawsuit against HP may be a long shot. Crooker appears to face strong counterarguments to his claim that HP is guilty of breach of contract, especially if the FBI made the company provide a backdoor.

“If they had a warrant, then I don’t see how his case has any merit at all,” said Steven Certilman, a Stamford attorney who heads the Technology Law section of the Connecticut Bar Association. “Whatever means they used, if it’s covered by the warrant, it’s legitimate.”

If HP claimed DriveLock was unbreakable when the company knew it was not, that might be a kind of false advertising.

But while documents on HP’s web site do claim that without the correct passwords, a DriveLock’ed hard drive is “permanently unusable,” such warnings may not constitute actual legal guarantees.

According to Certilman and other computer security experts, hardware and software makers are careful not to make themselves liable for the performance of their products.

“I haven’t heard of manufacturers, at least for the consumer market, making a promise of computer security. Usually you buy naked hardware and you’re on your own,” Certilman said. In general, computer warranties are “limited only to replacement and repair of the component, and not to incidental consequential damages such as the exposure of the underlying data to snooping third parties,” he said. “So I would be quite surprised if there were a gaping hole in their warranty that would allow that kind of claim.”

That point meets with agreement from the noted computer security skeptic Bruce Schneier, the chief technology officer at Counterpane Internet Security in Mountain View, Calif.

“I mean, the computer industry promises nothing,” he said last week. “Did you ever read a shrink-wrapped license agreement? You should read one. It basically says, if this product deliberately kills your children, and we knew it would, and we decided not to tell you because it might harm sales, we’re not liable. I mean, it says stuff like that. They’re absurd documents. You have no rights.”

My final quote in the article:

“Unfortunately, this probably isn’t a great case,” Schneier said. “Here’s a man who’s not going to get much sympathy. You want a defendant who bought the Compaq computer, and then, you know, his competitor, or a rogue employee, or someone who broke into his office, got the data. That’s a much more sympathetic defendant.”

Posted on May 3, 2006 at 9:26 AM

The Security Risk of Special Cases

In Beyond Fear, I wrote about the inherent security risks of exceptions to a security policy. Here’s an example, from airport security in Ireland.

Police officers are permitted to bypass airport security at the Dublin Airport. They flash their ID, and walk around the checkpoints.

A female member of the airport search unit is undergoing re-training after the incident in which a Department of Transport inspector passed unchecked through security screening.

It is understood that the department official was waved through security checks having flashed an official badge. The inspector immediately notified airport authorities of a failure in vetting procedures. Only gardai are permitted to pass unchecked through security.

There are two ways this failure could have happened. One, the security person could have thought that Department of Transport officials have the same privileges as police officers. And two, the security person could have thought she was being shown a police ID.

This could have just as easily been a bad guy showing a fake police ID. My guess is that the security people don’t check them all that carefully.

The meta-point is that exceptions to security are themselves security vulnerabilities. As soon as you create a system by which some people can bypass airport security checkpoints, you invite the bad guys to try to use that system. There are reasons why you might want to create those alternate paths through security, of course, but the trade-offs should be well thought out.

Posted on April 26, 2006 at 6:05 AM

London Rejects Subway Scanners

Rare outbreak of security common sense in London:

London Underground is likely to reject the use of passenger scanners designed to detect weapons or explosives as they are “not practical”, a security chief for the capital’s transport authority said on 14 March 2006.

[…]

“Basically, what we know is that it’s not practical,” he told Government Computing News. “People use the tube for speed and are concerned with journey time. It would just be too time consuming. Secondly, there’s just not enough space to put this kind of equipment in.”

“Finally there’s also the risk that you actually create another target with people queuing up and congregating at the screening points.”

Posted on March 23, 2006 at 1:39 PM

Airport Passenger Screening

It seems like every time someone tests airport security, airport security fails. In tests between November 2001 and February 2002, screeners missed 70 percent of knives, 30 percent of guns and 60 percent of (fake) bombs. And recently (see also this), testers were able to smuggle bomb-making parts through airport security in 21 of 21 attempts. It makes you wonder why we’re all putting our laptops in a separate bin and taking off our shoes. (Although we should all be glad that Richard Reid wasn’t the “underwear bomber.”)

The failure to detect bomb-making parts is easier to understand. Break up something into small enough parts, and it’s going to slip past the screeners pretty easily. The explosive material won’t show up on the metal detector, and the associated electronics can look benign when disassembled. This isn’t even a new problem. It’s widely believed that the Chechen women who blew up the two Russian planes in August 2004 probably smuggled their bombs aboard the planes in pieces.

But guns and knives? That surprises most people.

Airport screeners have a difficult job, primarily because the human brain isn’t naturally adapted to the task. We’re wired for visual pattern matching, and are great at picking out something we know to look for—for example, a lion in a sea of tall grass.

But we’re much less adept at detecting random exceptions in uniform data. Faced with an endless stream of identical objects, the brain quickly concludes that everything is identical and there’s no point in paying attention. By the time the exception comes around, the brain simply doesn’t notice it. This psychological phenomenon isn’t just a problem in airport screening: It’s been identified in inspections of all kinds, and is why casinos move their dealers around so often. The tasks are simply mind-numbing.

To make matters worse, the smuggler can try to exploit the system. He can position the weapons in his baggage just so. He can try to disguise them by adding other metal items to distract the screeners. He can disassemble bomb parts so they look nothing like bombs. Against a bored screener, he has the upper hand.

And, as has been pointed out again and again in essays on the ludicrousness of post-9/11 airport security, improvised weapons are a huge problem. A rock, a battery for a laptop, a belt, the extension handle off a wheeled suitcase, fishing line, the bare hands of someone who knows karate … the list goes on and on.

Technology can help. X-ray machines already randomly insert “test” bags into the stream—keeping screeners more alert. Computer-enhanced displays are making it easier for screeners to find contraband items in luggage, and eventually the computers will be able to do most of the work. It makes sense: Computers excel at boring repetitive tasks. They should do the quick sort, and let the screeners deal with the exceptions.
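The division of labor described here — the computer does the quick sort, and humans handle the exceptions — can be sketched as a simple triage loop. This is an illustrative sketch only; the function names, the scoring model, and the threshold are all hypothetical, not any real screening system's API.

```python
# Hypothetical sketch of computer-assisted triage: the machine clears
# obviously benign items automatically and routes everything ambiguous
# to a human screener, who only sees the exceptions.

def triage(bags, score, clear_threshold=0.1):
    """Split bags into an auto-cleared pile and a needs-human-review pile."""
    cleared, flagged = [], []
    for bag in bags:
        if score(bag) < clear_threshold:
            cleared.append(bag)   # computer handles the boring, repetitive bulk
        else:
            flagged.append(bag)   # exceptions go to an alert human screener
    return cleared, flagged

# Toy data: a made-up "suspicion score" per bag.
bags = [{"id": 1, "metal": 0.02},
        {"id": 2, "metal": 0.75},
        {"id": 3, "metal": 0.05}]

cleared, flagged = triage(bags, score=lambda b: b["metal"])
print([b["id"] for b in cleared], [b["id"] for b in flagged])  # [1, 3] [2]
```

The point of the structure is exactly the one in the paragraph above: the human's attention is spent only on the rare exceptions, which is the kind of task brains are actually good at.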

Sure, there’ll be a lot of false alarms, and some bad things will still get through. But it’s better than the alternative.

And it’s likely good enough. Remember the point of passenger screening. We’re not trying to catch the clever, organized, well-funded terrorists. We’re trying to catch the amateurs and the incompetent. We’re trying to catch the unstable. We’re trying to catch the copycats. These are all legitimate threats, and we’re smart to defend against them. Against the professionals, we’re just trying to add enough uncertainty into the system that they’ll choose other targets instead.

The terrorists’ goals have nothing to do with airplanes; their goals are to cause terror. Blowing up an airplane is just a particular attack designed to achieve that goal. Airplanes deserve some additional security because they have catastrophic failure properties: If there’s even a small explosion, everyone on the plane dies. But there’s a diminishing return on investments in airplane security. If the terrorists switch targets from airplanes to shopping malls, we haven’t really solved the problem.

What that means is that a basic cursory screening is good enough. If I were investing in security, I would fund significant research into computer-assisted screening equipment for both checked and carry-on bags, but wouldn’t spend a lot of money on invasive screening procedures and secondary screening. I would much rather have well-trained security personnel wandering around the airport, both in and out of uniform, looking for suspicious actions.

When I travel in Europe, I never have to take my laptop out of its case or my shoes off my feet. Those governments have had far more experience with terrorism than the U.S. government, and they know when passenger screening has reached the point of diminishing returns. (They also implemented checked-baggage security measures decades before the United States did—again recognizing the real threat.)

And if I were investing in security, I would invest in intelligence and investigation. The best time to combat terrorism is before the terrorist tries to get on an airplane. The best countermeasures have value regardless of the nature of the terrorist plot or the particular terrorist target.

In some ways, if we’re relying on airport screeners to prevent terrorism, it’s already too late. After all, we can’t keep weapons out of prisons. How can we ever hope to keep them out of airports?

A version of this essay originally appeared on Wired.com.

Posted on March 23, 2006 at 7:03 AM

Airport Security Failure

At LaGuardia, a man successfully walked through the metal detector, but screeners wanted to check his shoes. (Some reports say that his shoes set off an alarm.) But he didn’t wait, and disappeared into the crowd.

The entire Delta Airlines terminal had to be evacuated, and between 2,500 and 3,000 people had to be rescreened. I’m sure the resultant flight delays rippled through the entire system.

Security systems can fail in two ways. They can fail to defend against an attack. And they can fail when there is no attack to defend against. The latter failure is often more important, because false alarms are more common than real attacks.
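That false alarms dominate is just base-rate arithmetic: when real attacks are vanishingly rare, even an accurate detector produces almost nothing but false positives. The numbers below are hypothetical, chosen only to illustrate the scale of the effect.

```python
# Illustrative base-rate arithmetic (made-up numbers, not from the post):
# with rare attacks, nearly every alarm a screening system raises is false.

def alarm_breakdown(passengers, attack_rate, detect_rate, false_positive_rate):
    """Return (expected true alarms, expected false alarms) per screening day."""
    attackers = passengers * attack_rate
    innocents = passengers - attackers
    true_alarms = attackers * detect_rate          # real attacks caught
    false_alarms = innocents * false_positive_rate  # innocents flagged
    return true_alarms, false_alarms

# Two million passengers a day, one attacker per hundred million passengers,
# a detector that catches 99% of attacks with a 1% false-positive rate.
true_alarms, false_alarms = alarm_breakdown(2_000_000, 1e-8, 0.99, 0.01)
print(true_alarms, false_alarms)  # roughly 0.02 true alarms vs 20,000 false alarms
```

Every one of those false alarms costs something, which is why how a system behaves when there is no attack matters so much.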

Aside from the obvious security failure—how did this person manage to disappear into the crowd, anyway—it’s painfully obvious that the overall security system did not fail well. Well-designed security systems fail gracefully, without affecting the entire airport terminal. That the only thing the TSA could do after the failure was evacuate the entire terminal and rescreen everyone is a testament to how badly designed the security system is.

Posted on March 14, 2006 at 12:15 PM

Another No-Fly List Victim

This person didn’t even land in the U.S. His plane flew from Canada to Mexico over U.S. airspace:

Fifteen minutes after the plane left Toronto’s Pearson International Airport, the airline provided customs officials in the United States with a list of passengers. Agents ran the list through a national data base and up popped a name matching Mr. Kahil’s.

[…]

When the plane landed in Acapulco, the Kahils were ushered into a room for questioning. Mug shots were taken of the couple, along with their sons, Karim and Adam, who are 8 and 6. But it was not until a couple of hours later that the Kahils found out why.

Ms. Kahil and the children returned to Canada later that day and Mr. Kahil was put in a detention centre and his passport was confiscated.

Just another case of mistaken identity.

And here’s a story of a four-year-old boy on the watch list.

This program has been a miserable failure in every respect. Not one terrorist caught, ever. (I say this because I believe 100% that if this administration caught anyone through this program, they would be trumpeting it for all to hear.) Thousands of innocents subjected to lengthy and extreme searches every time they fly, prevented from flying, or arrested.

Posted on January 26, 2006 at 3:28 PM

