Entries Tagged "searches"

Page 10 of 15

UK Spends Billions to Force Rail Terrorists to Drive a Little Further

Makes no sense:

Passengers at Liverpool’s Lime Street station face airport-style searches and bag-screening, under swingeing new anti-terror measures unveiled yesterday.

And security barriers, vehicle exclusion zones and blast-resistant buildings will be introduced at airports, ports and up to 250 of the busiest train stations, Gordon Brown announced.

Of course, less busy train stations are only a few minutes away by car.

Posted on November 22, 2007 at 6:28 AM

Dan Egerstad Arrested

I previously wrote about Dan Egerstad, a security researcher who ran Tor exit nodes and was able to sniff some pretty impressive usernames and passwords.

Swedish police arrested him:

About 9am Egerstad walked downstairs to move his car when he was accosted by the officers in a scene “taken out of a bad movie”, he said in an email interview.

“I got a couple of police IDs in my face while told that they are taking me in for questioning,” he said.

But not before the agents, who had staked out his house in undercover blue and grey Saabs (“something that screams cop to every person in Sweden from miles away”), searched his apartment and confiscated computers, CDs and portable hard drives.

“They broke my wardrobe, short cutted my electricity, pulled out my speakers, phone and other cables having nothing to do with this and been touching my bookkeeping, which they have no right to do,” he said.

While questioning Egerstad at the station, the police “played every trick in the book, good cop, bad cop and crazy mysterious guy in the corner not wanting to tell his name and just staring at me”.

“Well, if they want to try to manipulate, I can play that game too. [I] gave every known body signal there is telling of lies … covered my mouth, scratched my elbow, looked away and so on.”

No charges have been filed. I’m not sure there’s anything wrong with what he did.

Here’s a good article on what he did; it was published just before the arrest.

Posted on November 16, 2007 at 2:27 PM

Remote-Controlled Toys and the TSA

Remote-controlled toys are getting more scrutiny:

Airport screeners are giving additional scrutiny to remote-controlled toys because terrorists could use them to trigger explosive devices, the Transportation Security Administration said Monday.

The TSA stopped short of banning the toys in carry-on bags but suggested travelers place them in checked luggage.

Okay, let’s think this through. The one place where you don’t need a modified remote-controlled toy is in the passenger cabin, because you have your hands available to push any required buttons. But a remote-controlled toy in checked luggage, now that’s a clever idea. I put my modified remote-controlled toy bomb in my checked suitcase, and use the controller to detonate it once I’m in the air.

So maybe we want the remote-controlled toy in carry-on luggage, where there’s a greater chance of detecting it (at the security checkpoint). And maybe we want to require the remote controller to be in checked luggage.

Or maybe….

In any case, it’s a great movie plot.

EDITED TO ADD (10/4): Here are two news stories and the DHS press release.

Posted on October 4, 2007 at 10:20 AM

Blowback from Banning Backpacks

A high school bans backpacks as a security measure. This also includes purses, which inconveniences girls who need to carry menstrual supplies. So now, girls who are carrying purses get asked by police: “Are you on your period?” The predictable uproar follows.

Maybe they should try transparent backpacks or bulletproof backpacks. (If only someone would invent a transparent bulletproof backpack. Then our children would finally be safe!)

Posted on October 3, 2007 at 12:55 PM

NASA Employees Sue over Background Checks

This is a big deal:

Jet Propulsion Laboratory scientists and engineers sued NASA and the California Institute of Technology on Thursday, challenging extensive new background checks that the space exploration center and other federal agencies began requiring in the wake of the Sept. 11 terror attacks.

[…]

But according to the lawsuit, the Commerce Department and NASA instituted requirements that employees and contractors permit sweeping background checks to qualify for credentials and refusal would mean the loss of their jobs.

NASA calls on employees to permit investigators to delve into medical, financial and past employment records, and to question friends and acquaintances about everything from their finances to sex lives, according to the suit. The requirements apply to everyone from janitors to visiting professors.

The suit claims violations of the U.S. Constitution’s 4th Amendment protection against unreasonable search and seizure, 14th Amendment protection against invasion of the right to privacy, the Administrative Procedure Act, the Privacy Act, and rights under the California Constitution.

Those in more sensitive positions are asked to disclose financial records, list foreign trips and give the government permission to view their medical history.

Workers also must sign a waiver giving investigators access to virtually all personal information.

[…]

“Many of the plaintiffs only agreed to work for NASA with the understanding that they would not have to work on classified materials or to undergo any type of security clearance,” the suit said.

More details here (check out the “Forum” if you’re really interested) and in this article.

Posted on September 4, 2007 at 12:56 PM

Interview with National Intelligence Director Mike McConnell

Mike McConnell, U.S. National Intelligence Director, gave an interesting interview to the El Paso Times.

I don’t think he’s ever been so candid before. For example, he admitted that the nation’s telcos assisted the NSA in their massive eavesdropping efforts. We already knew this, of course, but the government has steadfastly maintained that either confirming or denying this would compromise national security.

There are, of course, moments of surreality. He said that it takes 200 hours to prepare a FISA warrant. Ryan Singel calculated that since there were 2,167 such warrants in 2006, there must be “218 government employees with top secret clearances sitting in rooms, writing only FISA warrants.” Seems unlikely.
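The arithmetic behind that figure is easy to check, assuming a roughly 2,000-hour work year:

```python
# Back-of-the-envelope check of the FISA warrant staffing estimate.
warrants_2006 = 2167        # FISA warrants issued in 2006
hours_per_warrant = 200     # McConnell's claimed preparation time
hours_per_work_year = 2000  # assumed: ~50 weeks x 40 hours

total_hours = warrants_2006 * hours_per_warrant
full_time_staff = total_hours / hours_per_work_year

# prints: 433,400 hours, or about 217 full-time employees
print(f"{total_hours:,} hours, or about {full_time_staff:.0f} full-time employees")
```

The small gap between 217 and the quoted 218 presumably reflects a slightly different assumed work year.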

But most notable is this bit:

Q. So you’re saying that the reporting and the debate in Congress means that some Americans are going to die?

A. That’s what I mean. Because we have made it so public. We used to do these things very differently, but for whatever reason, you know, it’s a democratic process and sunshine’s a good thing. We need to have the debate.

Ah, the politics of fear. I don’t care if it’s the terrorists or the politicians, refuse to be terrorized. (More interesting discussions on the interview here, here, here, here, here, and here.)

Posted on August 24, 2007 at 6:30 AM

Conversation with Kip Hawley, TSA Administrator (Part 2)

This is Part 2 of a five-part series. Link to whole thing.

BS: I hope you’re telling the truth; screening is a difficult problem, and it’s hard to discount all of those published tests and reports. But a lot of the security around these checkpoints is about perception—we want potential terrorists to think there’s a significant chance they won’t get through the checkpoints—so you’re better off maintaining that the screeners are better than reports indicate, even if they’re not.

Backscatter X-ray is another technology that is causing privacy concerns, since it basically allows you to see people naked. Can you explain the benefits of the technology, and what you are doing to protect privacy? Although the machines can distort the images, we know that they can store raw, unfiltered images; the manufacturer Rapiscan is quite proud of the fact. Are the machines you’re using routinely storing images? Can they store images at the screener’s discretion, or is that capability turned off at installation?

KH: We’re still evaluating backscatter and are in the process of running millimeter wave portals right alongside backscatter to compare their effectiveness and the privacy issues. We do not now store images for the test phase (function disabled), and although we haven’t officially resolved the issue, I fully understand the privacy argument and don’t assume that we will store them if and when they’re widely deployed.

BS: When can we keep our shoes on?

KH: Any time after you clear security. Sorry, Bruce, I don’t like it either, but this is not just something leftover from 2002. It is a real, current concern. We’re looking at shoe scanners and ways of using millimeter wave and/or backscatter to get there, but until the technology catches up to the risk, the shoes have to go in the bin.

BS: This feels so much like “cover your ass” security: you’re screening our shoes because everyone knows Richard Reid hid explosives in them, and you’ll be raked over the coals if that particular plot ever happens again. But there are literally thousands of possible plots.

So when does it end? The terrorists invented a particular tactic, and you’re defending against it. But you’re playing a game you can’t win. You ban guns and bombs, so the terrorists use box cutters. You ban small blades and knitting needles, and they hide explosives in their shoes. You screen shoes, so they invent a liquid explosive. You restrict liquids, and they’re going to do something else. The terrorists are going to look at what you’re confiscating, and they’re going to design a plot to bypass your security.

That’s the real lesson of the liquid bombers. Assuming you’re right and the explosive was real, it was an explosive that none of the security measures at the time would have detected. So why play this slow game of whittling down what people can bring onto airplanes? When do you say: “Enough. It’s not about the details of the tactic; it’s about the broad threat”?

KH: In late 2005, I made a big deal about focusing on Improvised Explosive Devices (IEDs) and not chasing all the things that could be used as weapons. Until the liquids plot this summer, we were defending our decision to let scissors and small tools back on planes and trying to add layers like behavior detection and document checking, so it is ironic that you ask this question—I am in vehement agreement with your premise. We’d rather focus on things that can do catastrophic harm (bombs!) and add layers to get people with hostile intent to highlight themselves. We have a responsibility, though, to address known continued active attack methods like shoes and liquids and, unfortunately, have to use our somewhat clunky process for now.

BS: You don’t have a responsibility to screen shoes; you have one to protect air travel from terrorism to the best of your ability. You’re picking and choosing. We know the Chechen terrorists who downed two Russian planes in 2004 got through security partly because different people carried the explosive and the detonator. Why doesn’t this count as a continued, active attack method?

I don’t want to even think about how much C4 I can strap to my legs and walk through your magnetometers. Or search the Internet for “BeerBelly.” It’s a device you can strap to your chest to smuggle beer into stadiums, but you can also use it to smuggle 40 ounces of dangerous liquid explosive onto planes. The magnetometer won’t detect it. Your secondary screening wandings won’t detect it. Why aren’t you making us all take our shirts off? Will you have to find a printout of the webpage in some terrorist safe house? Or will someone actually have to try it? If that doesn’t bother you, search the Internet for “cell phone gun.”

It’s “cover your ass” security. If someone tries to blow up a plane with a shoe or a liquid, you’ll take a lot of blame for not catching it. But if someone uses any of these other, equally known, attack methods, you’ll be blamed less because they’re less public.

KH: Dead wrong! Our security strategy assumes an adaptive terrorist, and that looking backwards is not a reliable predictor of the next type of attack. Yes, we screen for shoe bombs and liquids, because it would be stupid not to directly address attack methods that we believe to be active. Overall, we are getting away from trying to predict what the object looks like and looking more for the other markers of a terrorist. (Don’t forget, we see two million people a day, so we know what normal looks like.) What he/she does; the way they behave. That way we don’t put all our eggs in the basket of catching them in the act. We can’t give them free rein to surveil or do dry-runs; we need to put up obstacles for them at every turn. Working backwards, what do you need to do to be successful in an attack? Find the decision points that show the difference between normal action and action needed for an attack. Our odds are better with this approach than by trying to take away methods, annoying object by annoying object. Bruce, as for blame, that’s nothing compared to what all of us would carry inside if we failed to prevent an attack.

Part 3: The no-fly list

Posted on July 31, 2007 at 6:12 AM

Conversation with Kip Hawley, TSA Administrator (Part 1)

This is Part 1 of a five-part series. Link to whole thing.

In April, Kip Hawley, the head of the Transportation Security Administration (TSA), invited me to Washington for a meeting. Despite some serious trepidation, I accepted. And it was a good meeting. Most of it was off the record, but he asked me how the TSA could overcome its negative image. I told him to be more transparent, and stop ducking the hard questions. He said that he wanted to do that. He did enjoy writing a guest blog post for Aviation Daily, but having a blog himself didn’t work within the bureaucracy. What else could he do?

This interview, conducted in May and June via e-mail, was one of my suggestions.

Bruce Schneier: By today’s rules, I can carry on liquids in quantities of three ounces or less, unless they’re in larger bottles. But I can carry on multiple three-ounce bottles. Or a single larger bottle with a non-prescription medicine label, like contact lens fluid. It all has to fit inside a one-quart plastic bag, except for that large bottle of contact lens fluid. And if you confiscate my liquids, you’re going to toss them into a large pile right next to the screening station—which you would never do if anyone thought they were actually dangerous.

Can you please convince me there’s not an Office for Annoying Air Travelers making this sort of stuff up?

Kip Hawley: Screening ideas are indeed thought up by the Office for Annoying Air Travelers and vetted through the Directorate for Confusion and Complexity, and then we review them to ensure that there are sufficient unintended irritating consequences so that the blogosphere is constantly fueled. Imagine for a moment that TSA people are somewhat bright, and motivated to protect the public with the least intrusion into their lives, not to mention travel themselves. How might you engineer backwards from that premise to get to three ounces and a baggie?

We faced a different kind of liquid explosive, one that was engineered to evade then-existing technology and process. Not the old Bojinka formula or other well-understood ones—TSA already trains and tests on those. After August 10, we began testing different variants with the national labs, among others, and engaged with other countries that have sophisticated explosives capabilities to find out what is necessary to reliably bring down a plane.

We started with the premise that we should prohibit only what’s needed from a security perspective. Otherwise, we would have stuck with a total liquid ban. But we learned through testing that no matter what someone brought on, if it was in a small enough container, it wasn’t a serious threat. So what would the justification be for prohibiting lip gloss, nasal spray, etc? There was none, other than for our own convenience and the sake of a simple explanation.

Based on the scientific findings and a don’t-intrude-unless-needed-for-security philosophy, we came up with a container size that eliminates an assembled bomb (without having to determine what exactly is inside the bottle labeled “shampoo”), limits the total liquid any one person can bring (without requiring Transportation Security Officers (TSOs) to count individual bottles), and allows for additional security measures relating to multiple people mixing a bomb post-checkpoint. Three ounces and a baggie in the bin gives us a way for people to safely bring on limited quantities of liquids, aerosols and gels.

BS: How will this foil a plot, given that there are no consequences to trying? Airplane contraband falls into two broad categories: stuff you get in trouble for trying to smuggle onboard, and stuff that just gets taken away from you. If I’m caught at a security checkpoint with a gun or a bomb, you’re going to call the police and really ruin my day. But if I have a large bottle of that liquid explosive, you confiscate it with a smile and let me through. So unless you’re 100% perfect in catching this stuff—which you’re not—I can just try again and again until I get it through.

This isn’t like contaminants in food, where if you remove 90% of the particles, you’re 90% safer. None of those false alarms—none of those innocuous liquids taken away from innocent travelers—improve security. We’re only safer if you catch the one explosive liquid amongst the millions of containers of water, shampoo, and toothpaste. I have described two ways to get large amounts of liquids onto airplanes—large bottles labeled “saline solution” and trying until the screeners miss the liquid—not to mention combining multiple little bottles of liquid into one big bottle after the security checkpoint.
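The no-consequences point can be quantified: if each attempt is caught with probability p and a failed attempt costs nothing, the chance of eventually succeeding within n tries is 1 − p^n. A sketch with hypothetical detection rates (not actual TSA figures):

```python
# Chance that at least one of n penalty-free smuggling attempts succeeds,
# given a per-attempt detection probability p. Numbers are hypothetical.
def eventual_success(p_detect: float, attempts: int) -> float:
    return 1.0 - p_detect ** attempts

# Even a screener who catches 90% of attempts loses to a patient attacker.
for n in (1, 5, 10, 20):
    print(f"{n:2d} attempts: {eventual_success(0.9, n):.0%} chance of success")
```

With a 90% per-attempt detection rate, ten penalty-free tries already give the attacker about a 65% chance of getting the liquid through.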

I want to assume the TSA is both intelligent and motivated to protect us. I’m taking your word for it that there is an actual threat—lots of chemists disagree—but your liquid ban isn’t mitigating it. Instead, I have the sinking feeling that you’re defending us against a terrorist smart enough to develop his own liquid explosive, yet too stupid to read the rules on TSA’s own website.

KH: I think your premise is wrong. There are consequences to coming to an airport with a bomb and having some of the materials taken away at the checkpoint. Putting aside our layers of security for the moment, there are things you can do to get a TSO’s attention at the checkpoint. If a TSO finds you or the contents of your bag suspicious, you might get interviewed and/or have your bags more closely examined. If the TSO throws your liquids in the trash, they don’t find you a threat.

I often read blog posts about how someone could just take all their three-ounce bottles—or take bottles from others on the plane—and combine them into a larger container to make a bomb. I can’t get into the specifics, but our explosives research shows this is not a viable option.

The current system is not the best we’ll ever come up with. In the near future, we’ll come up with an automated system to take care of liquids, and everyone will be happier.

In the meantime, we have begun using hand-held devices that can recognize threat liquids through factory-sealed containers (we will increase their number through the rest of the year) and we have different test strips that are effective when a bottle is opened. Right now, we’re using them on exempt items like medicines, as well as undeclared liquids TSOs find in bags. This will help close the vulnerability and strengthen the deterrent.

BS: People regularly point to security checkpoints missing a knife in their handbag as evidence that security screening isn’t working. But that’s wrong. Complete effectiveness is not the goal; the checkpoints just have to be effective enough so that the terrorists are worried their plan will be uncovered. But in Denver earlier this year, testers sneaked 90% of weapons through. And other tests aren’t much better. Why are these numbers so poor, and why didn’t they get better when the TSA took over airport security?

KH: Your first point is dead on and is the key to how we look at security. The stories about 90% failures are wrong or extremely misleading. We do many kinds of effectiveness tests at checkpoints daily. We use them to guide training and decisions on technology and operating procedures. We also do extensive and very sophisticated Red Team testing, and one of their jobs is to observe checkpoints and go back and figure out—based on inside knowledge of what we do—ways to beat the system. They isolate one particular thing: for example, a particular explosive, made and placed in a way that exploits a particular weakness in technology; our procedures; or the way TSOs do things in practice. Then they will test that particular thing over and over until they identify what corrective action is needed. We then change technology or procedure, or plain old focus on execution. And we repeat the process—forever.

So without getting into specifics on the test results, of course there are times that our evaluations can generate high failure rate numbers on specific scenarios. Overall, though, our ability to detect bomb components is vastly improved and it will keep getting better. (Older scores you may have seen may be “feel good” numbers based on old, easy tests. Don’t go for the sound-bite; today’s TSOs are light-years ahead of even where they were two years ago.)

Part 2: When can we keep our shoes on?

Posted on July 30, 2007 at 6:12 AM

Third Party Consent and Computer Searches

U.S. courts are weighing in with opinions:

When Ray Andrus’ 91-year-old father gave federal agents permission to search his son’s password-protected computer files and they found child pornography, the case turned a spotlight on how appellate courts grapple with third-party consents to search computers.

[…]

The case was a first for the 10th U.S. Circuit Court of Appeals, and only two other circuits have touched on the issue, the 4th and 6th circuits. The 10th Circuit held that although password-protected computers command a high level of privacy, the legitimacy of a search turns on an officer’s belief that the third party had authority to consent.

The 10th Circuit’s recent 2-1 decision in U.S. v. Andrus, No. 06-3094 (April 25, 2007), recognized for the first time that a password-protected computer is like a locked suitcase or a padlocked footlocker in a bedroom. The digital locks raise the expectation of privacy by the owner. The majority nonetheless refused to suppress the evidence.

Excellent commentary from Jennifer Granick:

The Fourth Amendment generally prohibits warrantless searches of an individual’s home or possessions. There is an exception to the warrant requirement when someone consents to the search. Consent can be given by the person under investigation, or by a third party with control over or mutual access to the property being searched. Because the Fourth Amendment only prohibits “unreasonable searches and seizures,” permission given by a third party who lacks the authority to consent will nevertheless legitimize a warrantless search if the consenter has “apparent authority,” meaning that the police reasonably believed that the person had actual authority to control or use the property.

Under existing case law, only people with a key to a locked closet have apparent authority to consent to a search of that closet. Similarly, only people with the password to a locked computer have apparent authority to consent to a search of that device. In Andrus, the father did not have the password (or know how to use the computer) but the police say they did not have any reason to suspect this because they did not ask and did not turn the computer on. Then, they used forensic software that automatically bypassed any installed password.

The majority held that the police officers not only weren’t obliged to ask whether the father used the computer, they had no obligation to check for a password before performing their forensic search. In dissent, Judge Monroe G. McKay criticized the agents’ intentional blindness to the existence of password protection, when physical or digital locks are such a fundamental part of ascertaining whether a consenting person has actual or apparent authority to permit a police search. “(T)he unconstrained ability of law enforcement to use forensic software such as the EnCase program to bypass password protection without first determining whether such passwords have been enabled … dangerously sidestep(s) the Fourth Amendment.”

[…]

If courts are going to treat computers as containers, and if owners must lock containers in order to keep them private from warrantless searches, then police should be required to look for those locks. Password-protected computers and locked containers are an inexact analogy, but if that is how courts are going to do it, then it’s inappropriate to diminish protections for computers simply because law enforcement chooses to use software that turns a blind eye to owners’ passwords.

Posted on June 5, 2007 at 6:43 AM
