Entries Tagged "DHS"


DHS Puts its Head in the Sand

On the subject of the recent Washington Post Snowden document, the DHS sent this e-mail out to at least some of its employees:

From: xxxxx
Sent: Thursday, July 11, 2013 10:28 AM
To: xxxxx
Cc: xxx Security Reps; xxx SSO; xxxx;xxxx
Subject: //// SECURITY ADVISORY//// NEW WASHINGTON POST WEBPAGE ARTICLE—DO NOT CLICK ON THIS LINK

I have been advised that this article is on the Washington Post’s Website today and has a clickable link title “The NSA Slide you never seen” that must not be opened. This link opens up a classified document which will raise the classification level of your Unclassified workstation to the classification of the slide which is reported to be TS/NF. This has been verified by our Mission Partner and the reason for this email.

If opened on your home or work computer you are obligated to report this to the SSO as your computer could then be considered a classified workstation.

Again, please exercise good judgment when visiting these webpages and clicking on such links. You are violating your Non-Disclosure Agreement in which you promise by signing that you will protect Classified National Security Information. You may be subject to any administrative or legal action from the Government.

SSOs, please pass this on to your respective components as this may be a threat to the systems under your jurisdiction.

This is not just ridiculous, it’s idiotic. Why put DHS employees at a disadvantage by trying to prevent them from knowing what the rest of the world knows? The point of classification is to keep something out of the hands of the bad guys. Once a document is public, the bad guys have access to it. The harm is already done. Can someone think of a reason for this DHS policy other than spite?

Posted on July 17, 2013 at 2:45 PM

The Politics of Security in a Democracy

Terrorism causes fear, and we overreact to that fear. Our brains aren’t very good at probability and risk analysis. We tend to exaggerate spectacular, strange and rare events, and downplay ordinary, familiar and common ones. We think rare risks are more common than they are, and we fear them more than probability indicates we should.

Our leaders are just as prone to this overreaction as we are. But aside from basic psychology, there are other reasons that it’s smart politics to exaggerate terrorist threats, and security threats in general.

The first is that we respond to a strong leader. Bill Clinton famously said: “When people feel uncertain, they’d rather have somebody that’s strong and wrong than somebody who’s weak and right.” He’s right.

The second is that doing something—anything—is good politics. A politician wants to be seen as taking charge, demanding answers, fixing things. It just doesn’t look as good to sit back and claim that there’s nothing to do. The logic is along the lines of: “Something must be done. This is something. Therefore, we must do it.”

The third is that the “fear preacher” wins, regardless of the outcome. Imagine two politicians today. One of them preaches fear and draconian security measures. The other is someone like me, who tells people that terrorism is a negligible risk, that risk is part of life, and that while some security is necessary, we should mostly just refuse to be terrorized and get on with our lives.

Fast-forward 10 years. If I’m right and there have been no more terrorist attacks, the fear preacher takes credit for keeping us safe. But if a terrorist attack has occurred, my government career is over. Even if the incidence of terrorism is as ridiculously low as it is today, there’s no benefit for a politician to take my side of that gamble.

The fourth and final reason is money. Every new security technology, from surveillance cameras to high-tech fusion centers to airport full-body scanners, has a for-profit corporation lobbying for its purchase and use. Given the three other reasons above, it’s easy—and probably profitable—for a politician to make them happy and say yes.

For any given politician, the implications of these four reasons are straightforward. Overestimating the threat is better than underestimating it. Doing something about the threat is better than doing nothing. Doing something that is explicitly reactive is better than being proactive. (If you’re proactive and you’re wrong, you’ve wasted money. If you’re proactive and you’re right but no longer in power, whoever is in power is going to get the credit for what you did.) Visible is better than invisible. Creating something new is better than fixing something old.

Those last two maxims are why it’s better for a politician to fund a terrorist fusion center than to pay for more Arabic translators for the National Security Agency. No one’s going to see the additional appropriation in the NSA’s secret budget. On the other hand, a high-tech computerized fusion center is going to make front page news, even if it doesn’t actually do anything useful.

This leads to another phenomenon about security and government. Once a security system is in place, it can be very hard to dislodge it. Imagine a politician who objects to some aspect of airport security: the liquid ban, the shoe removal, something. If he pushes to relax security, he gets the blame if something bad happens as a result. No one wants to roll back a police power and have the lack of that power cause a well-publicized death, even if it’s a one-in-a-billion fluke.

We’re seeing this force at work in the bloated terrorist no-fly and watch lists; agents have lots of incentive to put someone on the list, but absolutely no incentive to take anyone off. We’re also seeing this in the Transportation Security Administration’s attempt to reverse the ban on small blades on airplanes. Twice it tried to make the change, and twice fearful politicians prevented it from doing so.

Lots of unneeded and ineffective security measures are perpetrated by a government bureaucracy that is primarily concerned about the security of its members’ careers. They know that voters are more likely to punish them if they fail to secure against a repetition of the last attack than if they fail to anticipate the next one.

What can we do? Well, the first step toward solving a problem is recognizing that you have one. These are not iron-clad rules; they’re tendencies. If we can keep these tendencies and their causes in mind, we’re more likely to end up with sensible security measures that are commensurate with the threat, instead of a lot of security theater and draconian police powers that are not.

Our leaders’ job is to resist these tendencies. Our job is to support politicians who do resist.

This essay originally appeared on CNN.com.

EDITED TO ADD (6/4): This essay has been translated into Swedish.

EDITED TO ADD (6/14): A similar essay, on the politics of terrorism defense.

Posted on May 28, 2013 at 5:09 AM

Training Baggage Screeners

The research in G. Giguère and B.C. Love, “Limits in decision making arise from limits in memory retrieval,” Proceedings of the National Academy of Sciences, v. 110, no. 19 (2013), has applications in training airport baggage screeners.

Abstract: Some decisions, such as predicting the winner of a baseball game, are challenging in part because outcomes are probabilistic. When making such decisions, one view is that humans stochastically and selectively retrieve a small set of relevant memories that provides evidence for competing options. We show that optimal performance at test is impossible when retrieving information in this fashion, no matter how extensive training is, because limited retrieval introduces noise into the decision process that cannot be overcome. One implication is that people should be more accurate in predicting future events when trained on idealized rather than on the actual distributions of items. In other words, we predict the best way to convey information to people is to present it in a distorted, idealized form. Idealization of training distributions is predicted to reduce the harmful noise induced by immutable bottlenecks in people’s memory retrieval processes. In contrast, machine learning systems that selectively weight (i.e., retrieve) all training examples at test should not benefit from idealization. These conjectures are strongly supported by several studies and supporting analyses. Unlike machine systems, people’s test performance on a target distribution is higher when they are trained on an idealized version of the distribution rather than on the actual target distribution. Optimal machine classifiers modified to selectively and stochastically sample from memory match the pattern of human performance. These results suggest firm limits on human rationality and have broad implications for how to train humans tasked with important classification decisions, such as radiologists, baggage screeners, intelligence analysts, and gamblers.
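
The core claim is easy to see in a toy simulation. The sketch below is my own illustration, not the authors’ model or code: it compares a memory-limited decider that retrieves only a few stored examples per decision against one that consults its entire memory, training each on either the actual probabilistic labels or idealized, noise-free ones. The cue set, probabilities, and retrieval size are all made-up assumptions chosen to show the effect.

import random

random.seed(0)

CUES = list(range(10))                    # ten discrete stimulus values
P_A = {x: 0.2 + 0.07 * x for x in CUES}   # true probability that category "A" is the right call

def make_memory(n_per_cue, idealized):
    """Store training examples: cue -> list of category labels."""
    memory = {x: [] for x in CUES}
    for x in CUES:
        for _ in range(n_per_cue):
            if idealized:
                # idealized training: every stored label is the cue's most likely category
                label = "A" if P_A[x] >= 0.5 else "B"
            else:
                # actual training: labels sampled from the true probabilistic distribution
                label = "A" if random.random() < P_A[x] else "B"
            memory[x].append(label)
    return memory

def decide(memory, x, k=None):
    """Majority vote over k randomly retrieved traces, or over all traces if k is None."""
    traces = memory[x] if k is None else random.sample(memory[x], k)
    return "A" if traces.count("A") * 2 >= len(traces) else "B"

def accuracy(memory, k, trials=50000):
    """How often the decision matches an outcome drawn from the true distribution."""
    hits = 0
    for _ in range(trials):
        x = random.choice(CUES)
        outcome = "A" if random.random() < P_A[x] else "B"
        hits += decide(memory, x, k) == outcome
    return hits / trials

for name, idealized in [("actual   ", False), ("idealized", True)]:
    mem = make_memory(n_per_cue=200, idealized=idealized)
    print(name,
          "limited retrieval (k=3):", round(accuracy(mem, k=3), 3),
          " exhaustive retrieval:", round(accuracy(mem, k=None), 3))

Under these assumptions, the limited-retrieval decider approaches optimal accuracy only when trained on the idealized labels, while the exhaustive decider does about as well either way—the qualitative pattern the paper reports for humans versus machine classifiers.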

Posted on May 24, 2013 at 12:17 PM

TSA Removing Rapiscan Full-Body Scanners from U.S. Airports

This is big news:

The U.S. Transportation Security Administration will remove airport body scanners that privacy advocates likened to strip searches after OSI Systems Inc. (OSIS) couldn’t write software to make passenger images less revealing.

This doesn’t mean the end of full-body scanning. There are two categories of these devices: backscatter X-ray and millimeter wave.

The government said Friday it is abandoning its deployment of so-called backscatter technology machines produced by Rapiscan because the company could not meet deadlines to switch to generic imaging with so-called Automated Target Recognition software, the TSA said. Instead, the TSA will continue to use and deploy more millimeter wave technology scanners produced by L-3 Communications, which has adopted the generic-outline standard.

[…]

Rapiscan had a contract to produce 500 machines for the TSA at a cost of about $180,000 each. The company could be fined and barred from participating in government contracts, or employees could face prison terms if it is found to have defrauded the government. In all, the 250 Rapiscan machines already deployed are to be phased out of airports nationwide and will be replaced with machines produced by L-3 Communications.

And there are still backscatter X-ray machines being deployed, but I don’t think there are very many of them.

TSA has contracted with L-3, Smiths Group Plc (SMIN) and American Science & Engineering Inc. (ASEI) for new body-image scanners, all of which must have privacy software. L-3 and Smiths used millimeter-wave technology. American Science uses backscatter.

This is a big win for privacy. But, more importantly, it’s a big win because the TSA is actually taking privacy seriously. Yes, Congress ordered them to do so. But they didn’t defy Congress; they did it. The machines will be gone by June.

More.

Posted on January 21, 2013 at 6:38 AM

DHS Gets to Spy on Everyone

This Wall Street Journal investigative piece is a month old, but well worth reading. Basically, the Total Information Awareness program is back with a different name:

The rules now allow the little-known National Counterterrorism Center to examine the government files of U.S. citizens for possible criminal behavior, even if there is no reason to suspect them. That is a departure from past practice, which barred the agency from storing information about ordinary Americans unless a person was a terror suspect or related to an investigation.

Now, NCTC can copy entire government databases—flight records, casino-employee lists, the names of Americans hosting foreign-exchange students and many others. The agency has new authority to keep data about innocent U.S. citizens for up to five years, and to analyze it for suspicious patterns of behavior. Previously, both were prohibited. Data about Americans “reasonably believed to constitute terrorism information” may be permanently retained.

Note that this is government data only, not commercial data. So while it includes “almost any government database, from financial forms submitted by people seeking federally backed mortgages to the health records of people who sought treatment at Veterans Administration hospitals” as well as lots of commercial data, it’s data the corporations have already given to the government. It doesn’t include, for example, your detailed cell phone bills or your tweets.

See also this supplementary blog post to the article.

Posted on January 8, 2013 at 6:28 AM

I Seem to Be a Verb

From “The Insider’s TSA Dictionary“:

Bruce Schneiered: (V, ints) When a passenger uses logic in order to confound and perplex an officer into submission. Ex: “A TSA officer took my Swiss army knife, but let my scissors go. I then asked him wouldn’t it be more dangerous if I were to make my scissors into two blades, or to go into the bathroom on the secure side and sharpen my grandmother’s walking stick with one of the scissor blades into a terror spear. Then after I pointed out that all of our bodies contain a lot more than 3.4 ounces of liquids, the TSA guy got all pissed and asked me if I wanted to fly today. I totally Schneirered [sic] his ass.”

Supposedly the site is by a former TSA employee. I have no idea if that’s true.

Posted on December 28, 2012 at 12:34 PM

The Terrorist Risk of Food Trucks

This is idiotic:

Public Intelligence recently posted a Powerpoint presentation from the NYC fire department (FDNY) discussing the unique safety issues mobile food trucks present. Along with some actual concerns (many food trucks use propane and/or gasoline-powered generators to cook; some *gasp* aren’t properly licensed food vendors), the presenter decided to toss in some DHS speculation on yet another way terrorists might be killing us in the near future.

The rest of the article explains why the DHS believes we should be terrified of food trucks. And then it says:

The DHS’ unfocused “terrorvision” continues to see a threat in every situation and the department seems to be busying itself crafting a response to every conceivable “threat.” The problem with this “method” is that it turns any slight variation of “everyday activity” into something suspicious. The number of “terrorist implications” grows exponentially while the number of solutions remains the same. This Powerpoint is another example of good, old-fashioned fear mongering, utilizing public servants to spread the message.

Hear, hear.

Someone needs to do something; the DHS is out of control.

Posted on November 15, 2012 at 6:45 AM

On the Ineffectiveness of Airport Security Pat-Downs

I’ve written about it before, but not half as well as this story:

“That search was absolutely useless.” I said. “And just shows how much of all of this is security theatre. You guys are just feeling up passengers for no good effect, which means that you get all the downsides of a search—such as annoyed travellers who feel like they have had their privacy violated—without any of the benefits. I could have hidden half a dozen items on my person that you wouldn’t have had a snowball’s chance in a supernova of finding. That’s what I meant.”

“Sir, are you hiding something?” he said, and as he did, I saw three other security guys coming our way. Oh dear.

“Of course not.” I said. “But if I had wanted to, I could have.”

“Why do you have such a problem with being searched?” another security guy said, presumably the first guy’s supervisor.

“Look, I have absolutely no problem with being searched. But if you’re going to do it, do it properly—the plane is no safer at all after this gentleman half-heartedly stroked me for a couple of seconds” I said.

“How do you mean?” the supervisor asked.

“He was stroking me as if he was trying to get me to sleep with him, not as if he was trying to find anything on me.” I said. “I’ve been searched many, many times, and in this case, I could have hidden things in my socks, taped to my thigh, taped to the small of my back, the insides of my upper arms, under my testicles or anywhere on my buttocks.”

“Why have you been searched so many times?” the supervisor asked sharply.

“I’m a police officer. I help train other police officers. When we search someone, we assume that the person who searches us may have a knife or something else they can use to harm us, so we search properly. And yes, this means that you have to take a firm grip of somebody’s groin, yes, this means that you search even the parts that are less comfortable to have searched, and yes, this means that you’re probably going to incur a couple of sexual harassment accusations along the way.” I nodded at the security guard who had searched me. “This fellow here did by far the most useless search I have ever been subjected to, and if I wanted to, I could have smuggled half a dozen knives onto the flight. I don’t have a problem with being searched at all—in fact, if you guys think it’s necessary, I’d be the first to admit that I look a little bit suspicious before I’ve had my first cup of coffee in the morning—but if you’re going to stroke me gently in front of hundreds of people, you’d better buy me a fucking drink first, is all I am saying.”

The security supervisor was standing there, frozen at my rant.

Posted on November 5, 2012 at 6:19 AM
