Automated Targeting System

If you've traveled abroad recently, you've been investigated. You've been assigned a score indicating what kind of terrorist threat you pose. That score is used by the government to determine the treatment you receive when you return to the U.S. and for other purposes as well.

Curious about your score? You can't see it. Interested in what information was used? You can't know that. Want to clear your name if you've been wrongly categorized? You can't challenge it. Want to know what kind of rules the computer is using to judge you? That's secret, too. So is when and how the score will be used.

U.S. customs agencies have been quietly operating this system for several years. Called Automated Targeting System, it assigns a "risk assessment" score to people entering or leaving the country, or engaging in import or export activity. This score, and the information used to derive it, can be shared with federal, state, local and even foreign governments. It can be used if you apply for a government job, grant, license, contract or other benefit. It can be shared with nongovernmental organizations and individuals in the course of an investigation. In some circumstances private contractors can get it, even those outside the country. And it will be saved for 40 years.

Little is known about this program. Its bare outlines were disclosed in the Federal Register in October. We do know that the score is partially based on details of your flight record--where you're from, how you bought your ticket, where you're sitting, any special meal requests--or on motor vehicle records, as well as on information from crime, watch-list and other databases.

Civil liberties groups have called the program Kafkaesque. But I have an even bigger problem with it. It's a waste of money.

The idea of feeding a limited set of characteristics into a computer, which then somehow divines a person's terrorist leanings, is farcical. Uncovering terrorist plots requires intelligence and investigation, not large-scale processing of everyone.

Additionally, any system like this will generate so many false alarms as to be completely unusable. In 2005 Customs & Border Protection processed 431 million people. Assuming an unrealistic model that identifies terrorists (and innocents) with 99.9% accuracy, that's still 431,000 false alarms annually.
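That figure is straightforward base-rate arithmetic; a minimal sketch, using only the numbers above:

```python
# Back-of-the-envelope false-alarm estimate, using the essay's own numbers.
travelers = 431_000_000   # people processed by Customs & Border Protection in 2005
# Generous assumption: the system is wrong only 1 time in 1,000 (99.9% accuracy)
false_alarms = travelers // 1000
print(f"{false_alarms:,} false alarms per year")  # prints 431,000
```

Spread over a year, that is well over a thousand wrongly flagged travelers every day, each of whom has to be investigated by a human.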

The number of false alarms will be much higher than that. The no-fly list is filled with inaccuracies; we've all read about innocent people named David Nelson who can't fly without hours-long harassment. Airline data, too, are riddled with errors.

The odds of this program's being implemented securely, with adequate privacy protections, are not good. Last year I participated in a government working group to assess the security and privacy of a similar program developed by the Transportation Security Administration, called Secure Flight. After five years and $100 million spent, the program still can't achieve the simple task of matching airline passengers against terrorist watch lists.

In 2002 we learned about yet another program, called Total Information Awareness, for which the government would collect information on every American and assign him or her a terrorist risk score. Congress found the idea so abhorrent that it halted funding for the program. Two years ago, and again this year, Secure Flight was also banned by Congress until it could pass a series of tests for accuracy and privacy protection.

In fact, the Automated Targeting System is arguably illegal, as well (a point several congressmen made recently); all recent Department of Homeland Security appropriations bills specifically prohibit the department from using profiling systems against persons not on a watch list.

There is something un-American about a government program that uses secret criteria to collect dossiers on innocent people and shares that information with various agencies, all without any oversight. It's the sort of thing you'd expect from the former Soviet Union or East Germany or China. And it doesn't make us any safer from terrorism.


This essay, without the links, was published in Forbes. They also published a rebuttal by William Baldwin, although it doesn't seem to rebut any of the actual points.

Here's an odd division of labor: a corporate data consultant argues for more openness, while a journalist favors more secrecy.

It's only odd if you don't understand security.

Posted on December 22, 2006 at 11:38 AM • 60 Comments

Comments

antibozo • December 22, 2006 12:16 PM

The so-called rebuttal is completely incoherent. Responding obliquely to Bruce's point that the false positive rate makes the cost of the system far too high per true positive, Baldwin suggests compensating victims of false positives by giving them $20 for their inconvenience.

The guy doesn't get it at all.

RandomGuy • December 22, 2006 12:17 PM

Found this gem the other day:

http://computerworld.com/action/article.do?...

Some choice quotes:

"You don't want government employees having to make a lot of their own evaluations of people on the spot..."

"Immigration agents have to evaluate everyone entering the country. If the system were not in place, the agents would have to make the decisions on their own."

Bruce Schneier • December 22, 2006 12:23 PM

"The so-called rebuttal is completely incoherent. Responding obliquely to Bruce's point that the false positive rate makes the cost of the system far too high per true positive, Baldwin suggests compensating victims of false positives by giving them $20 for their inconvenience."

If I had a bunch of millions to spend on counter-terrorism, I would not want to spend it compensating people for investigating them by accident.

There just has to be a more effective use of the money.

antibozo • December 22, 2006 12:29 PM

Bruce> If I had a bunch of millions to spend on counter-terrorism, I would not want to spend it compensating people for investigating them by accident.

Actually, now that I think about it, isn't this scenario the exact plot setup in Terry Gilliam's Brazil? Kafkaesque, indeed...

Mike • December 22, 2006 12:32 PM

Think of the $20 payout as increasing the cost to the government of dealing with a false positive. Right now, the costs of dealing with false positives are largely borne by the wrongly identified, and there's no measure of accountability on the side of the government. If false positives are (properly) seen as a security failure, and tracked and accounted for in a real, measurable way, then perhaps it will give an incentive to cut down on them.
Of course, the trick is to do so without letting actual threats slip by.

Mike Sherwood • December 22, 2006 12:38 PM

The rebuttal seems like a bunch of random, unrelated statements.

It's easy to suggest giving money to people who get selected or delayed when you have no responsibility for the $8-215 million that it would cost. Not to mention, the government isn't in the business of giving people money for being inconvenienced.

The David Nelson point that Bruce made is one I can relate to well, having worked with someone by that name. A few years ago, just before the holidays when he was planning to travel to visit family, I gave him an article about common names like David Nelson being on the list.
He was searched on each leg of his trip, while his wife and child were not. Since he knew why this was happening, he took the whole thing in good humor and even showed the article to the people detaining him.

The problem with William Baldwin's IRS analogy is that he's blissfully ignorant of how these data processing systems work. Without realizing it, he is advocating that the IRS audit selection algorithm should remain unchallengeable, even if it selects the same group of people every single time it is run. Even worse, each of these people is handed to every auditor, guaranteeing that some people will consistently have bad experiences everywhere they go and will have no way of ever improving their situation. It creates a new travel underclass.

Davi Ottenheimer • December 22, 2006 12:38 PM

Excellent essay.

"If I had a bunch of millions to spend on counter-terrorism, I would not want to spend it compensating people for investigating them by accident."

That might actually be his point, that compensation should be doled out as a "disincentive". While there certainly are proponents of such a system (charge more on the front-end -- let demand sort 'em out -- to cover all the waste and injustice on the back-end) it seems to me a morally bankrupt position to take. Reminds me of some of the sadly inefficient and even unjust Soviet systems we used to study and joke about in econ.

Speaking of which, I wonder when/if the movie "Das Leben der Anderen (The Life of Others)" about the East German secret police will hit the American theaters...

antibozo • December 22, 2006 12:52 PM

Mike, Davi:

Obviously Baldwin intends the $20 payout as a market force to improve the detection accuracy. The problem is that it assumes that the detection accuracy can be markedly improved without fundamentally changing what data is fed into it, which increases cost and diminishes privacy all over again.

Andrew • December 22, 2006 1:08 PM

>> Baldwin suggests compensating victims of false positives by giving them $20 for their inconvenience.

>> The guy doesn't get it at all.

Legal positivists are the true ideological terrorists of the 21st century. They genuinely believe in a calculus of violating people's rights in exchange for (insultingly little) monetary compensation.

The right to be free of unwarranted government intrusion is not a "cost" which can be "compensated" for.

If you haven't read Posner's latest book, go do that and shudder a bit.

If they really wanted to engage in the calculus, they should allow sellers to set the price of their privacy and stop violating the rules of the free marketplace.

I set the cost of violating my Constitutional rights at $100,000,000 (one hundred million dollars).

I feel certain others might agree.

(I am a former sociologist. Note: former.)

RandomGuy • December 22, 2006 1:26 PM

@Craig:

Thanks. I think this guy just has way too much faith in computers, and way too little faith in people.

Josh O • December 22, 2006 1:29 PM

Did Congress really find TIA so abhorrent, or was the public backlash simply enough to force them to act?

Michael Ash • December 22, 2006 1:29 PM

It's tremendously amusing that Baldwin defends the program by comparing it to IRS audits, perhaps the aspect of government which is most feared and hated by the average citizen. While I actually agree somewhat with the point he's trying to make in that comparison (targeted strong acts being more effective than harassing everybody just a little bit), it may well be the worst possible comparison he could make in terms of actually selling his idea.

swiss connection • December 22, 2006 1:38 PM

@antibozo

"The guy [William Baldwin] doesn't get it at all"

I would like to agree but suspect a more sinister motive. His article is just plain strategic disinformation.

antibozo • December 22, 2006 1:38 PM

Michael Ash> While I actually agree somewhat with the point he's trying to make in that comparison (targeted strong acts being more effective than harassing everybody just a little bit)

There's a big difference between IRS audits and this system, namely that the IRS bases its audit decision on a quality assessment of the actual data to be audited. That is, they have your tax forms, reported income, and income history, and they make a judgment on whether your current report is plausibly consistent with your history. (And they still have a lot of false positives.) In contrast, the ATS (apparently, at least) bases its decision on data (travel history) that is almost completely statistically irrelevant to whether the subject is in fact a terrorist.

A better comparison Baldwin could make would be between ATS and getting the IRS to give you a terrorist score based on your financials. Arguably, the IRS might do a better job because the input data would be of higher quality. (And they still would have a lot of false positives.)

Richard Migneron • December 22, 2006 1:54 PM

Well, it's things like this, and the fingerprinting and picture taking at customs, that make me never want to go to the US.

For me, the risks of going to the US are so great for my identity that I'm not about to come visit ever again!

Too bad, I still have so much to see ...

Kent • December 22, 2006 2:09 PM

Thank you. Your digging and commentary are vitally important to me and my friends. You should be Special Adviser to the Secy of Homeland Security. Does he even know your name?

Geoff Lane • December 22, 2006 3:49 PM

"Immigration agents have to evaluate everyone entering the country. If the system were not in place, the agents would have to make the decisions on their own."


Which, amazingly enough, is what they do very successfully every day. The clueless dismissal of a group of people who know the real skills that are so needed on borders is indicative of an arrogance that is scary in somebody who could exercise so much power over the public.

Borders leak. No scheme is perfect. To pretend otherwise, to claim that a new numerology will be better, is to display contempt for the public who pay their wages.

Teach the public what is possible; teach the public about the true level of terrorist threat compared to crossing the road or living in a state with weak gun laws and a normal level of mental incompetence.

Chris Smith • December 22, 2006 4:26 PM

Bruce,
Your posts are excellent, and I did buy your book.
Two ways to improve would be to a) move beyond criticising the existing approaches and actually come out with fresh, implementable ideas to improve security, and
b) enter politics yourself. Either get on a ballot, or, if you can't locate enough cleaning supplies to support public life ;) , at least help manage some sort of political party. Guys like you and Lessig are gold, and the leadership of the future.

Anonymous Coward • December 22, 2006 5:04 PM

"There is something un-American about a government program that uses secret criteria to collect dossiers on innocent people and shares that information with various agencies, all without any oversight."

"Un-American" is an adjective that is losing its meaning rapidly.

anonymous • December 22, 2006 5:21 PM

Hah, $500 for missing a plane?

If only I were on the list! Buy cheap tickets, arrive late, collect $500 and go home. (Anyone think the system would learn?)

Redistribution of wealth at its finest; would the $500 be tax-free? Aside from the idiotic statements relating ATS to the IRS (though I suppose contrasting two illegal systems is perfectly OK; obscurity is good, eh?), the rebuttal makes even less sense economically.

Fraud Guy • December 22, 2006 11:22 PM

Actually, the disincentive would not work, because it is not the government's money, or the money of the programmers who created the algorithms, but our tax dollars given back to us. Now if the money came out of the pay and assets (when the salary ran out) of Mr. Chertoff and the chief executive who approved it... would they be so keen to support it?

Other Buried News • December 23, 2006 2:28 AM

This essay was too good to be buried on a pre-holiday-weekend "squid Friday".

In other buried news, DHS' own Privacy Office released a report yesterday on the Secure Flight program: http://www.eff.org/deeplinks/archives/005051.php

It details the discrepancy between how the program's privacy-related stuff was described publicly in the official System of Record Notice (SORN) and how it ended up being implemented.

Among the report's recommendations:
1. Privacy expertise should be embedded into a program from the beginning [...]

6. Privacy notices should be revised and republished when program design plans change materially [...]

7. Program use of commercial data must be made as transparent as possible [...]

I don't know much about the DHS Privacy Office, but I really think they "get it" and their recommendations are spot-on. I hope Bruce writes something on this topic, b/c I'm curious to know what others will think.

Now here's hoping that this report is actually taken seriously by DHS and the congressfolks responsible for oversight.

Shefaly • December 23, 2006 3:38 AM

"Here's an odd division of labor: a corporate data consultant argues for more openness, while a journalist favors more secrecy."

I think it just shows how much confidence each has in the American values that each is defending.

Clearly the journalist does not have much, and he is in plenty of company, which is why much American media is frankly not worth reading.

Bobby B • December 23, 2006 6:49 AM

> take your $20 and shove it up your ass.

I guess they could leave it there during the search ;)

WhatDidYouExpect • December 23, 2006 11:02 AM

Is anybody really surprised by this? The normal credit score is a risk assessment metric, yet it still depends on the accuracy of the input data. Insurance companies increasingly use an "insurance credit score" to assess the risk to the insurance company that you will submit a claim (they have switched from providing insurance benefits to providing stockholder value). I have seen some dissertations on the risk that one will commit a crime (similar to the insurance credit score), or exhibit personality disorders, or contract diseases. Now, the terrorist risk assessment (and its look-alikes). One or more of these depend on the same questionable qualities: hidden algorithms and secrecy of the data and processes. Wait till the show trials start.

StopTheFeds • December 23, 2006 1:40 PM

The government is intent on not receiving citizens' comments on this proposal either. Have any of you seen the horrible regulations.gov website? It amazes me what they did there...

I do have good news though. There's an initiative that makes it a piece of cake to submit your comments to the feds - http://ws.privacyalertnetwork.net/points/point?...

Enjoy. Tell them what you think. It matters.

antibozo • December 23, 2006 2:21 PM

Fraud Guy> Actually, the disincentive would not work, because it is not the government's money, or the money of the programmers who created the algorithms, but our tax dollars given back to us.

True enough, but the biggest point Baldwin misses is that an overwhelming disincentive is already built in: even at an unrealistically high accuracy rate, there will be so many false positives that the screeners will be unable to keep up with the system, and the quality of their work will go down, not up. Adding a monetary penalty only makes things worse by taking funding away from other measures that could actually work.

What ATS may amount to more than anything else is an attempt to diminish the potential liability incurred by having individuals make the judgment calls themselves. On the theory that someone could interview all the personnel who make these decisions and try to devise an algorithm to model their process, such an algorithm would protect the screening staff from accusations of singling people out based on their own hard-to-defend prejudices. That the accuracy of the algorithm-based decisions might be orders of magnitude lower wouldn't matter to staff who just don't want to be accused of racism. I.e. if they build a system that algorithmically tells them to screen all people who've traveled to the Middle East, then it looks less racist than if they make the same decision themselves. Then they can still single out anyone they like, and with a high rate of false positives, they're almost guaranteed to be backed up by the system's result anyway.

This may be the motivation behind most wholesale surveillance systems. By ostensibly monitoring everyone while secretly focusing solely on the data collected on select individuals or groups already under suspicion, THEY appear more egalitarian than when THEY train the electronic eye only on the people of interest. It's harder to prove individuals are being persecuted when everyone's superficially subject to the same scrutiny.

Erik V. Olson • December 23, 2006 2:39 PM

It has a perfectly reasonable use to our government. It will make it easier to find those who speak out against it and punish them.

Mo • December 23, 2006 4:57 PM

@Chris Smith: the same thing is always said by people who attack intelligent critics. Noam Chomsky in an interview answered this one so well: "I do offer positive implementable solutions. It's just that people don't like hearing them."
Bruce is no different, from three-piece identification to cryptography-related articles. The problem is that a) people don't want to hear that the bank website isn't secure just because it's got SSL, and b) people are actually too lazy to read what is being said... this very article provides very good clues as to what to do.

Anyways, I wanted to comment on what @RandomGuy said, and is actually a question for Bruce himself.
RandomGuy brings up a topic which is rotting away at our society as a whole. (I will give examples shortly)
The quote "If the system were not in place, the agents would have to make the decisions on their own." pretty much sums it up.
'Accountability' is the new god of our times. In this capitalist world, money is the value by which we measure everything. Inevitably, this has led to the creation of entire industries built around the flow of money and guaranteeing that the people who have it do not lose it. Whether you agree with me or not on the capitalist spin is of no importance; the important thing is that people think the above way because humans are inherently not trustable or accountable in the face of the market. (Not law.)
When an airliner crashes, killing all on board, and it is deemed pilot error, the airline, but more importantly the company that insures it, has nothing to blame. That's the end of the road of accountability, and hence they have to pay out on lawsuits.
It follows that every truly capitalist company's goal is to find a way to turn that unknown or undefinable factor into something rigid. Which is where fully automated procedures come in. Now, another venturing company can come and say "I'll pilot your planes with machines" and open up a huge chain of accountability internally. They calculate on paper: "it'll take 4 years to build this bulletproof system, which we can charge 5 million dollars for, and we get insurance, etc." Now, when the airliner crashes, there's someone to sue: the company who manufactured the software for the machines.

People argue that it's a good thing to have accountability, that it increases the effectiveness and the state of the art by forcing it up using the free market. But does it really?
What I see it doing is trying to simply remove intangible human skill.

Here's a bunch of illustrative examples: Cirque du Soleil's overriding effort in R&D these days is to build bigger and more precise mechanical special effects. The reason, if you ask me? It's becoming increasingly difficult to find replaceable artists who will do the job, because to Cirque it's an industry, and the artist is just a PCB you toss into a mainframe machine. It's sure as heck not the reason why the show is great. So they have to find ways in which to toss people out of cannons even higher than before, because they know they can't just pick anyone from the gym team and ask him to do a double back flip *and* at the same time be artistic. It's just not profitable anymore to train these people for more than 3 months. So, they go with predictable machinery, and decreasingly skilled and talented artists, in a bid to compensate for the drastic demand for new shows (they have 7 new shows lined up in pre-production).
The result: increasingly plastic-looking, inhuman spectacles that bore me as an audience member, and most likely a lot of other people too.

Other examples? Making these 178-point checklists for airport officials. It's nothing but a way for the underpaid, undereducated working people to wipe their hands clean of responsibility, and at the same time have the higher-ups pretend that the chain of liability and accountability has been fulfilled. Does it work? We all know it doesn't. But is that really important? I don't actually think it is.

The problem is that as long as the chain of accountability is closed, the predicate determining whether this system is successful or not is fulfilled because this predicate measures only dollar value, not human value. And in this day and age that predicate almost always ends up being insurance companies and lawyers.

The bottom line is: does it matter that a bank's website being SSL'd is not enough to ensure security? The answer: only if the bank has nobody to sue.
Does it matter if the government can't do its job of making it safer for us? Not as long as it can blame and imprison people. Guilty or not.

This is an open-ended post. I'd love your comments if you have any.

antibozo • December 23, 2006 5:13 PM

idleline> Hate to be a stickler for Math but isn't 99.9% of 431 million 1.43 million?

Uh, no.

Assume that ATS incorrectly identifies someone as a terrorist only 1 in 1000 times. That's a 0.1% false positive rate, which is what Bruce means by 99.9% accuracy, and is unrealistically generous given the mediocre quality of the input data, let alone the entire concept. The number of false positives then would be .001 * 431 million, which is 431 thousand.

If we also very generously assume there are 4310 real terrorists in the system, and that ATS identifies every one of them (a zero false negative rate), then for every real terrorist, screeners must investigate 100 others. Will the screeners identify that one real terrorist among the other 100? If so, we'd kind of expect to have heard of it happening, at least once. Has anyone ever heard of a terrorist being discovered in screening during the time ATS is supposed to have been operating?
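Laid out explicitly (every number here is one of the deliberately generous assumptions above, not real data):

```python
# Base-rate sketch using the deliberately generous assumptions above:
# 431M travelers, 4,310 real terrorists, 0.1% false positives, no false negatives.
travelers = 431_000_000
real_terrorists = 4_310

false_alarms = (travelers - real_terrorists) // 1000  # 0.1% of the innocent
flagged = false_alarms + real_terrorists              # everyone the system stops

print(f"people flagged:               {flagged:,}")
print(f"innocents per real terrorist: {round(false_alarms / real_terrorists)}")
```

So even under these assumptions, roughly 100 innocent people are flagged for every real terrorist; and no real system comes anywhere near a 0.1% error rate on data this poor.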

Mo • December 23, 2006 5:58 PM

PS. I find the intrusion on privacy these days to be a direct result of accountability. E.g. priority number one is to always have someone to blame. Hence anonymity is not an option.

Michael Ash • December 23, 2006 6:47 PM

@Chris Smith, "move beyond criticising the existing approaches and actually come out with fresh, implementable ideas to improve security"

Bruce has done this. He constantly presents realistic, workable, and useful ideas which would make airports simultaneously safer, less intrusive, and cheaper. The problem is that his ideas are too simple to look effective, and so most people dismiss them without even realizing what they've read.

Put Bruce in charge of these things, and I mean really in charge, not just a figurehead post, and I imagine the situation would improve drastically. However, this will never happen, because government's primary goal is to increase its own power, not make us safer, and definitely not spend less money.

John in Sammmish • December 23, 2006 8:33 PM

The question I have not seen asked is how many terrorists have been detained or turned away by the system. In other words, does it work? Is the system another Maginot Line, a false sense of security for a large amount of money? Secrecy is necessary to prevent the embarrassment of government waste.

Norm • December 23, 2006 10:34 PM

"Additionally, any system like this will generate so many false alarms as to be completely unusable."

Getting rid of false alarms is easy: if you have 10 times too many false alarms, then throw away 9 out of each 10 alarms. Done.

Typically, however, there are better ways.

It's a problem in hypothesis testing. The main result for guidance is the Neyman-Pearson lemma. Here, on the variables we are measuring, we have (A) the distribution of people who are not terrorists and (B) the distribution of terrorists. Then, for a given probability of false alarms, we select the region, where we will raise an alarm, that gives the highest detection rate of terrorists for the false alarm rate we have selected. Doing this is like investing: We regard the false alarms as money and put it where we get the best ROI.
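A toy illustration of that construction, assuming two one-dimensional Gaussian populations (entirely hypothetical numbers, for the shape of the argument only):

```python
import math

# Toy Neyman-Pearson sketch: two hypothetical 1-D Gaussian populations.
# Class A (not terrorists) has mean 0; class B (terrorists) has mean 2; sd 1 for both.
def pdf(x, mu, sd=1.0):
    return math.exp(-((x - mu) ** 2) / (2 * sd * sd)) / (sd * math.sqrt(2 * math.pi))

def alarm(x, threshold):
    # Raise an alarm where the likelihood ratio p_B(x) / p_A(x) exceeds the threshold.
    return pdf(x, 2.0) / pdf(x, 0.0) > threshold

# For equal-variance Gaussians the ratio is monotone in x, so each threshold
# is just a cutoff on x; sweeping it trades false alarms against detections.
for t in (0.5, 1.0, 2.0):
    cutoff = (math.log(t) + 2.0) / 2.0  # solve p_B/p_A = t for x
    print(f"threshold {t}: alarm when x > {cutoff:.2f}")
```

Raising the threshold lowers both the false-alarm rate and the detection rate; the lemma only guarantees that no other alarm region does better at a given false-alarm rate.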

Yes, nearly always accepting a higher false alarm rate will give us a higher detection rate.

We get the false alarm rate we asked for, and we get the highest detection rate we can. But, that doesn't mean that the detection rate we get is very high! If we have a nicely low false alarm rate, then our detection rate may still be so low as to be nearly useless.

I agree that the variables mentioned promise to give poor results.

As is common in many of the classic distribution-free tests, it is possible to proceed with no distributions at all. Here we can still select our false alarm rate, but we will have next to no information about detection rate.

I have some results in distribution-free, multi-dimensional tests and, there, some asymptotic results on a useful but special 'average' detection rate. So, it is possible to say something.

One reason for crude testing is as the first step in 'sequential testing' where, when we get an alarm, we invest the effort in more testing. So, the first rather crude test may be just an initial, first-cut, crude filter.

But, as is very well known in computer security, the 'bad' guys that are the targets of the searching try to look 'normal'. Then, over time they can slowly look less and less normal and, thus, corrupt the historical data of what is regarded as 'normal' and slowly get the detector to accept them without changing their looks.

comeon • December 23, 2006 11:24 PM

You people are insane. Thousands travel each and every day and never have a problem. If some level of privacy needs to be sacrificed in order to protect them, then so be it. The ones that we need to worry about are clearly not worrying about 'rules', so people who are smarter than a gate checker and computers must be relied on. Something tells me that you are all the same folks that used to be up in arms about browser cookies being stored on a computer. Aside from the people who were doing something wrong, think honestly about how much information was really used against you when a company built a profile on you so that they could serve you an ad. Get over yourselves... big brother could give a sh*t about you... it's the bad ones that they're really really worried about. Just let them do their damn jobs.

Kjetil Jorgensen • December 24, 2006 4:45 AM

@comeon

Yes! Let's sacrifice the civil liberties of the David Nelsons and go about our lives. The illusion of safety is clearly worth more than the David Nelsons' time and dignity.

As has been mentioned before in this thread, the actual terrorist threat is still insignificant compared to the threat of being killed in more mundane ways. This is expensive theatre, and it's eroding civil liberties as an added bonus. This fear-mongering is probably encouraging even more terrorism: by appearing genuinely scared of terrorism, we're feeding them success. Statistically you should be more scared of traffic than terrorism; it might not be as spectacular, but it sure as hell is more deadly.

Richard Braakman • December 24, 2006 6:41 AM

@comeon:

I suspect that a number of people reading this blog ARE actually "the bad ones", namely people who are openly critical of their government.

And government doesn't have to give a shit about you in order to ruin your life. In fact it's usually the opposite.

greg • December 24, 2006 6:55 AM

@comeon

You, sir, are an idiot. We here work with computers. We know how well they don't work.

Big brother couldn't find *the* bad guy if their lives depended on it. So they find whatever the computer spits back at them. Which might be you next time. And they assume that it's never wrong. 99% of the "hits" will be innocent people who have done nothing, while the bad guys get to go through without any extra checks.

This has happened before, and it always ends the same. This will too.

"If some level of privacy needs to be sacrafices in order to protect then so be it"

What is security if we have no freedom? Worthless. Its not some level of privacy. Its and undisclosed level of privacy. If they can keep a secret why can't we?

Do the world a favour, and shut up or/and die.

Bruce Schneier • December 24, 2006 11:06 AM

@comeon

"If some level of privacy needs to be sacrificed in order to protect, then so be it."

I think you're missing my argument here. I'm not saying that we should never sacrifice privacy in order to gain security. I'm saying that, in this case, we're sacrificing privacy, liberty, and security from the government (remember, there was a good reason the Constitution was written to protect people from the government), and we're not getting any security from terrorism in return.

rfid • December 24, 2006 11:18 AM

It's a problem in hypothesis testing, and the main guiding result is the Neyman-Pearson lemma. Here, on the variables we are measuring, we have (A) the distribution of people who are not terrorists and (B) the distribution of terrorists. Then, for a given probability of false alarm, we select the alarm region that gives the highest detection rate of terrorists at the false-alarm rate we have selected. Doing this is like investing: we regard the false alarms as money and put it where we get the best ROI.
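As a concrete illustration of that lemma, here is a minimal numerical sketch assuming two hypothetical Gaussian score distributions; the means, variances, and rates are invented for illustration and come from no real screening system:

```python
from statistics import NormalDist

# Hypothetical 1-D "risk score" distributions; these numbers are
# illustrative assumptions, not taken from any real system.
non_terrorists = NormalDist(mu=0.0, sigma=1.0)   # hypothesis A
terrorists = NormalDist(mu=2.0, sigma=1.0)       # hypothesis B

def detection_rate_at(false_alarm_rate):
    """For equal-variance Gaussians, the Neyman-Pearson optimal alarm
    region is a simple threshold: fix the false-alarm rate under A,
    then read off the detection rate under B."""
    threshold = non_terrorists.inv_cdf(1.0 - false_alarm_rate)
    return 1.0 - terrorists.cdf(threshold)

for far in (0.001, 0.01, 0.1):
    print(f"false-alarm rate {far:6.1%} -> detection {detection_rate_at(far):.1%}")
```

The tradeoff is stark: tolerating more false alarms is the only way to buy a higher detection rate, which is exactly the "invest the false alarms where they pay best" framing above.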

Davi Ottenheimer • December 24, 2006 1:38 PM

"Just let them do their damn jobs."

Heh, the circular reasoning is dizzying. Insanity indeed.

I'll try that "it's my damn job" argument next time I want to stop someone from storing browser cookies on their computer.

;)

Duality • December 25, 2006 5:37 AM

Bruce has clearly explained on numerous occasions the impracticability of having to make perhaps 1000 arrests/investigations to stop one terrorist.

In reality, the ratio might be closer to 10,000 or 100,000 to one, even more astronomically impractical.
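Those ratios drop straight out of Bayes' theorem. A quick back-of-the-envelope calculation, with assumed rather than official numbers:

```python
# Base-rate arithmetic behind the ratios above. All numbers are
# assumptions for illustration: roughly 1 terrorist per 10 million
# travelers, and a screener that is 99% accurate in both directions
# (far better than any real system is likely to be).
base_rate = 1 / 10_000_000          # P(terrorist)
sensitivity = 0.99                  # P(flagged | terrorist)
false_positive_rate = 0.01          # P(flagged | innocent)

p_flagged = (sensitivity * base_rate
             + false_positive_rate * (1 - base_rate))
p_terrorist_given_flag = sensitivity * base_rate / p_flagged

print(f"P(terrorist | flagged) = {p_terrorist_given_flag:.2e}")
print(f"flags per true hit: {round(1 / p_terrorist_given_flag):,}")
```

Even with these generous assumptions, on the order of a hundred thousand people get flagged for every actual terrorist found.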

Why does such advice so frequently fall on deaf ears? Could it be because any arrest releases thousands of dollars in funds? Denying people flight also earns money: the TSA can show they're doing their job, and a hiccup in a person's travel plans might increase that person's spending by hundreds or thousands of dollars.

Bruce, whether deliberately or naively, frames his discussions around the real security value of various measures, rarely or never touching on the systemic biases that push security decisions toward maximizing funding.

Similarly, the US criminal justice system is, to my perception, oriented more toward maintaining its funding sources than toward bringing 'justice' in the abstract.

We can criticize what we see as wrong. What I'd like to know is how effective such opinion is at influencing organizational structures that are in many cases driven by organizational survival rather than by the abstract principles they are mandated to uphold.

I suppose framing security issues has a value for Bruce's clients, colleagues and readers sufficient to justify his commentary.

There is a risk of mistaking his clear analysis of issues for optimism. The fact that security practice is often diametrically at odds with Bruce's recommendations suggests to me that the demand for practical security is overwhelmed by other biases.

That the decision makers opt for poor security or security in appearance only is not because they are stupid (as so many antiestablishment commentators are wont to label them) but because, in my opinion, they are responding to other considerations.

Alex • December 25, 2006 11:22 AM

Doesn't anyone find it odd that we (Americans) are now doing what we used to criticize the USSR for doing back in the day? I'm confused about how we used to abhor such behavior and now embrace it ourselves. At least we still have free speech (for the moment)...

antibozo • December 25, 2006 1:10 PM

Alex> Doesn't anyone find it odd that we (Americans) are now doing what we used to criticize the USSR for doing back in the day?

Yeah, but we're the *good* guys. ;^)

Lisa • December 26, 2006 12:40 AM

@Anonymous Coward

""Un-American" is an adjective that is losing its meaning rapidly."

It's not like Un-American had much of a meaning in the first place. Just ask those who were targeted by the House Committee on Un-American Activities.

John • December 28, 2006 7:58 AM

Doesn't anyone find it odd that we (Americans) are now doing what we used to criticize the USSR for doing back in the day?

Caradoc • December 28, 2006 11:10 AM

Actually, it is not necessary for the government to keep a dossier on fliers. Instead, it puts hooks into the ticketing system and runs a check on every PNR. Once it has a name, address, and credit card, it can transmit a query to a commercial database like Acxiom's and run checks against various watch lists. From this aggregation of data, it makes a decision and then discards the data. There are a huge number of PNRs to process, but today's machines can handle the load.

This approach is actually superior to building permanent dossiers, since it is always up to date.
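The stateless check described above can be sketched in a few lines. Every name and data source here (Pnr, lookup_commercial_db, WATCHLIST, screen) is hypothetical, since the real systems and their interfaces are not public:

```python
from dataclasses import dataclass

@dataclass
class Pnr:
    """Minimal stand-in for a Passenger Name Record."""
    name: str
    address: str
    credit_card: str

# Stand-in for the watch lists queried at booking time.
WATCHLIST = {"J. Doe"}

def lookup_commercial_db(pnr: Pnr) -> dict:
    # Placeholder for a query to a commercial aggregator such as
    # Acxiom; a real lookup would return far richer derived data.
    return {"name": pnr.name, "address": pnr.address}

def screen(pnr: Pnr) -> str:
    record = lookup_commercial_db(pnr)
    flagged = record["name"] in WATCHLIST
    # The key property: only the decision is returned, and the
    # aggregated record goes out of scope immediately -- nothing is
    # stored, so there is no permanent dossier to go stale.
    return "secondary screening" if flagged else "cleared"

print(screen(Pnr("J. Doe", "1 Main St", "4111...")))    # secondary screening
print(screen(Pnr("A. Smith", "2 Oak Ave", "5500...")))  # cleared
```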

srcdbgr • January 18, 2007 8:58 AM

"There is something un-American about a government program that uses secret criteria to collect dossiers on innocent people and shares that information with various agencies, all without any oversight. It's the sort of thing you'd expect from the former Soviet Union or East Germany or China."

Sorry to disagree with that statement. From outside the US, it seems very American to me, at least for the last few years. The home of the brave and the free? Not anymore.

Anastacia • March 6, 2007 12:42 PM

Has anyone ever heard of 19 USC 482? It allows Customs to search cars and people without suspicion when they're coming from a foreign country. Although terrorism may be on everyone's mind, Customs is still looking for drugs and illegal immigrants, and ATS allows them to do this more efficiently. According to the DHS press release, they have taken data from former violators (drug smugglers and the like, presumably) and use that in their ATS formula.

I don't like crossing the border any more than the next person does, but if I fit the pattern of a drug dealer or terrorist, I deserve to be searched, even though I'm a law-abiding U.S. citizen.

And it's common sense that the government doesn't release its ATS factors. A terrorist or drug smuggler would take that information and do the opposite.

Is this violating our privacy? Well, no one's had privacy at the border since 19 USC 482 was adopted. I'm sure everyone complaining about ATS will be the first to ask, "why didn't the government protect me?" when the next terrorist attack happens. And what is the government to say? "We tried, but using high-tech data was ruled illegal by some ignorant people."

YourNeighbor • June 1, 2009 12:41 AM

Times are indeed changing.

I was minding my own business, and recently found that I was being targeted by a bunch of NSA/FBI/COP nut cases that just had to do everything in their power to find me guilty of something.

Trying to entrap me numerous times, following me, putting various kinds of mind-altering poisons in my apartment's environment, food, and air, recording me with illegal cameras, perhaps for years, and finally using some kind of perverted brainwashing to try to make me go insane.

Having failed that, following me wherever I go, and putting tracking devices in my car, and even worse.

And all this without being formally charged. And it still continues.

No. These people could not possibly justify their miserable existence and are doomed to go to hell.

And at the first chance, any chance, I will make sure as many of these idiots lose their jobs as possible. They are incompetent and do more harm than good.

And now there is talk of new satellites that transmit harmful radiation daily to slowly kill targeted individuals, disrupt their sleep patterns, and drive them insane.

No. This is very much communism; it would make any American ashamed of being an American.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc..