Entries Tagged "profiling"

The Difficulty of Profiling Terrorists

Interesting article:

A recently completed Dutch study of 242 Islamic radicals convicted or accused of planning terrorist attacks in Europe from 2001 to 2006 found that most were men of Arab descent who had been born and raised in Europe and came from lower or middle-class backgrounds. They ranged in age from 16 to 59 at the time of their arrests; the average was 27. About one in four had a criminal record.

The author of the study, Edwin Bakker, a researcher at the Clingendael Institute in The Hague, tried to examine almost 20 variables concerning the suspects’ social and economic backgrounds. In general, he determined that no reliable profile existed—their traits were merely an accurate reflection of the overall Muslim immigrant population in Europe. “There is no standard jihadi terrorist in Europe,” the study concluded.

In an interview, Bakker said that many local police agencies have been slow to abandon profiling, but that most European intelligence agencies have concluded it is an unreliable tool for spotting potential terrorists. “How can you single them out? You can’t,” he said. “For the secret services, it doesn’t give them a clue. We should focus more on suspicious behavior and not profiling.”

Posted on March 13, 2007 at 5:42 PM

Song Parody

“Strangers on my Flight.”

EDITED TO ADD (1/8): This post has generated much more controversy than I expected. Yes, it’s in very poor taste. No, I don’t agree with the sentiment in the words. And no, I don’t know anything about the provenance of the lyrics or the sentiment of the person who wrote or sang them.

I probably should have said that, instead of just posting the link.

I apologize to anyone I offended by including this link. And I am going to close comments on this thread.

Posted on January 5, 2007 at 12:22 PM

Automated Targeting System

If you’ve traveled abroad recently, you’ve been investigated. You’ve been assigned a score indicating what kind of terrorist threat you pose. That score is used by the government to determine the treatment you receive when you return to the U.S. and for other purposes as well.

Curious about your score? You can’t see it. Interested in what information was used? You can’t know that. Want to clear your name if you’ve been wrongly categorized? You can’t challenge it. Want to know what kind of rules the computer is using to judge you? That’s secret, too. So is when and how the score will be used.

U.S. customs agencies have been quietly operating this system for several years. Called Automated Targeting System, it assigns a “risk assessment” score to people entering or leaving the country, or engaging in import or export activity. This score, and the information used to derive it, can be shared with federal, state, local and even foreign governments. It can be used if you apply for a government job, grant, license, contract or other benefit. It can be shared with nongovernmental organizations and individuals in the course of an investigation. In some circumstances private contractors can get it, even those outside the country. And it will be saved for 40 years.

Little is known about this program. Its bare outlines were disclosed in the Federal Register in October. We do know that the score is partially based on details of your flight record—where you’re from, how you bought your ticket, where you’re sitting, any special meal requests—or on motor vehicle records, as well as on information from crime, watch-list and other databases.

Civil liberties groups have called the program Kafkaesque. But I have an even bigger problem with it. It’s a waste of money.

The idea of feeding a limited set of characteristics into a computer, which then somehow divines a person’s terrorist leanings, is farcical. Uncovering terrorist plots requires intelligence and investigation, not large-scale processing of everyone.

Additionally, any system like this will generate so many false alarms as to be completely unusable. In 2005 Customs & Border Protection processed 431 million people. Assuming an unrealistic model that identifies terrorists (and innocents) with 99.9% accuracy, that’s still 431,000 false alarms annually.
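The arithmetic behind that estimate is simple base-rate math, and it can be sketched in a few lines (assuming, as the essay does, a uniform 0.1% error rate across all travelers):

```python
# Base-rate sketch: even a model that is right about 99.9% of people
# produces a flood of false alarms when applied to everyone.
travelers = 431_000_000      # people processed by CBP in 2005
false_positive_rate = 0.001  # the 0.1% of innocents wrongly flagged

false_alarms = round(travelers * false_positive_rate)
print(false_alarms)  # 431000 innocent travelers flagged per year
```

And since actual terrorists are vanishingly rare among 431 million travelers, nearly everyone the system flags will be innocent.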

The number of false alarms will be much higher than that. The no-fly list is filled with inaccuracies; we’ve all read about innocent people named David Nelson who can’t fly without hours-long harassment. Airline data, too, are riddled with errors.

The odds of this program’s being implemented securely, with adequate privacy protections, are not good. Last year I participated in a government working group to assess the security and privacy of a similar program developed by the Transportation Security Administration, called Secure Flight. After five years and $100 million spent, the program still can’t achieve the simple task of matching airline passengers against terrorist watch lists.

In 2002 we learned about yet another program, called Total Information Awareness, for which the government would collect information on every American and assign him or her a terrorist risk score. Congress found the idea so abhorrent that it halted funding for the program. Two years ago, and again this year, Secure Flight was also banned by Congress until it could pass a series of tests for accuracy and privacy protection.

In fact, the Automated Targeting System is arguably illegal, as well (a point several congressmen made recently); all recent Department of Homeland Security appropriations bills specifically prohibit the department from using profiling systems against persons not on a watch list.

There is something un-American about a government program that uses secret criteria to collect dossiers on innocent people and shares that information with various agencies, all without any oversight. It’s the sort of thing you’d expect from the former Soviet Union or East Germany or China. And it doesn’t make us any safer from terrorism.

This essay, without the links, was published in Forbes. They also published a rebuttal by William Baldwin, although it doesn’t seem to rebut any of the actual points.

Here’s an odd division of labor: a corporate data consultant argues for more openness, while a journalist favors more secrecy.

It’s only odd if you don’t understand security.

Posted on December 22, 2006 at 11:38 AM

When Computer-Based Profiling Goes Bad

Scary story of someone who was told by his bank that he’s no longer welcome as a customer, because the bank’s computer noticed a deposit that wasn’t “normal.”

After two written complaints and a phone call to customer services, a member of the “Team” finally contacted me. She enquired about a single international deposit into my account, which I then explained to be my study grant for the coming year. Upon this explanation I was told that the bank would not close my account, and I was given a vague explanation of them not expecting students to get large deposits. I found this strange, since it had not been a problem in previous years, and even stranger since my deposit had cleared into my account two days after the letter was sent.

In terms of recent “suspicious” transactions, this left only two recent international deposits: one from my parents overseas and one from my savings, neither of which could be classified as large. I’m not an expert on complex behavioural analysis networks and fraud detection within banking systems, but would expect that study grants and family support are not unexpected for students?

Moreover, rather than this being an isolated incident, it would seem that HSBC’s “account review” affected a number of people within our student community, some of whom might choose not to question the decision and may be left without bank accounts. This should raise questions about the effectiveness of their fraud detection system, or possibly a flawed behaviour model for a specific demographic.

Expect more of this kind of thing as computers continue to decide who is normal and who is not.

Posted on December 18, 2006 at 6:37 AM

Forecasting Murderers

There’s new software that can predict who is likely to become a murderer.

Using probation department cases entered into the system between 2002 and 2004, Berk and his colleagues performed a two-year follow-up study—enough time, they theorized, for a person to reoffend if he was going to. They tracked each individual, with particular attention to the people who went on to kill. That created the model. What remains at this stage is to find a way to marry the software to the probation department’s information technology system.

When caseworkers begin applying the model next year they will input data about their individual cases – what Berk calls “dropping ‘Joe’ down the model”—to come up with scores that will allow the caseworkers to assign the most intense supervision to the riskiest cases.

Even a crime as serious as aggravated assault—pistol whipping, for example—”might not mean that much” if the first-time offender is 30, but it is an “alarming indicator” in a first-time offender who is 18, Berk said.

The model was built using adult probation data stripped of personal identifying information for confidentiality. Berk thinks it could be an even more powerful diagnostic tool if he could have access to similarly anonymous juvenile records.

The central public policy question in all of this is a resource allocation problem. With not enough resources to go around, overloaded case workers have to cull their cases to find the ones in most urgent need of attention—the so-called true positives, as epidemiologists say.

But before that can begin in earnest, the public has to decide how many false positives it can afford in order to head off future killers, and how many false negatives (seemingly nonviolent people who nevertheless go on to kill) it is willing to risk to narrow the false positive pool.
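That tradeoff can be made concrete with a toy calculation. The numbers below are invented for illustration (the article gives none); the point is that with a rare outcome, even a fairly accurate model flags far more innocent people than future killers:

```python
# Hypothetical screening of a probation caseload for rare future killers.
# All rates here are made up to illustrate the false-positive problem.
caseload = 100_000   # probationers scored by the model
base_rate = 0.001    # fraction who will actually go on to kill
sensitivity = 0.90   # fraction of future killers the model catches
specificity = 0.95   # fraction of non-killers the model correctly clears

killers = round(caseload * base_rate)                           # 100
true_positives = round(killers * sensitivity)                   # 90 caught
false_negatives = killers - true_positives                      # 10 missed
false_positives = round((caseload - killers) * (1 - specificity))

flagged = true_positives + false_positives
print(flagged, false_positives / flagged)  # most flagged cases are innocent
```

With these made-up rates, roughly 5,000 people get intensive supervision to catch 90 future killers, while 10 slip through. Deciding whether that ratio is acceptable is exactly the public-policy question above.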

Pretty scary stuff, as it gets into the realm of thoughtcrime.

Posted on December 1, 2006 at 7:34 AM

TSA Security Round-Up

Innocent passenger arrested for trying to bring a rubber-band ball onto an airplane.

Woman passes out on plane after her drugs are confiscated.

San Francisco International Airport screeners were warned in advance of undercover test.

And a cartoon.

We have a serious problem in this country. The TSA operates above, and outside, the law. There’s no due process, no judicial review, no appeal.

EDITED TO ADD (11/21): And six Muslim imams removed from a plane by US Airways because…well because they’re Muslim and that scares people. After they were cleared by the authorities, US Airways refused to sell them a ticket. Refuse to be terrorized, people!

Note that US Airways is the culprit here, not the TSA.

EDITED TO ADD (11/22): Frozen spaghetti sauce confiscated:

You think this is silly, and it is, but a week ago my mother caused a small commotion at a checkpoint at Boston-Logan after screeners discovered a large container of homemade tomato sauce in her bag. What with the preponderance of spaghetti grenades and lasagna bombs, we can all be proud of their vigilance. And, as a liquid, tomato sauce is in clear violation of the Transportation Security Administration’s carry-on statutes. But this time, there was a wrinkle: The sauce was frozen.

No longer in its liquid state, the sauce had the guards in a scramble. According to my mother’s account, a supervisor was called over to help assess the situation. He spent several moments stroking his chin. “He struck me as the type of person who spent most of his life traveling with the circus,” says Mom, who never pulls a punch, “and was only vaguely familiar with the concept of refrigeration.” Nonetheless, drawing from his experiences in grade-school chemistry and at the TSA academy, he sized things up. “It’s not a liquid right now,” he observantly noted. “But it will be soon.”

In the end, the TSA did the right thing and let the woman on with her frozen food.

Posted on November 21, 2006 at 12:51 PM

Airline Passenger Profiling for Profit

I have previously written and spoken about the privacy threats that come from the confluence of government and corporate interests. It’s not the deliberate police-state privacy invasions from governments that worry me, but the normal-business privacy invasions by corporations—and how corporate privacy invasions pave the way for government privacy invasions and vice versa.

The U.S. government’s airline passenger profiling system was called Secure Flight, and I’ve written about it extensively. At one point, the system was going to perform automatic background checks on all passengers based on both government and commercial databases—credit card databases, phone records, whatever—and assign everyone a “risk score” based on the data. Those with a higher risk score would be searched more thoroughly than those with a lower risk score. It’s a complete waste of time, and a huge invasion of privacy, and the last time I paid attention it had been scrapped.

But the very same system that is useless at picking terrorists out of passenger lists is probably very good at identifying consumers. So what the government rightly decided not to do, the start-up corporation Jetera is doing instead:

Jetera would start with an airline’s information on individual passengers on board a given flight, drawing the name, address, credit card number and loyalty club status from reservations data. Through a process, for which it seeks a patent, the company would match the passenger’s identification data with the mountains of information about him or her available at one of the mammoth credit bureaus, which maintain separately managed marketing as well as credit information. Jetera would tap into the marketing side, showing consumer demographics, purchases, interests, attitudes and the like.

Jetera’s data manipulation would shape the entertainment made available to each passenger during a flight. The passenger who subscribes to a do-it-yourself magazine might be offered a video on woodworking. Catalog purchase records would boost some offerings and downplay others. Sports fans, known through their subscriptions, credit card ticket-buying or booster club memberships, would get “The Natural” instead of “Pretty Woman.”

The article is dated August 21, 2006 and is subscriber-only. Most of it talks about the revenue potential of the model, the funding the company received, and the talks it has had with anonymous airlines. No airline has signed up for the service yet, which would not only include in-flight personalization but pre- and post-flight mailings and other personalized services. Privacy is dealt with at the end of the article:

Jetera sees two legal issues regarding privacy and resolves both in its favor. Nothing Jetera intends to do would violate federal law or airline privacy policies as expressed on their websites. In terms of customer perceptions, Jetera doesn’t intend to abuse anyone’s privacy and will have an “opt-out” opportunity at the point where passengers make inflight entertainment choices.

If an airline wants an opt-out feature at some other point in the process, Jetera will work to provide one, McChesney says. Privacy and customer service will be an issue for each airline, and Jetera will adapt specifically to each.

The U.S. government already collects data from the phone company, from hotels and rental-car companies, and from airlines. How long before it piggybacks onto this system?

The other side to this is in the news, too: commercial databases using government data:

Records once held only in paper form by law enforcement agencies, courts and corrections departments are now routinely digitized and sold in bulk to the private sector. Some commercial databases now contain more than 100 million criminal records. They are updated only fitfully, and expunged records now often turn up in criminal background checks ordered by employers and landlords.

Posted on October 24, 2006 at 11:00 AM

Security and Class

I don’t think I’ve ever read anyone talking about class issues as they relate to security before:

On July 23, 2003, New York City Council candidate Othniel Boaz Askew was able to shoot and kill council member and rival James Davis with a gun in school headquarters at City Hall, even though entrance to the building required a trip through a magnetometer. How? Askew used his politicians’ privilege—a courtesy wave around from security guards at the magnetometer.

An isolated incident? Hardly. In 2002, undercover investigators from Congress’ auditing arm, the General Accounting Office, used fake law enforcement credentials to get the free pass around the magnetometers at various federal office buildings around the country.

What we see here is class warfare on the security battleground. The reaction to Sept. 11 has led to harassment, busywork, and inconvenience for us all. Well, almost all. A select few who know the right people, hold the right office or own the right equipment don’t suffer the ordeals. They are waved around security checkpoints or given broad exceptions to security lockdowns.

If you want to know why America’s security is so heavy on busywork and inconvenience and light on practicality, consider this: The people who make the rules don’t have to live with them. Public officials, some law enforcement officers and those who can afford expensive hobbies are often able to pull rank.

Posted on October 19, 2006 at 12:25 PM

This Is What Vigilantism Looks Like

Another airplane passenger false alarm:

Seth Stein is used to jetting around the world to create stylish holiday homes for wealthy clients. This means the hip architect is familiar with the irritations of heightened airline security post-9/11. But not even he could have imagined being mistaken for an Islamist terrorist and physically pinned to his seat while aboard an American Airlines flight—especially as he has Jewish origins.

Turns out that one of the other passengers decided to take matters into his own hands.

In Mr Stein’s case, he was pounced on as the crew and other travellers looked on. The drama unfolded less than an hour into the flight. As he settled down with a book and a ginger ale, the father-of-three was grabbed from behind and held in a head-lock.

“This guy just told me his name was Michael Wilk, that he was with the New York Police Department, that I’d been acting suspiciously and should stay calm. I could barely find my voice and couldn’t believe it was happening,” said Mr Stein.

“He went into my pocket and took out my passport and my iPod. All the other passengers were looking concerned.” Eventually, cabin crew explained that the captain had run a security check on Mr Stein after being alerted by the policeman and that this had cleared him. The passenger had been asked to go back to his seat before he had restrained Mr Stein. When the plane arrived in New York, Mr Stein was met by apologetic police officers who offered to fast-track him out of the airport.

Even stranger:

In a twist to the story, Mr Stein has since discovered that there is only one Michael Wilk on the NYPD’s official register of officers, but the man retired 25 years ago. Officials have told the architect that his assailant may work for another law enforcement agency but have refused to say which one.

I’ve written about this kind of thing before.

EDITED TO ADD (10/3): Here’s a man booted off a plane for speaking Tamil into his cellphone.

Posted on October 3, 2006 at 12:42 PM

Behavioral Profiling Nabs Warren Jeffs

This is interesting:

A paper license tag, a salad and stories that didn’t make sense pricked the suspicions of a state trooper who stopped the car of a wanted fugitive polygamist in Las Vegas.

But it was the pumping carotid artery in the neck of Warren Steed Jeffs that convinced Nevada Highway Patrolman Eddie Dutchover that he had cornered someone big.

This is behavioral profiling done right, and it reminds me of the Diana Dean story. (Here’s another example of behavioral profiling done right, and here is an article by Malcolm Gladwell on profiling and generalizations.)

Behavioral profiling is tough to do well. It requires intelligent and well-trained officers. Done badly, it quickly defaults to racial profiling. But done well, it’ll do far more to keep us safe than object profiling (e.g., banning liquids on aircraft).

Posted on August 31, 2006 at 1:11 PM