Entries Tagged "law enforcement"


The Exclusionary Rule and Security

Earlier this month, the Supreme Court ruled that evidence gathered as a result of errors in a police database is admissible in court. Its narrow decision is wrong, and will only ensure that police databases remain error-filled in the future.

The specifics of the case are simple. A computer database said there was a felony arrest warrant pending for Bennie Herring when there actually wasn’t. When the police came to arrest him, they searched his home and found illegal drugs and a gun. The Supreme Court was asked to rule whether the police had the right to arrest him for possessing those items, even though there was no legal basis for the search and arrest in the first place.

What’s at issue here is the exclusionary rule, which basically says that unconstitutionally or illegally collected evidence is inadmissible in court. It might seem like a technicality, but excluding what is called “the fruit of the poisonous tree” is a security system designed to protect us all from police abuse.

We have a number of rules limiting what the police can do: rules governing arrest, search, interrogation, detention, prosecution, and so on. And one of the ways we ensure that the police follow these rules is by forbidding the police to receive any benefit from breaking them. In fact, we design the system so that the police actually harm their own interests by breaking them, because all evidence that stems from breaking the rules is inadmissible.

And that’s what the exclusionary rule does. If the police search your home without a warrant and find drugs, they can’t arrest you for possession. Since the police have better things to do than waste their time, they have an incentive to get a warrant.

The Herring case is more complicated, because the police thought they did have a warrant. The error was not a police error, but a database error. And, in fact, Chief Justice John Roberts wrote for the majority: “The exclusionary rule serves to deter deliberate, reckless, or grossly negligent conduct, or in some circumstances recurring or systemic negligence. The error in this case does not rise to that level.”

Unfortunately, Roberts is wrong. Government databases are filled with errors. People often can’t see data about themselves, and have no way to correct the errors if they do learn of any. And more and more databases are trying to exempt themselves from the Privacy Act of 1974, and specifically the provisions that require data accuracy. The legal argument for excluding this evidence was best made by an amicus curiae brief filed by the Electronic Privacy Information Center, but in short, the court should exclude the evidence because it’s the only way to ensure police database accuracy.

We are protected from becoming a police state by limits on police power and authority. This is not a trade-off we make lightly: we deliberately hamper law enforcement’s ability to do its job because we recognize that these limits make us safer. Without the exclusionary rule, your only remedy against an illegal search is to bring legal action against the police—and that can be very difficult. We, the people, would rather have you go free than motivate the police to ignore the rules that limit their power.

By not applying the exclusionary rule in the Herring case, the Supreme Court missed an important opportunity to motivate the police to purge errors from their databases. Constitutional lawyers have written many articles about this ruling, but the most interesting idea comes from George Washington University professor Daniel J. Solove, who proposes this compromise: “If a particular database has reasonable protections and deterrents against errors, then the Fourth Amendment exclusionary rule should not apply. If not, then the exclusionary rule should apply. Such a rule would create an incentive for law enforcement officials to maintain accurate databases, to avoid all errors, and would ensure that there would be a penalty or consequence for errors.”

Increasingly, we are being judged by the trail of data we leave behind us. Increasingly, data accuracy is vital to our personal safety and security. And if errors made by police databases aren’t held to the same legal standard as errors made by policemen, then more and more innocent Americans will find themselves the victims of incorrect data.

This essay originally appeared on the Wall Street Journal website.

EDITED TO ADD (2/1): More on the assault on the exclusionary rule.

EDITED TO ADD (2/9): Here’s another recent court case involving the exclusionary rule, and a thoughtful analysis by Orin Kerr.

Posted on January 28, 2009 at 7:12 AM

New Police Computer System Impeding Arrests

In Queensland, Australia, policemen are arresting fewer people because their new data-entry system is too annoying:

He said police were growing reluctant to make arrests following the latest phased roll-out of QPRIME, or Queensland Police Records Information Management Exchange.

“They are reluctant to make arrests and they’re showing a lot more discretion in the arrests they make because QPRIME is so convoluted to navigate,” Mr Leavers said. He said minor street offences, some traffic offences and minor property matters were going unchallenged, but not serious offences.

However, Mr Leavers said there had been occasions where offenders were released rather than kept in custody because of the length of time it now took to prepare court summaries.

“There was an occasion where two people were arrested on multiple charges. It took six detectives more than six hours to enter the details into QPRIME,” he said. “It would have taken even longer to do the summary to go to court the next morning, so basically the suspects were released on bail, rather than kept in custody.”

He said jobs could now take up to seven hours to process because of the amount of data entry involved.

This is a good example of how non-security incentives affect security decisions.

Posted on January 22, 2009 at 1:51 PM

Two Security Camera Studies

From San Francisco:

San Francisco’s Community Safety Camera Program was launched in late 2005 with the dual goals of fighting crime and providing police investigators with a retroactive investigatory tool. The program placed more than 70 non-monitored cameras in mainly high-crime areas throughout the city. This report released today (January 9, 2009) consists of a multi-disciplinary collaboration examining the program’s technical aspects, management and goals, and policy components, as well as a quasi-experimental statistical evaluation of crime reports in order to provide a comprehensive evaluation of the program’s effectiveness. The results find that while the program did result in a 20% reduction in property crime within the view of the cameras, other forms of crime were not affected, including violent crime, one of the primary targets of the program.

From the UK:

The first study of its kind into the effectiveness of surveillance cameras revealed that almost every Scotland Yard murder inquiry uses their footage as evidence.

In 90 murder cases over a one year period, CCTV was used in 86 investigations, and senior officers said it helped to solve 65 cases by capturing the murder itself on film, or tracking the movements of the suspects before or after an attack.

In a third of the cases a good quality still image was taken from the footage from which witnesses identified the killer.

My own writing on security cameras is here. The question isn’t whether they’re useful or not, but whether their benefits are worth the costs.

Posted on January 13, 2009 at 6:58 AM

Trends in Counterfeit Currency

It’s getting worse:

More counterfeiters are using today’s ink-jet printers, computers and copiers to make money that’s just good enough to pass, he said, even though their product is awful.

In the past, he said, the best American counterfeiters were skilled printers who used heavy offset presses to turn out decent 20s, 50s and 100s. Now that kind of work is rare and almost all comes from abroad.

[…]

Green pointed to a picture hanging in his downtown conference room. It’s a photo from a 1980s Lenexa case that involved heavy printing presses and about 2 million fake dollars.

“That’s what we used to see,” he boomed. “That’s the kind of case we used to make.”

Agents discovered then that someone had purchased such equipment and a special kind of paper and it all went to the Lenexa shop. Then the agents secretly went in there with a court order and planted a tiny video camera on a Playboy calendar.

They streamed video 24/7 for days, stormed in with guns drawn and sent bad guys to federal prison.

Green’s voice sank as he described today’s sad-sack counterfeiters.

These people call up pictures of bills on their computers, buy paper at an office supply store and print out a few bills. They cut the bills apart, go into a store or bar and pass one or two.

Many offenders are involved with drugs, he said, often methamphetamine. If they get caught, so little money is involved that federal prosecutors won’t take the case.

It’s interesting. Counterfeits are becoming easier to detect while people are becoming less skilled at detecting them:

Part of the problem, Green said, is that the government has changed the money so much to foil counterfeiting. With all the new bills out there, citizens and even many police officers don’t know what they’re supposed to look like.

Moreover, many people see paper money less because they use credit or debit cards.

The result: Ink-jet counterfeiting accounted for 60 percent of $103 million in fake money removed from circulation from October 2007 to August 2008, the Secret Service reports. In 1995, the figure was less than 1 percent.

Another article on the topic.

Posted on January 5, 2009 at 6:34 AM

DHS Reality Show

On ABC:

Every day the men and women of the Department of Homeland Security patrol more than 100,000 miles of America’s borders. This territory includes airports, seaports, land borders, international mail centers, the open seas, mountains, deserts and even cyberspace. Now viewers will get an unprecedented look at the work of these men and women while they use the newest technology to safeguard our country and enforce our laws, in “Homeland Security USA,” which debuts with the episode “This is Your Car on Drugs,” TUESDAY, JANUARY 6 (8:00-9:00 p.m., ET) on ABC.

Sure, it’s propaganda, but the agency can use the image boost.

Posted on December 23, 2008 at 1:10 PM

Arming New York City Police with Machine Guns

I have mixed feelings about this:

The NYPD wants all 1,000 Police Academy recruits trained to use M4 automatic machine guns – which are now carried only by the 400 cops in its elite Emergency Service Unit – in time for the holiday celebration in Times Square.

On the one hand, deploying these weapons seems like a bad idea. On the other hand, training is almost never a bad thing.

Oh, and in case you were worried:

There is no intelligence Times Square will be a target on New Year’s Eve. The area will be on high alert, but has been so for every year since the millennium.

Posted on December 16, 2008 at 3:43 PM

Audit

As the first digital president, Barack Obama is learning the hard way how difficult it can be to maintain privacy in the information age. Earlier this year, his passport file was snooped by contract workers in the State Department. In October, someone at Immigration and Customs Enforcement leaked information about his aunt’s immigration status. And in November, Verizon employees peeked at his cell phone records.

What these three incidents illustrate is not that computerized databases are vulnerable to hacking—we already knew that, and anyway the perpetrators all had legitimate access to the systems they used—but how important audit is as a security measure.

When we think about security, we commonly think about preventive measures: locks to keep burglars out of our homes, bank safes to keep thieves from our money, and airport screeners to keep guns and bombs off airplanes. We might also think of detection and response measures: alarms that go off when burglars pick our locks or dynamite open bank safes, sky marshals on airplanes who respond when a hijacker manages to sneak a gun through airport security. But audit, figuring out who did what after the fact, is often far more important than any of those other three.

Most security against crime comes from audit. Of course we use locks and alarms, but we don’t wear bulletproof vests. The police provide for our safety by investigating crimes after the fact and prosecuting the guilty: that’s audit.

Audit helps ensure that people don’t abuse positions of trust. The cash register, for example, is basically an audit system. Cashiers have to handle the store’s money. To ensure they don’t skim from the till, the cash register keeps an audit trail of every transaction. The store owner can look at the register totals at the end of the day and make sure the amount of money in the register is the amount that should be there.

The same idea secures us from police abuse, too. The police have enormous power, including the ability to intrude into very intimate aspects of our life in order to solve crimes and keep the peace. This is generally a good thing, but to ensure that the police don’t abuse this power, we put in place systems of audit like the warrant process.

The whole NSA warrantless eavesdropping scandal was about this. Some misleadingly painted it as allowing the government to eavesdrop on foreign terrorists, but the government always had that authority. What the government wanted was to not have to submit a warrant, even after the fact, to a secret FISA court. What they wanted was to not be subject to audit.

That would be an incredibly bad idea. Law enforcement systems that don’t have good audit features designed in, or are exempt from this sort of audit-based oversight, are much more prone to abuse by those in power—because they can abuse the system without the risk of getting caught. Audit is essential as the NSA increases its domestic spying. And large police databases, like the FBI Next Generation Identification System, need to have strong audit features built in.

For computerized database systems like that—systems entrusted with other people’s information—audit is a very important security mechanism. Hospitals need to keep databases of very personal health information, and doctors and nurses need to be able to access that information quickly and easily. A good audit record of who accessed what when is the best way to ensure that those trusted with our medical information don’t abuse that trust. It’s the same with IRS records, credit reports, police databases, telephone records – anything personal that someone might want to peek at during the course of his job.
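The “who accessed what when” record described above can be sketched in a few lines of code. This is a toy illustration, not any real system’s design: the class, record layout, and method names are all invented for the example.

```python
import datetime


class AuditedStore:
    """A toy record store that logs every read: who, what, when.

    A minimal sketch of audit-as-security. The point is that access
    is never silent: every lookup leaves an entry in an append-only
    trail that can be reviewed after the fact.
    """

    def __init__(self, records):
        self._records = dict(records)
        self.audit_log = []  # append-only trail of accesses

    def read(self, user: str, record_id: str):
        # Log the access *before* serving it, so even a failed
        # lookup leaves a trace.
        self.audit_log.append({
            "user": user,
            "record": record_id,
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })
        return self._records.get(record_id)

    def accesses_by(self, user: str):
        """After-the-fact review: everything one user looked at."""
        return [e for e in self.audit_log if e["user"] == user]
```

Note the design choice: the store itself writes the log entry, so the person doing the peeking cannot opt out of being recorded, which is exactly what makes audit useful against insiders.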

Which brings us back to President Obama. In each of those three examples, someone in a position of trust inappropriately accessed personal information. The difference between how they played out is due to differences in audit. The State Department’s audit worked best; they had alarm systems in place that alerted superiors when Obama’s passport files were accessed and who accessed them. Verizon’s audit mechanisms worked less well; they discovered the inappropriate account access and have narrowed the culprits down to a few people. Audit at Immigration and Customs Enforcement was far less effective; they still don’t know who accessed the information.

Large databases filled with personal information, whether managed by governments or corporations, are an essential aspect of the information age. And they each need to be accessed, for legitimate purposes, by thousands or tens of thousands of people. The only way to ensure those people don’t abuse the power they’re entrusted with is through audit. Without it, we will simply never know who’s peeking at what.

This essay first appeared on the Wall Street Journal website.

Posted on December 10, 2008 at 2:21 PM

BNP Database Leaked

This is a big deal.

British National Party (BNP, a far-right nationalist party) membership and contacts list. 12,801 individuals are represented. Contains contact details and notes on selected party members and (possibly) other individuals. The list has been independently verified by Wikileaks staff as predominantly containing current or ex-BNP members, however other individuals who have donated to the BNP or who have had other contact (not necessarily supportive) with the BNP or one of its fronts may also be represented.

Says BBC:

Occupations ascribed to the listed names include teachers, a doctor, nurse, vicar and members of the armed forces.

While there is no ban on many of those professions joining the BNP, its right-wing political stance and whites-only membership policy are seen by many as incompatible with frontline public service.

Police officers, on the other hand, are formally banned from joining, a policy which is recognised in the list.

Alongside the name of a serving officer, the document states that there is “Discretion required re. employment concerns”.

It seems that the BNP database wasn’t hacked from the outside, but that someone on the inside leaked the list.

There are many more leaked BNP documents on the Wikileaks website.

Posted on November 24, 2008 at 6:26 AM

U.S. Court Rules that Hashing = Searching

Really interesting post by Orin Kerr on whether, by taking hash values of someone’s hard drive, the police conducted a “search”:

District Court Holds that Running Hash Values on Computer Is A Search: The case is United States v. Crist, 2008 WL 4682806 (M.D.Pa. October 22 2008) (Kane, C.J.). It’s a child pornography case involving a warrantless search that raises a very interesting and important question of first impression: Is running a hash a Fourth Amendment search? (For background on what a “hash” is and why it matters, see here).

First, the facts. Crist is behind on his rent payments, and his landlord starts to evict him by hiring Sell to remove Crist’s belongings and throw them away. Sell comes across Crist’s computer, and he hands over the computer to his friend Hipple who he knows is looking for a computer. Hipple starts to look through the files, and he comes across child pornography: Hipple freaks out and calls the police. The police then conduct a warrantless forensic examination of the computer:

In the forensic examination, Agent Buckwash used the following procedure. First, Agent Buckwash created an “MD5 hash value” of Crist’s hard drive. An MD5 hash value is a unique alphanumeric representation of the data, a sort of “fingerprint” or “digital DNA.” When creating the hash value, Agent Buckwash used a “software write protect” in order to ensure that “nothing can be written to that hard drive.” Supp. Tr. 88. Next, he ran a virus scan, during which he identified three relatively innocuous viruses. After that, he created an “image,” or exact copy, of all the data on Crist’s hard drive.

Agent Buckwash then opened up the image (not the actual hard drive) in a software program called EnCase, which is the principal tool in the analysis. He explained that EnCase does not access the hard drive in the traditional manner, i.e., through the computer’s operating system. Rather, EnCase “reads the hard drive itself.” Supp. Tr. 102. In other words, it reads every file—bit by bit, cluster by cluster—and creates an index of the files contained on the hard drive. EnCase can, therefore, bypass user-defined passwords, “break down complex file structures for examination,” and recover “deleted” files as long as those files have not been written over. Supp. Tr. 102-03.

Once in EnCase, Agent Buckwash ran a “hash value and signature analysis on all of the files on the hard drive.” Supp. Tr. 89. In doing so, he was able to “fingerprint” each file in the computer. Once he generated hash values of the files, he compared those hash values to the hash values of files that are known or suspected to contain child pornography. Agent Buckwash discovered five videos containing known child pornography. Attachment 5. He discovered 171 videos containing suspected child pornography.
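The per-file hash comparison the agent describes can be sketched roughly as follows. This is an illustration, not EnCase’s actual implementation: the hash set and file names are hypothetical, and MD5 is used only because it is the algorithm named in the testimony (real forensic tools draw their known-file hashes from maintained databases).

```python
import hashlib
from pathlib import Path

# Hypothetical "known file" hash set, standing in for a real
# database of hashes of previously identified contraband files.
KNOWN_BAD_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",  # MD5 of the bytes b"hello"
}


def md5_of_file(path: Path) -> str:
    """Compute the MD5 digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def scan(root: Path) -> list[Path]:
    """Return every file under `root` whose MD5 matches a known hash."""
    hits = []
    for p in sorted(root.rglob("*")):
        if p.is_file() and md5_of_file(p) in KNOWN_BAD_HASHES:
            hits.append(p)
    return hits
```

The legal question the court wrestles with is visible in the code: `scan` never displays a file’s contents, yet to compute each hash it must read every byte on the drive, which is why the court treated the operation as a search.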

One of the interesting questions here is whether the search that resulted was within the scope of Hipple’s private search; different courts have approached this question differently. But for now the most interesting question is whether running the hash was a Fourth Amendment search. The Court concluded that it was, and that the evidence of child pornography discovered had to be suppressed:

The Government argues that no search occurred in running the EnCase program because the agents “didn’t look at any files, they simply accessed the computer.” 2d Supp. Tr. 16. The Court rejects this view and finds that the “running of hash values” is a search protected by the Fourth Amendment.

Computers are composed of many compartments, among them a “hard drive,” which in turn is composed of many “platters,” or disks. To derive the hash values of Crist’s computer, the Government physically removed the hard drive from the computer, created a duplicate image of the hard drive without physically invading it, and applied the EnCase program to each compartment, disk, file, folder, and bit. 2d Supp. Tr. 18-19. By subjecting the entire computer to a hash value analysis—every file, internet history, picture, and “buddy list” became available for Government review. Such examination constitutes a search.

I think this is generally a correct result: See my article Searches and Seizures in a Digital World, 119 Harv. L. Rev. 531 (2005), for the details. Still, given the lack of analysis here it’s somewhat hard to know what to make of the decision. Which stage was the search—the creation of the duplicate? The running of the hash? It’s not really clear. I don’t think it matters very much to this case, because the agent who got the positive hit on the hashes didn’t then get a warrant. Instead, he immediately switched over to the EnCase “gallery view” function to see the images, which seems to me to be undoubtedly a search. Still, it’s a really interesting question.

Posted on November 5, 2008 at 8:28 AM

Clever Counterterrorism Tactic

Used against the IRA:

One of the most interesting operations was the laundry mat [sic]. Having lost many troops and civilians to bombings, the Brits decided they needed to determine who was making the bombs and where they were being manufactured. One bright fellow recommended they operate a laundry and when asked “what the hell he was talking about,” he explained the plan and it was incorporated—to much success.

The plan was simple: Build a laundry and staff it with locals and a few of their own. The laundry would then send out “color coded” special discount tickets, to the effect of “get two loads for the price of one,” etc. The color coding was matched to specific streets and thus when someone brought in their laundry, it was easy to determine the general location from which a city map was coded.

While the laundry was indeed being washed, pressed and dry cleaned, it had one additional cycle—every garment, sheet, glove, pair of pants, was first sent through an analyzer, located in the basement, that checked for bomb-making residue. The analyzer was disguised as just another piece of the laundry equipment; good OPSEC [operational security]. Within a few weeks, multiple positives had shown up, indicating the ingredients of bomb residue, and intelligence had determined which areas of the city were involved. To narrow their target list, [the laundry] simply sent out more specific coupons [numbered] to all houses in the area, and before long they had good addresses. After confirming addresses, authorities with the SAS teams swooped down on the multiple homes and arrested multiple personnel and confiscated numerous assembled bombs, weapons and ingredients. During the entire operation, no one was injured or killed.

Posted on October 13, 2008 at 1:22 PM

