Entries Tagged "child pornography"


U.S. Court Rules that Hashing = Searching

Really interesting post by Orin Kerr on whether, by taking hash values of someone’s hard drive, the police conducted a “search”:

District Court Holds that Running Hash Values on Computer Is A Search: The case is United States v. Crist, 2008 WL 4682806 (M.D.Pa. October 22 2008) (Kane, C.J.). It’s a child pornography case involving a warrantless search that raises a very interesting and important question of first impression: Is running a hash a Fourth Amendment search? (For background on what a “hash” is and why it matters, see here).

First, the facts. Crist is behind on his rent payments, and his landlord starts to evict him by hiring Sell to remove Crist’s belongings and throw them away. Sell comes across Crist’s computer, and he hands over the computer to his friend Hipple who he knows is looking for a computer. Hipple starts to look through the files, and he comes across child pornography: Hipple freaks out and calls the police. The police then conduct a warrantless forensic examination of the computer:

In the forensic examination, Agent Buckwash used the following procedure. First, Agent Buckwash created an “MD5 hash value” of Crist’s hard drive. An MD5 hash value is a unique alphanumeric representation of the data, a sort of “fingerprint” or “digital DNA.” When creating the hash value, Agent Buckwash used a “software write protect” in order to ensure that “nothing can be written to that hard drive.” Supp. Tr. 88. Next, he ran a virus scan, during which he identified three relatively innocuous viruses. After that, he created an “image,” or exact copy, of all the data on Crist’s hard drive.
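The chunked hashing step the agent performed can be sketched in a few lines. This is a minimal illustration, not EnCase's actual implementation: it computes the MD5 digest of a disk-image file read in chunks, so an image larger than memory can still be fingerprinted. The function name and chunk size are illustrative.

```python
import hashlib

def md5_of_image(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 digest of a disk image, reading it in
    chunks so the whole image never has to fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Read until EOF; each chunk updates the running digest.
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()
```

Two drives with identical contents produce identical digests, which is why examiners hash both the original and the copy: matching values show the image is a faithful duplicate.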

Agent Buckwash then opened up the image (not the actual hard drive) in a software program called EnCase, which is the principal tool in the analysis. He explained that EnCase does not access the hard drive in the traditional manner, i.e., through the computer’s operating system. Rather, EnCase “reads the hard drive itself.” Supp. Tr. 102. In other words, it reads every file, bit by bit and cluster by cluster, and creates an index of the files contained on the hard drive. EnCase can, therefore, bypass user-defined passwords, “break down complex file structures for examination,” and recover “deleted” files as long as those files have not been written over. Supp. Tr. 102-03.

Once in EnCase, Agent Buckwash ran a “hash value and signature analysis on all of the files on the hard drive.” Supp. Tr. 89. In doing so, he was able to “fingerprint” each file in the computer. Once he generated hash values of the files, he compared those hash values to the hash values of files that are known or suspected to contain child pornography. Agent Buckwash discovered five videos containing known child pornography. Attachment 5. He discovered 171 videos containing suspected child pornography.
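The per-file comparison works by hashing every file and looking each digest up in a database of known-bad hashes. Here is a minimal sketch under stated assumptions: the known-hash set, the function name, and the use of MD5 of an in-memory read are all illustrative (the set below contains the MD5 of the string "hello", not real contraband hashes; real tools stream large files rather than reading them whole).

```python
import hashlib
from pathlib import Path

# Hypothetical known-hash set, for illustration only.
# Investigators compare against curated databases of such digests.
KNOWN_HASHES = {"5d41402abc4b2a76b9719d911017c592"}

def flag_known_files(root: str) -> list:
    """Hash every regular file under `root` and return the paths
    whose MD5 digest appears in the known-hash set."""
    hits = []
    for p in sorted(Path(root).rglob("*")):
        if p.is_file():
            digest = hashlib.md5(p.read_bytes()).hexdigest()
            if digest in KNOWN_HASHES:
                hits.append(str(p))
    return hits
```

The point the court wrestles with is that this loop touches every file on the drive, even though the examiner never "views" any of them in the ordinary sense.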

One of the interesting questions here is whether the search that resulted was within the scope of Hipple’s private search; different courts have approached this question differently. But for now the most interesting question is whether running the hash was a Fourth Amendment search. The Court concluded that it was, and that the evidence of child pornography discovered had to be suppressed:

The Government argues that no search occurred in running the EnCase program because the agents “didn’t look at any files, they simply accessed the computer.” 2d Supp. Tr. 16. The Court rejects this view and finds that the “running of hash values” is a search protected by the Fourth Amendment.

Computers are composed of many compartments, among them a “hard drive,” which in turn is composed of many “platters,” or disks. To derive the hash values of Crist’s computer, the Government physically removed the hard drive from the computer, created a duplicate image of the hard drive without physically invading it, and applied the EnCase program to each compartment, disk, file, folder, and bit. 2d Supp. Tr. 18-19. By subjecting the entire computer to a hash value analysis, every file, internet history, picture, and “buddy list” became available for Government review. Such examination constitutes a search.

I think this is generally a correct result: See my article Searches and Seizures in a Digital World, 119 Harv. L. Rev. 531 (2005), for the details. Still, given the lack of analysis here it’s somewhat hard to know what to make of the decision. Which stage was the search: the creation of the duplicate, or the running of the hash? It’s not really clear. I don’t think it matters very much to this case, because the agent who got the positive hit on the hashes didn’t then get a warrant. Instead, he immediately switched over to the EnCase “gallery view” function to see the images, which seems to me to be undoubtedly a search. Still, it’s a really interesting question.

Posted on November 5, 2008 at 8:28 AM

Terrorists and Child Porn, Oh My!

It’s the ultimate movie-plot threat: terrorists using child porn:

It is thought Islamist extremists are concealing messages in digital images and audio, video or other files.

Police are now investigating the link between terrorists and paedophilia in an attempt to unravel the system.

It could lead to the training of child welfare experts to identify signs of terrorist involvement as they monitor pornographic websites.

Of course, terrorists and strangers preying on our children are two of the things that cause the most fear in people. Put them together, and there’s no limit to what sorts of laws you can get passed.

EDITED TO ADD (10/22): Best comment:

Why would terrorists hide incriminating messages inside incriminating photographs? That would be like drug smugglers hiding kilos of cocaine in bales of marijuana.

Posted on October 22, 2008 at 12:57 PM

Web Entrapment

Frightening sting operation by the FBI. They posted links to supposed child porn videos on boards frequented by those types, and obtained search warrants based on access attempts.

This seems like incredibly flimsy evidence. Someone could post the link as an embedded image, or send out e-mail with the link embedded, and completely mess with the FBI’s data — and the poor innocents’ lives. Such are the problems when the mere clicking on a link is justification for a warrant.

See also this Slashdot thread and this article.

Posted on March 27, 2008 at 2:46 PM

Third Party Consent and Computer Searches

U.S. courts are weighing in with opinions:

When Ray Andrus’ 91-year-old father gave federal agents permission to search his son’s password-protected computer files and they found child pornography, the case turned a spotlight on how appellate courts grapple with third-party consents to search computers.

[…]

The case was a first for the 10th U.S. Circuit Court of Appeals, and only two other circuits have touched on the issue, the 4th and 6th circuits. The 10th Circuit held that although password-protected computers command a high level of privacy, the legitimacy of a search turns on an officer’s belief that the third party had authority to consent.

The 10th Circuit’s recent 2-1 decision in U.S. v. Andrus, No. 06-3094 (April 25, 2007), recognized for the first time that a password-protected computer is like a locked suitcase or a padlocked footlocker in a bedroom. The digital locks raise the expectation of privacy by the owner. The majority nonetheless refused to suppress the evidence.

Excellent commentary from Jennifer Granick:

The Fourth Amendment generally prohibits warrantless searches of an individual’s home or possessions. There is an exception to the warrant requirement when someone consents to the search. Consent can be given by the person under investigation, or by a third party with control over or mutual access to the property being searched. Because the Fourth Amendment only prohibits “unreasonable searches and seizures,” permission given by a third party who lacks the authority to consent will nevertheless legitimize a warrantless search if the consenter has “apparent authority,” meaning that the police reasonably believed that the person had actual authority to control or use the property.

Under existing case law, only people with a key to a locked closet have apparent authority to consent to a search of that closet. Similarly, only people with the password to a locked computer have apparent authority to consent to a search of that device. In Andrus, the father did not have the password (or know how to use the computer) but the police say they did not have any reason to suspect this because they did not ask and did not turn the computer on. Then, they used forensic software that automatically bypassed any installed password.

The majority held that the police officers not only weren’t obliged to ask whether the father used the computer, they had no obligation to check for a password before performing their forensic search. In dissent, Judge Monroe G. McKay criticized the agents’ intentional blindness to the existence of password protection, when physical or digital locks are such a fundamental part of ascertaining whether a consenting person has actual or apparent authority to permit a police search. “(T)he unconstrained ability of law enforcement to use forensic software such as the EnCase program to bypass password protection without first determining whether such passwords have been enabled … dangerously sidestep(s) the Fourth Amendment.”

[…]

If courts are going to treat computers as containers, and if owners must lock containers in order to keep them private from warrantless searches, then police should be required to look for those locks. Password-protected computers and locked containers are an inexact analogy, but if that is how courts are going to do it, then it’s inappropriate to diminish protections for computers simply because law enforcement chooses to use software that turns a blind eye to owners’ passwords.

Posted on June 5, 2007 at 6:43 AM

Top 10 Internet Crimes of 2006

According to the Internet Crime Complaint Center and reported in U.S. News and World Report, auction fraud and non-delivery of items purchased are far and away the most common Internet crimes. Identity theft is way down near the bottom.

Although the number of complaints last year, 207,492, fell by 10 percent, the overall losses hit a record $198 million. By far the most reported crime: Internet auction fraud, garnering 45 percent of all complaints. Also big was nondelivery of merchandise or payment, which notched second at 19 percent. The biggest money losers: those omnipresent Nigerian scam letters, which fleeced victims on average of $5,100, followed by check fraud at $3,744 and investment fraud at $2,694.

[…]

The feds caution that these figures don’t represent a scientific sample of just how much Net crime is out there. They note, for example, that the high number of auction fraud complaints is due, in part, to eBay and other big E-commerce outfits offering customers direct links to the IC3 website. And it’s tough to measure what may be the Web’s biggest scourge, child porn, simply by complaints. Still, the survey is a useful snapshot, even if it tells us what we already know: that the Internet, like the rest of life, is full of bad guys. Caveat emptor.

Posted on April 24, 2007 at 12:25 PM

Click Fraud and the Problem of Authenticating People

Google’s $6 billion-a-year advertising business is at risk because it can’t be sure that anyone is looking at its ads. The problem is called click fraud, and it comes in two basic flavors.

With network click fraud, you host Google AdSense advertisements on your own website. Google pays you every time someone clicks on its ad on your site. It’s fraud if you sit at the computer and repeatedly click on the ad or — better yet — write a computer program that repeatedly clicks on the ad. That kind of fraud is easy for Google to spot, so the clever network click fraudsters simulate different IP addresses, or install Trojan horses on other people’s computers to generate the fake clicks.
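The "easy to spot" case can be sketched with a simple frequency count. This is a naive illustration of the idea, not Google's detection pipeline; the threshold, function name, and log format are all assumptions. It flags any IP address that clicks the same ad an implausible number of times, which is exactly why real fraudsters rotate IP addresses or use botnets.

```python
from collections import Counter

def suspicious_ips(click_log, threshold=10):
    """Return the set of IPs that clicked the same ad at least
    `threshold` times. click_log is a list of (ip, ad_id) pairs;
    the threshold is illustrative, not a real detection rule."""
    counts = Counter(click_log)  # counts each (ip, ad_id) pair
    return {ip for (ip, ad), n in counts.items() if n >= threshold}
```

A detector this simple is defeated by the IP-spoofing and Trojan-horse techniques described above, which is what turns click fraud into an arms race.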

The other kind of click fraud is competitive. You notice your business competitor has bought an ad on Google, paying Google for each click. So you use the above techniques to repeatedly click on his ads, forcing him to spend money — sometimes a lot of money — on nothing. (Here’s a company that will commit click fraud for you.)

Click fraud has become a classic security arms race. Google improves its fraud-detection tools, so the fraudsters get increasingly clever … and the cycle continues. Meanwhile, Google is facing multiple lawsuits from those who claim the company isn’t doing enough. My guess is that everyone is right: It’s in Google’s interest both to solve and to downplay the importance of the problem.

But the overarching problem is both hard to solve and important: How do you tell if there’s an actual person sitting in front of a computer screen? How do you tell that the person is paying attention, hasn’t automated his responses, and isn’t being assisted by friends? Authentication systems are big business, whether based on something you know (passwords), something you have (tokens) or something you are (biometrics). But none of those systems can secure you against someone who walks away and lets another person sit down at the keyboard, or a computer that’s infected with a Trojan.

This problem manifests itself in other areas as well.

For years, online computer game companies have been battling players who use computer programs to assist their play: programs that allow them to shoot perfectly or see information they normally couldn’t see.

Playing is less fun if everyone else is computer-assisted, but unless there’s a cash prize on the line, the stakes are small. Not so with online poker sites, where computer-assisted players — or even computers playing without a real person at all — have the potential to drive all the human players away from the game.

Look around the internet, and you see this problem pop up again and again. The whole point of CAPTCHAs is to ensure that it’s a real person visiting a website, not just a bot on a computer. Standard testing doesn’t work online, because the tester can’t be sure that the test taker doesn’t have his book open, or a friend standing over his shoulder helping him. The solution in both cases is a proctor, of course, but that’s not always practical and obviates the benefits of internet testing.

This problem has even come up in court cases. In one instance, the prosecution demonstrated that the defendant’s computer committed some hacking offense, but the defense argued that it wasn’t the defendant who did it — that someone else was controlling his computer. And in another case, a defendant charged with a child porn offense argued that, while it was true that illegal material was on his computer, his computer was in a common room of his house and he hosted a lot of parties — and it wasn’t him who’d downloaded the porn.

Years ago, talking about security, I complained about the link between computer and chair. The easy part is securing digital information: on the desktop computer, in transit from computer to computer or on massive servers. The hard part is securing information from the computer to the person. Likewise, authenticating a computer is much easier than authenticating a person sitting in front of the computer. And verifying the integrity of data is much easier than verifying the integrity of the person looking at it — in both senses of that word.

And it’s a problem that will get worse as computers get better at imitating people.

Google is testing a new advertising model to deal with click fraud: cost-per-action ads. Advertisers don’t pay unless the customer performs a certain action: buys a product, fills out a survey, whatever. It’s a hard model to make work — Google would become more of a partner in the final sale instead of an indifferent displayer of advertising — but it’s the right security response to click fraud: Change the rules of the game so that click fraud doesn’t matter.

That’s how to solve a security problem.

This essay appeared on Wired.com.

EDITED TO ADD (7/13): Click Monkeys is a hoax site.

EDITED TO ADD (7/25): An evaluation of Google’s anti-click-fraud efforts, as part of the Lane Gifts case. I’m not sure if this expert report was done for Google, for Lane Gifts, or for the judge.

Posted on July 13, 2006 at 5:22 AM
