Entries Tagged "surveillance"


U.S. Court Rules that Hashing = Searching

Really interesting post by Orin Kerr on whether, by taking hash values of someone’s hard drive, the police conducted a “search”:

District Court Holds that Running Hash Values on Computer Is A Search: The case is United States v. Crist, 2008 WL 4682806 (M.D. Pa. October 22, 2008) (Kane, C.J.). It’s a child pornography case involving a warrantless search that raises a very interesting and important question of first impression: Is running a hash a Fourth Amendment search? (For background on what a “hash” is and why it matters, see here).

First, the facts. Crist is behind on his rent payments, and his landlord starts to evict him by hiring Sell to remove Crist’s belongings and throw them away. Sell comes across Crist’s computer, and he hands over the computer to his friend Hipple who he knows is looking for a computer. Hipple starts to look through the files, and he comes across child pornography: Hipple freaks out and calls the police. The police then conduct a warrantless forensic examination of the computer:

In the forensic examination, Agent Buckwash used the following procedure. First, Agent Buckwash created an “MD5 hash value” of Crist’s hard drive. An MD5 hash value is a unique alphanumeric representation of the data, a sort of “fingerprint” or “digital DNA.” When creating the hash value, Agent Buckwash used a “software write protect” in order to ensure that “nothing can be written to that hard drive.” Supp. Tr. 88. Next, he ran a virus scan, during which he identified three relatively innocuous viruses. After that, he created an “image,” or exact copy, of all the data on Crist’s hard drive.

Agent Buckwash then opened up the image (not the actual hard drive) in a software program called EnCase, which is the principal tool in the analysis. He explained that EnCase does not access the hard drive in the traditional manner, i.e., through the computer’s operating system. Rather, EnCase “reads the hard drive itself.” Supp. Tr. 102. In other words, it reads every file, bit by bit and cluster by cluster, and creates an index of the files contained on the hard drive. EnCase can, therefore, bypass user-defined passwords, “break down complex file structures for examination,” and recover “deleted” files as long as those files have not been written over. Supp. Tr. 102-03.

Once in EnCase, Agent Buckwash ran a “hash value and signature analysis on all of the files on the hard drive.” Supp. Tr. 89. In doing so, he was able to “fingerprint” each file in the computer. Once he generated hash values of the files, he compared those hash values to the hash values of files that are known or suspected to contain child pornography. Agent Buckwash discovered five videos containing known child pornography. Attachment 5. He discovered 171 videos containing suspected child pornography.

One of the interesting questions here is whether the search that resulted was within the scope of Hipple’s private search; different courts have approached this question differently. But for now the most interesting question is whether running the hash was a Fourth Amendment search. The Court concluded that it was, and that the evidence of child pornography discovered had to be suppressed:

The Government argues that no search occurred in running the EnCase program because the agents “didn’t look at any files, they simply accessed the computer.” 2d Supp. Tr. 16. The Court rejects this view and finds that the “running of hash values” is a search protected by the Fourth Amendment.

Computers are composed of many compartments, among them a “hard drive,” which in turn is composed of many “platters,” or disks. To derive the hash values of Crist’s computer, the Government physically removed the hard drive from the computer, created a duplicate image of the hard drive without physically invading it, and applied the EnCase program to each compartment, disk, file, folder, and bit. 2d Supp. Tr. 18-19. By subjecting the entire computer to a hash value analysis, every file, internet history, picture, and “buddy list” became available for Government review. Such examination constitutes a search.

I think this is generally a correct result: see my article Searches and Seizures in a Digital World, 119 Harv. L. Rev. 531 (2005), for the details. Still, given the lack of analysis here it’s somewhat hard to know what to make of the decision. Which stage was the search: the creating of the duplicate image? The running of the hash? It’s not really clear. I don’t think it matters very much to this case, because the agent who got the positive hit on the hashes didn’t then get a warrant. Instead, he immediately switched over to the EnCase “gallery view” function to see the images, which seems to me to be undoubtedly a search. Still, it’s a really interesting question.

Posted on November 5, 2008 at 8:28 AM

Remotely Eavesdropping on Keyboards

Clever work:

The researchers from the Security and Cryptography Laboratory at Ecole Polytechnique Federale de Lausanne are able to capture keystrokes by monitoring the electromagnetic radiation of PS/2, universal serial bus, or laptop keyboards. They’ve outlined four separate attack methods, some of which work at a distance of as much as 65 feet from the target.

In one video demonstration, researchers Martin Vuagnoux and Sylvain Pasini sniff out the keystrokes typed into a standard keyboard using a large antenna that’s about 20 to 30 feet away in an adjacent room.

Website here.

Posted on October 23, 2008 at 12:48 PM

Terrorist Fear Mongering Seems to be Working Less Well, Part II

Last week I wrote about a story that indicated that terrorist fear mongering is working less well. Here’s another story, this one from Canada: two pipeline bombings in Northern British Columbia:

Investigators are treating the explosions as acts of vandalism, not terrorism, Shields said.

“Under the Criminal Code, it would be characterized as mischief, which is an intentional vandalism. We don’t want to characterize this as terrorism. They were very isolated locations and there would seem there was no intent to hurt people,” he said.

It’s not all good, though. Here’s a story from Philadelphia, where a subway car is criticized because people can see out the front. Because, um, because terrorists will be able to see out the front, and we all know how dangerous terrorists are:

Marcus Ruef, a national vice president with the Brotherhood of Locomotive Engineers and Trainmen, compared a train cab to an airliner cockpit and said a cab should be similarly secure. He invoked post-9/11 security concerns as a reason to provide a full cab that prevents passengers from seeing the rails and signals ahead.

“We don’t think the forward view of the right-of-way should be available to whoever wants to watch … and the conductor and the engineer should be able to talk privately,” Ruef said.

Pat Nowakowski, SEPTA chief of operations, said the smaller cabs pose no security risk. “I have never heard that from a security expert,” he said.

At least there was pushback against that kind of idiocy.

And from the UK:

Transport Secretary Geoff Hoon has said the government is prepared to go “quite a long way” with civil liberties to “stop terrorists killing people”.

He was responding to criticism of plans for a database of mobile and web records, saying it was needed because terrorists used such communications.

By not monitoring this traffic, it would be “giving a licence to terrorists to kill people”, he said.

I hope there will be similar pushback against this “choice.”

EDITED TO ADD (11/13): Seems like the Philadelphia engineers have another agenda—the cabs in the new trains are too small—and they’re just using security as an excuse.

Posted on October 22, 2008 at 6:44 AM

NSA's Warrantless Eavesdropping Targets Innocent Americans

Remember when the U.S. government said it was only spying on terrorists? Anyone with any common sense knew it was lying—power without oversight is always abused—but even I didn’t think it was this bad:

Faulk says he and others in his section of the NSA facility at Fort Gordon routinely shared salacious or tantalizing phone calls that had been intercepted, alerting office mates to certain time codes of “cuts” that were available on each operator’s computer.

“Hey, check this out,” Faulk says he would be told, “there’s good phone sex or there’s some pillow talk, pull up this call, it’s really funny, go check it out. It would be some colonel making pillow talk and we would say, ‘Wow, this was crazy’,” Faulk told ABC News.

Warrants are a security device. They protect us against government abuse of power.

Posted on October 15, 2008 at 12:39 PM

Data Mining for Terrorists Doesn't Work

According to a massive report from the National Research Council, data mining for terrorists doesn’t work. Here’s a good summary:

The report was written by a committee whose members include William Perry, a professor at Stanford University; Charles Vest, the former president of MIT; W. Earl Boebert, a retired senior scientist at Sandia National Laboratories; Cynthia Dwork of Microsoft Research; R. Gil Kerlikowske, Seattle’s police chief; and Daryl Pregibon, a research scientist at Google.

They admit that far more Americans live their lives online, using everything from VoIP phones to Facebook to RFID tags in automobiles, than a decade ago, and the databases created by those activities are tempting targets for federal agencies. And they draw a distinction between subject-based data mining (starting with one individual and looking for connections) and pattern-based data mining (looking for anomalous activities that could show illegal activities).

But the authors conclude the type of data mining that government bureaucrats would like to do—perhaps inspired by watching too many episodes of the Fox series 24—can’t work. “If it were possible to automatically find the digital tracks of terrorists and automatically monitor only the communications of terrorists, public policy choices in this domain would be much simpler. But it is not possible to do so.”

A summary of the recommendations:

  • U.S. government agencies should be required to follow a systematic process to evaluate the effectiveness, lawfulness, and consistency with U.S. values of every information-based program, whether classified or unclassified, for detecting and countering terrorists before it can be deployed, and periodically thereafter.
  • Periodically after a program has been operationally deployed, and in particular before a program enters a new phase in its life cycle, policy makers should carefully review the program before allowing it to continue operations or to proceed to the next phase.
  • To protect the privacy of innocent people, the research and development of any information-based counterterrorism program should be conducted with synthetic population data… At all stages of a phased deployment, data about individuals should be rigorously subjected to the full safeguards of the framework.
  • Any information-based counterterrorism program of the U.S. government should be subjected to robust, independent oversight of the operations of that program, a part of which would entail a practice of using the same data mining technologies to “mine the miners and track the trackers.”
  • Counterterrorism programs should provide meaningful redress to any individuals inappropriately harmed by their operation.
  • The U.S. government should periodically review the nation’s laws, policies, and procedures that protect individuals’ private information for relevance and effectiveness in light of changing technologies and circumstances. In particular, Congress should re-examine existing law to consider how privacy should be protected in the context of information-based programs (e.g., data mining) for counterterrorism.

Here are more news articles on the report. I explained why data mining wouldn’t find terrorists back in 2005.
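The core of that 2005 argument is base-rate arithmetic: when the thing you are searching for is vanishingly rare, even an implausibly accurate detector buries its real hits under false alarms. The numbers below are purely illustrative (I am assuming a detection rate and error rate far better than any real system achieves):

```python
# Hypothetical numbers: a very generous detector still drowns in
# false positives when the base rate is tiny.
population = 300_000_000          # people under surveillance
terrorists = 1_000                # assumed actual terrorists
true_positive_rate = 0.99         # assumed detection rate (optimistic)
false_positive_rate = 0.001       # assumed error rate (very optimistic)

hits_real = terrorists * true_positive_rate
hits_false = (population - terrorists) * false_positive_rate

# Fraction of alarms that point at an actual terrorist.
precision = hits_real / (hits_real + hits_false)
print(f"{hits_false:,.0f} innocent people flagged; "
      f"{precision:.2%} of alarms are real")
```

Even with these generous assumptions, roughly 300,000 innocent people get flagged and only about one alarm in 300 is real. Real-world error rates are orders of magnitude worse.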

EDITED TO ADD (10/10): More commentary:

As the NRC report points out, not only is the training data lacking, but the input data that you’d actually be mining has been purposely corrupted by the terrorists themselves. Terrorist plotters actively disguise their activities using operational security measures (opsec) like code words, encryption, and other forms of covert communication. So, even if we had access to a copious and pristine body of training data that we could use to generalize about the “typical terrorist,” the new data that’s coming into the data mining system is suspect.

To return to the credit reporting analogy, credit scores would be worthless to lenders if everyone could manipulate their credit history (e.g., hide past delinquencies) the way that terrorists can manipulate the data trails that they leave as they buy gas, enter buildings, make phone calls, surf the Internet, etc.

So this application of data mining bumps up against the classic GIGO (garbage in, garbage out) problem in computing, with the terrorists deliberately feeding the system garbage. What this means in real-world terms is that the success of our counter-terrorism data mining efforts is completely dependent on the failure of terrorist cells to maintain operational security.

The GIGO problem and the lack of suitable training data combine to make big investments in automated terrorist identification a futile and wasteful effort. Furthermore, these two problems are structural, so they’re not going away. All legitimate concerns about false positives and corrosive effects on civil liberties aside, data mining will never give authorities the ability to identify terrorists or terrorist networks with any degree of confidence.

Posted on October 10, 2008 at 6:35 AM

$20M Cameras at New York's Freedom Tower are Pretty Sophisticated

They’re trying to detect anomalies:

If you have ever wondered how security guards can possibly keep an unfailingly vigilant watch on every single one of dozens of television monitors, each depicting a different scene, the answer seems to be (as you suspected): they can’t.

Instead, they can now rely on computers to constantly analyze the patterns, sizes, speeds, angles and motion picked up by the camera and determine—based on how they have been programmed—whether this constitutes a possible threat. In which case, the computer alerts the security guard whose own eyes may have been momentarily diverted. Or shut.

An alarm can be raised, for instance, if the computer discerns a vehicle that has been standing still for too long (say, a van in the drop-off lane of an airport terminal) or a person who is loitering while everyone else is in motion. By the same token, it will spot the individual who is moving rapidly while everyone else is shuffling along. It can spot a package that has been left behind and identify which figure in the crowd abandoned it. Or pinpoint the individual who is moving the wrong way down a one-way corridor.

Because one person’s “abnormal situation” is another person’s “hot dog vendor attracting a small crowd,” the computers can be programmed to discern between times of the day and days of the week.
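The “standing still too long” rule is the simplest of these to sketch. The version below is a toy: the data format, thresholds, and function names are hypothetical, and a real system would be working on object tracks extracted from video rather than clean coordinate lists.

```python
from dataclasses import dataclass

@dataclass
class Track:
    object_id: str
    positions: list   # (timestamp_s, x, y) samples from the tracker

def flag_loiterers(tracks, max_dwell_s=120.0, radius=5.0):
    """Flag any tracked object that stays within `radius` of its
    first observed position for longer than `max_dwell_s` seconds:
    the 'van standing still too long' rule, with made-up thresholds."""
    flagged = []
    for t in tracks:
        t0, x0, y0 = t.positions[0]
        for ts, x, y in t.positions[1:]:
            if ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 > radius:
                break                    # it moved away: not loitering
            if ts - t0 > max_dwell_s:
                flagged.append(t.object_id)
                break
    return flagged
```

The hard part, as the article notes, is that the thresholds themselves are context-dependent: the same dwell time is normal at a hot dog cart and anomalous in an airport drop-off lane, which is why these systems are programmed per time of day and location.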

Certainly interesting.

Posted on September 25, 2008 at 6:32 AM

NSA Snooping on Cell Phone Calls

From CNet:

A recent article in the London Review of Books revealed that a number of private companies now sell off-the-shelf data-mining solutions to government spies interested in analyzing mobile-phone calling records and real-time location information. These companies include ThorpeGlen, VASTech, Kommlabs, and Aqsacom—all of which sell “passive probing” data-mining services to governments around the world.

ThorpeGlen, a U.K.-based firm, offers intelligence analysts a graphical interface to the company’s mobile-phone location and call-record data-mining software. Want to determine a suspect’s “community of interest”? Easy. Want to learn if a single person is swapping SIM cards or throwing away phones (yet still hanging out in the same physical location)? No problem.

In a Web demo (PDF) (mirrored here) to potential customers back in May, ThorpeGlen’s vice president of global sales showed off the company’s tools by mining a dataset of a single week’s worth of call data from 50 million users in Indonesia, which it has crunched in order to try and discover small anti-social groups that only call each other.
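Finding “small anti-social groups that only call each other” is, at bottom, connected-component analysis on a call graph. Here is a toy sketch of that idea (the function and the size threshold are mine, not ThorpeGlen’s): build an undirected graph from caller/callee pairs, then keep the small components, i.e., clusters of people whose observed calls all stay inside the cluster.

```python
from collections import defaultdict

def isolated_groups(calls, max_size=10):
    """Given (caller, callee) pairs, return the small connected
    components of the call graph: sets of people who, over the
    observed window, only called each other."""
    graph = defaultdict(set)
    for a, b in calls:
        graph[a].add(b)
        graph[b].add(a)
    seen, groups = set(), []
    for start in list(graph):
        if start in seen:
            continue
        stack, comp = [start], set()     # depth-first flood fill
        while stack:
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(graph[node] - comp)
        seen |= comp
        if len(comp) <= max_size:        # small, self-contained groups
            groups.append(comp)
    return groups
```

With 50 million users and a week of call records, the engineering challenge is scale, not the algorithm; the civil-liberties problem is that everyone’s call records go in, not just suspects’.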

Posted on September 17, 2008 at 12:49 PM

