Entries Tagged "laws"


Anti-Terror Law Mission Creep in the U.K.

First terrorists, then trash cans:

More than half of town halls admit using anti-terror laws to spy on families suspected of putting their rubbish out on the wrong day.

Their tactics include putting secret cameras in tin cans, on lamp posts and even in the homes of ‘friendly’ residents.

The local authorities admitted that one of their main aims was to catch householders who put their bins out early.

EDITED TO ADD (11/13): A better article on the subject.

Posted on November 7, 2008 at 8:18 AM

The Ill Effects of Banning Security Research

The Indian police are having trouble with SIM card cloning:

Police had no idea that one SIM card could be used simultaneously from two handsets before the detention of Nazir Ahmed for interrogation. Nazir was picked up from Morigaon after an SMS from his mobile number in the name of ISF-IM claimed responsibility for Thursday’s blasts in Assam.

Nazir had a Reliance connection and an Eve handset. Each handset of this particular model has a unique International Mobile Equipment Identity (IMEI) number. Cops found that two IMEI numbers were using the same SIM. Accordingly there were two record sheets of calls and SMSes from Nazir’s mobile number. The record of the SMS to the media was found in only one sheet, which forced police to believe that Nazir’s SIM might have been cloned and someone else was using the duplicate card, with or without the owner’s knowledge.

“We stumbled upon this technological surprise that Nazir Ahmed’s SIM card was used in two handsets,” Assam IG (Law and Order) Bhaskarjyoti Mahanta said.

So far, not that interesting. There are lots of vulnerabilities in technological systems, and it’s generally a race between the good guys and the bad guys to see who finds them first. It’s the last sentence of this article that’s significant:

The experts said no one has actually done any research on SIM card cloning because the activity is illegal in the country.

If the good guys can’t even participate, the bad guys will always win.

Posted on November 6, 2008 at 6:26 AM

U.S. Court Rules that Hashing = Searching

Really interesting post by Orin Kerr on whether, by taking hash values of someone’s hard drive, the police conducted a “search”:

District Court Holds that Running Hash Values on Computer Is A Search: The case is United States v. Crist, 2008 WL 4682806 (M.D.Pa. October 22 2008) (Kane, C.J.). It’s a child pornography case involving a warrantless search that raises a very interesting and important question of first impression: Is running a hash a Fourth Amendment search? (For background on what a “hash” is and why it matters, see here).

First, the facts. Crist is behind on his rent payments, and his landlord starts to evict him by hiring Sell to remove Crist’s belongings and throw them away. Sell comes across Crist’s computer, and he hands over the computer to his friend Hipple who he knows is looking for a computer. Hipple starts to look through the files, and he comes across child pornography: Hipple freaks out and calls the police. The police then conduct a warrantless forensic examination of the computer:

In the forensic examination, Agent Buckwash used the following procedure. First, Agent Buckwash created an “MD5 hash value” of Crist’s hard drive. An MD5 hash value is a unique alphanumeric representation of the data, a sort of “fingerprint” or “digital DNA.” When creating the hash value, Agent Buckwash used a “software write protect” in order to ensure that “nothing can be written to that hard drive.” Supp. Tr. 88. Next, he ran a virus scan, during which he identified three relatively innocuous viruses. After that, he created an “image,” or exact copy, of all the data on Crist’s hard drive.

Agent Buckwash then opened up the image (not the actual hard drive) in a software program called EnCase, which is the principal tool in the analysis. He explained that EnCase does not access the hard drive in the traditional manner, i.e., through the computer’s operating system. Rather, EnCase “reads the hard drive itself.” Supp. Tr. 102. In other words, it reads every file, bit by bit, cluster by cluster, and creates an index of the files contained on the hard drive. EnCase can, therefore, bypass user-defined passwords, “break down complex file structures for examination,” and recover “deleted” files as long as those files have not been written over. Supp. Tr. 102-03.

Once in EnCase, Agent Buckwash ran a “hash value and signature analysis on all of the files on the hard drive.” Supp. Tr. 89. In doing so, he was able to “fingerprint” each file in the computer. Once he generated hash values of the files, he compared those hash values to the hash values of files that are known or suspected to contain child pornography. Agent Buckwash discovered five videos containing known child pornography. Attachment 5. He discovered 171 videos containing suspected child pornography.

One of the interesting questions here is whether the search that resulted was within the scope of Hipple’s private search; different courts have approached this question differently. But for now the most interesting question is whether running the hash was a Fourth Amendment search. The Court concluded that it was, and that the evidence of child pornography discovered had to be suppressed:

The Government argues that no search occurred in running the EnCase program because the agents “didn’t look at any files, they simply accessed the computer.” 2d Supp. Tr. 16. The Court rejects this view and finds that the “running of hash values” is a search protected by the Fourth Amendment.

Computers are composed of many compartments, among them a “hard drive,” which in turn is composed of many “platters,” or disks. To derive the hash values of Crist’s computer, the Government physically removed the hard drive from the computer, created a duplicate image of the hard drive without physically invading it, and applied the EnCase program to each compartment, disk, file, folder, and bit. 2d Supp. Tr. 18-19. By subjecting the entire computer to a hash value analysis, every file, internet history, picture, and “buddy list” became available for Government review. Such examination constitutes a search.

I think this is generally a correct result: See my article Searches and Seizures in a Digital World, 119 Harv. L. Rev. 531 (2005), for the details. Still, given the lack of analysis here it’s somewhat hard to know what to make of the decision. Which stage was the search: the creating of the duplicate? The running of the hash? It’s not really clear. I don’t think it matters very much to this case, because the agent who got the positive hit on the hashes didn’t then get a warrant. Instead, he immediately switched over to the EnCase “gallery view” function to see the images, which seems to me to be undoubtedly a search. Still, it’s a really interesting question.
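EnCase’s internals are proprietary, but the core technique the court is describing — computing a cryptographic digest of every file and checking it against a database of known-contraband hashes — is simple to sketch. Here is a rough illustration in Python; the hash set and file paths are hypothetical, and real forensic hash sets (such as those investigators maintain) would contain thousands of entries:

```python
import hashlib
import os

# Hypothetical set of MD5 digests of known-bad files. In a real
# investigation this would be a large database maintained by
# law enforcement; the single entry here is just for illustration.
KNOWN_BAD_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",
}

def md5_of_file(path, chunk_size=65536):
    """Compute the MD5 digest of a file, reading in chunks so that
    large evidence files need not fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_tree(root):
    """Walk a directory tree and flag every file whose digest matches
    the known-bad set -- the 'hash value analysis' step."""
    matches = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if md5_of_file(path) in KNOWN_BAD_HASHES:
                matches.append(path)
    return matches
```

Note that the examiner never has to open or view a flagged file to get a match, which is exactly why the question of whether this counts as a “search” is interesting.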

Posted on November 5, 2008 at 8:28 AM

Terrorist Fear Mongering Seems to be Working Less Well, Part II

Last week I wrote about a story that indicated that terrorist fear mongering is working less well. Here’s another story, this one from Canada: two pipeline bombings in Northern British Columbia:

Investigators are treating the explosions as acts of vandalism, not terrorism, Shields said.

“Under the Criminal Code, it would be characterized as mischief, which is an intentional vandalism. We don’t want to characterize this as terrorism. They were very isolated locations and there would seem there was no intent to hurt people,” he said.

It’s not all good, though. Here’s a story from Philadelphia, where a subway car is criticized because people can see out the front. Because, um, because terrorists will be able to see out the front, and we all know how dangerous terrorists are:

Marcus Ruef, a national vice president with the Brotherhood of Locomotive Engineers and Trainmen, compared a train cab to an airliner cockpit and said a cab should be similarly secure. He invoked post-9/11 security concerns as a reason to provide a full cab that prevents passengers from seeing the rails and signals ahead.

“We don’t think the forward view of the right-of-way should be available to whoever wants to watch … and the conductor and the engineer should be able to talk privately,” Ruef said.

Pat Nowakowski, SEPTA chief of operations, said the smaller cabs pose no security risk. “I have never heard that from a security expert,” he said.

At least there was pushback against that kind of idiocy.

And from the UK:

Transport Secretary Geoff Hoon has said the government is prepared to go “quite a long way” with civil liberties to “stop terrorists killing people”.

He was responding to criticism of plans for a database of mobile and web records, saying it was needed because terrorists used such communications.

By not monitoring this traffic, it would be “giving a licence to terrorists to kill people”, he said.

I hope there will be similar pushback against this “choice.”

EDITED TO ADD (11/13): Seems like the Philadelphia engineers have another agenda—the cabs in the new trains are too small—and they’re just using security as an excuse.

Posted on October 22, 2008 at 6:44 AM

"Scareware" Vendors Sued

This is good:

Microsoft Corp. and the state of Washington this week filed lawsuits against a slew of “scareware” purveyors, scam artists who use fake security alerts to frighten consumers into paying for worthless computer security software.

The case filed by the Washington attorney general’s office names Texas-based Branch Software and its owner James Reed McCreary IV, alleging that McCreary’s company caused targeted PCs to pop up misleading security alerts about security threats on the victims’ computers. The alerts warned users that their systems were “damaged and corrupted” and instructed them to visit a Web site to purchase a copy of Registry Cleaner XP for $39.95.

I would have thought that existing scam laws would be enough, but Washington state actually has a specific law about this sort of thing:

The lawsuits were filed under Washington’s Computer Spyware Act, which among other things punishes individuals who prey on user concerns regarding spyware or other threats. Specifically, the law makes it illegal to misrepresent the extent to which software is required for computer security or privacy, and it provides actual damages or statutory damages of $100,000 per violation, whichever is greater.

Posted on October 2, 2008 at 7:03 AM

Hand Grenades as Weapons of Mass Destruction

I get that this is terrorism:

A 24-year-old convert to Islam has been sentenced to 35 years in prison for plotting to set off hand grenades in a crowded shopping mall during the Christmas season.

But I thought “weapons of mass destruction” was reserved for nuclear, chemical, and biological weapons.

He was arrested in 2006 on charges of scheming to use weapons of mass destruction at the Cherryvale Mall in the northern Illinois city of Rockford.

Like the continuing cheapening of the word “terrorism,” we are now cheapening the term “weapons of mass destruction.”

Edited: The link above now leads to a revised story that doesn’t use the term “weapons of mass destruction.” A version that does can still be found here.

Posted on October 1, 2008 at 6:37 AM

Monitoring P2P Networks

Interesting paper: “Challenges and Directions for Monitoring P2P File Sharing Networks or Why My Printer Received a DMCA Takedown Notice”:

Abstract—We reverse engineer copyright enforcement in the popular BitTorrent file sharing network and find that a common approach for identifying infringing users is not conclusive. We describe simple techniques for implicating arbitrary network endpoints in illegal content sharing and demonstrate the effectiveness of these techniques experimentally, attracting real DMCA complaints for nonsense devices, e.g., IP printers and a wireless access point. We then step back and evaluate the challenges and possible future directions for pervasive monitoring in P2P file sharing networks.

Webpage on the research.

Posted on August 22, 2008 at 12:08 PM

Memo to the Next President

Obama has a cyber security plan.

It’s basically what you would expect: Appoint a national cyber security advisor, invest in math and science education, establish standards for critical infrastructure, spend money on enforcement, establish national standards for securing personal data and data-breach disclosure, and work with industry and academia to develop a bunch of needed technologies.

I could comment on the plan, but with security the devil is always in the details—and, of course, at this point there are few details. But since he brought up the topic—McCain supposedly is “working on the issues” as well—I have three pieces of policy advice for the next president, whoever he is. They’re too detailed for campaign speeches or even position papers, but they’re essential for improving information security in our society. Actually, they apply to national security in general. And they’re things only government can do.

One, use your immense buying power to improve the security of commercial products and services. One property of technological products is that most of the cost is in the development of the product rather than the production. Think software: The first copy costs millions, but the second copy is free.

You have to secure your own government networks, military and civilian. You have to buy computers for all your government employees. Consolidate those contracts, and start putting explicit security requirements into the RFPs. You have the buying power to get your vendors to make serious security improvements in the products and services they sell to the government, and then we all benefit because they’ll include those improvements in the same products and services they sell to the rest of us. We’re all safer if information technology is more secure, even though the bad guys can use it, too.

Two, legislate results and not methodologies. There are a lot of areas in security where you need to pass laws, where the security externalities are such that the market fails to provide adequate security. For example, software companies that sell insecure products are exploiting an externality just as much as chemical plants that dump waste into the river. But a bad law is worse than no law. A law requiring companies to secure personal data is good; a law specifying what technologies they should use to do so is not. Mandating software liabilities for software failures is good, detailing how is not. Legislate for the results you want and implement the appropriate penalties; let the market figure out how—that’s what markets are good at.

Three, broadly invest in research. Basic research is risky; it doesn’t always pay off. That’s why companies have stopped funding it. Bell Labs is gone because nobody could afford it after the AT&T breakup, but the root cause was a desire for higher efficiency and short-term profitability—not unreasonable in an unregulated business. Government research can be used to balance that by funding long-term research.

Spread those research dollars wide. Lately, most research money has been redirected through DARPA to near-term military-related projects; that’s not good. Keep the earmark-happy Congress from dictating how the money is spent. Let the NSF, NIH and other funding agencies decide how to spend the money and don’t try to micromanage. Give the national laboratories lots of freedom, too. Yes, some research will sound silly to a layman. But you can’t predict what will be useful for what, and if funding is really peer-reviewed, the average results will be much better. Compared to corporate tax breaks and other subsidies, this is chump change.

If our research capability is to remain vibrant, we need more science and math students with decent elementary and high school preparation. The declining interest is partly from the perception that scientists don’t get rich like lawyers and dentists and stockbrokers, but also because science isn’t valued in a country full of creationists. One way the president can help is by trusting scientific advisers and not overruling them for political reasons.

Oh, and get rid of those post-9/11 restrictions on student visas that are causing so many top students to do their graduate work in Canada, Europe and Asia instead of in the United States. Those restrictions will hurt us immensely in the long run.

Those are the three big ones; the rest is in the details. And it’s the details that matter. There are lots of serious issues that you’re going to have to tackle: data privacy, data sharing, data mining, government eavesdropping, government databases, use of Social Security numbers as identifiers, and so on. It’s not enough to get the broad policy goals right. You can have good intentions and enact a good law, and have the whole thing completely gutted by two sentences sneaked in during rulemaking by some lobbyist.

Security is both subtle and complex, and—unfortunately—doesn’t readily lend itself to normal legislative processes. You’re used to finding consensus, but security by consensus rarely works. On the internet, security standards are much worse when they’re developed by a consensus body, and much better when someone just does them. This doesn’t always work—a lot of crap security has come from companies that have “just done it”—but nothing but mediocre standards come from consensus bodies. The point is that you won’t get good security without pissing someone off: The information broker industry, the voting machine industry, the telcos. The normal legislative process makes it hard to get security right, which is why I don’t have much optimism about what you can get done.

And if you’re going to appoint a cyber security czar, you have to give him actual budgetary authority. Otherwise he won’t be able to get anything done, either.

This essay originally appeared on Wired.com.

Posted on August 12, 2008 at 6:36 AM

DMCA Does Not Apply to U.S. Government

According to a recent court ruling, we are all subject to the provisions of the DMCA, but the government is not:

The Court of Federal Claims that first heard the case threw it out, and the new Appellate ruling upholds that decision. The reasoning behind the decisions focuses on the US government’s sovereign immunity, which the court describes thusly: “The United States, as [a] sovereign, ‘is immune from suit save as it consents to be sued . . . and the terms of its consent to be sued in any court define that court’s jurisdiction to entertain the suit.'”

In the case of copyright law, the US has given up much of its immunity, but the government retains a few noteworthy exceptions. The one most relevant to this case says that when a government employee is in a position to induce the use of the copyrighted material, “[the provision] does not provide a Government employee a right of action ‘where he was in a position to order, influence, or induce use of the copyrighted work by the Government.'” Given that Davenport used his position as part of the relevant Air Force office to get his peers to use his software, the case fails this test.

But the court also addressed the DMCA claims made by Blueport, and its decision here is quite striking. “The DMCA itself contains no express waiver of sovereign immunity,” the judge wrote, “Indeed, the substantive prohibitions of the DMCA refer to individual persons, not the Government.” Thus, because sovereign immunity is not explicitly eliminated, and the phrasing of the statute does not mention organizations, the DMCA cannot be applied to the US government, even in cases where the more general immunity to copyright claims does not apply.

It appears that Congress took a “do as we say, not as we need to do” approach to strengthening digital copyrights.

Posted on August 8, 2008 at 11:32 AM

Why You Should Never Talk to the Police

This is an engaging and fascinating video presentation by Professor James Duane of the Regent University School of Law, explaining why—in a criminal matter—you should never, ever, ever talk to the police or any other government agent. It doesn’t matter if you’re guilty or innocent, if you have an alibi or not—it isn’t possible for anything you say to help you, and it’s very possible that innocuous things you say will hurt you.

Definitely worth half an hour of your time.

And this is a video of Virginia Beach Police Department Officer George Bruch, who basically says that Duane is right.

Posted on July 31, 2008 at 12:52 PM

