Entries Tagged "courts"


E-Mail Interception Decision Reversed

Is e-mail in transit a communication, or is it data in storage? Seems like a basic question, but the answer matters a lot to the police. A U.S. federal appeals court has ruled that the interception of e-mail in temporary storage violates the federal Wiretap Act, reversing an earlier court opinion.

The case and associated privacy issues are summarized here. Basically, different privacy laws protect electronic communications in transit and data in storage; the former is protected much more than the latter. E-mail stored by the sender or the recipient is obviously data in storage. But what about e-mail on its way from the sender to the receiver? On the one hand, it’s obviously a communication in transit. But the other side argued that it’s actually stored on various computers as it wends its way through the Internet; hence it’s data in storage.
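The technical kernel of that argument is that Internet e-mail is store-and-forward: each relay typically writes a message to a local queue before passing it along. Here is a toy sketch of that pattern; it is not how any particular mail server is implemented, and the message contents and file prefix are made up.

```python
# Toy illustration of store-and-forward e-mail relaying; not a real MTA.
import os
import tempfile


def relay(message: bytes, next_hop) -> None:
    # 1. The relay first spools the message to local storage...
    fd, spool_path = tempfile.mkstemp(prefix="mailspool-")
    with os.fdopen(fd, "wb") as spool:
        spool.write(message)
    # 2. ...and only then forwards it toward the recipient.
    try:
        next_hop(message)
    finally:
        os.remove(spool_path)  # the temporary copy is deleted after forwarding


if __name__ == "__main__":
    relay(b"Subject: hello\r\n\r\nHi there.\r\n",
          next_hop=lambda m: print("forwarded", len(m), "bytes"))
```

For a fraction of a second the message is “data in storage” on every hop, even though end to end it is plainly a communication in transit; that momentary spooling is the hook the “storage” argument hangs on.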

The initial court decision in this case held that e-mail in transit is just data in storage. Judge Lipez wrote an inspired dissent from the original opinion. In the rehearing en banc (before a larger panel of judges), he wrote the majority opinion that overturned the earlier one.

The opinion itself is long, but well worth reading. It’s well reasoned, and reflects extraordinary understanding and attention to detail. And a great last line:

If the issue presented be “garden-variety”… this is a garden in need of a weed killer.

I participated in an Amicus Curiae (“friend of the court”) brief in the case. Here’s another amicus brief by six civil liberties organizations.

There’s a larger issue here, and it’s the same one that the entertainment industry used to greatly expand copyright law in cyberspace. They argued that every time a copyrighted work is moved from computer to computer, or CD-ROM to RAM, or server to client, or disk drive to video card, a “copy” is being made. This ridiculous definition of “copy” has allowed them to exert far greater legal control over how people use copyrighted works.

Posted on August 15, 2005 at 7:59 AM

The MD5 Defense

This is interesting:

A team of Chinese maths enthusiasts have thrown NSW’s speed cameras system into disarray by cracking the technology used to store data about errant motorists.

The NRMA has called for a full audit of the way the state’s 110 enforcement cameras are used after a motorist escaped a conviction by claiming that data was vulnerable to hackers.

A Sydney magistrate, Laurence Lawson, threw out the case because the Roads and Traffic Authority failed to find an expert to testify that its speed camera images were secure.

The motorist’s defence lawyer, Denis Mirabilis, argued successfully that an algorithm known as MD5, which is used to store the time, date, place, numberplate and speed of cars caught on camera, was a discredited piece of technology.

It’s true that MD5 is broken. On the other hand, it’s almost certainly true that the speed cameras were correct. If there’s any lesson here, it’s that theoretical security is important in legal proceedings.

I think that’s a good thing.
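For readers wondering what “MD5 is broken” means in practice: the published attacks are collision attacks, which require the attacker to construct both colliding messages; they do not let anyone forge a record that matches an existing camera image’s hash. Here is a minimal sketch, with entirely hypothetical record fields, of the kind of integrity check at issue and an obvious stronger alternative.

```python
# Minimal sketch of hashing a speed-camera record; the fields and values
# below are hypothetical, not the RTA's actual data format.
import hashlib

record = "2005-08-11|07:52|Parramatta Rd|ABC-123|87 km/h"

md5_digest = hashlib.md5(record.encode()).hexdigest()
sha256_digest = hashlib.sha256(record.encode()).hexdigest()

print("MD5:    ", md5_digest)
print("SHA-256:", sha256_digest)

# MD5's known weakness is collisions: an attacker who controls *both* inputs
# can produce two different messages with the same digest. Altering an
# existing record so that it still matches its original MD5 digest would be
# a (second-)preimage attack, which the published breaks do not provide.
# That is why the camera data was almost certainly still accurate, even
# though a court can reasonably insist on an unbroken hash such as SHA-256.
```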

Posted on August 11, 2005 at 7:52 AM

Cisco Harasses Security Researcher

I’ve written about full disclosure, and how disclosing security vulnerabilities is our best mechanism for improving security—especially in a free-market system. (That essay is also worth reading for a general discussion of the security trade-offs.) I’ve also written about how security companies treat vulnerabilities as public-relations problems first and technical problems second. This week at BlackHat, security researcher Michael Lynn and Cisco demonstrated both points.

Lynn was going to present security flaws in Cisco’s IOS, and Cisco went to inordinate lengths to make sure that information never got into the hands of their consumers, the press, or the public.

Cisco threatened legal action to stop the conference’s organizers from allowing a 24-year-old researcher for a rival tech firm to discuss how he says hackers could seize control of Cisco’s Internet routers, which dominate the market. Cisco also instructed workers to tear 20 pages outlining the presentation from the conference program and ordered 2,000 CDs containing the presentation destroyed.

In the end, the researcher, Michael Lynn, went ahead with a presentation, describing flaws in Cisco’s software that he said could allow hackers to take over corporate and government networks and the Internet, intercepting and misdirecting data communications. Mr. Lynn, wearing a white hat emblazoned with the word “Good,” spoke after quitting his job at Internet Security Systems Inc. Wednesday. Mr. Lynn said he resigned because ISS executives had insisted he strike key portions of his presentation.

Unable to censor the information, Cisco decided to act as if it were no big deal:

In a release shortly after the presentation, Cisco stated, “It is important to note that the information Lynn presented was not a disclosure of a new vulnerability or a flaw with Cisco IOS software. Lynn’s research explores possible ways to expand exploitations of known security vulnerabilities impacting routers.” And went on to state “Cisco believes that the information Lynn presented at the Blackhat conference today contained proprietary information and was illegally obtained.” The statement also refers to the fact that Lynn stated in his presentation that he used a popular file decompressor to ‘unzip’ the Cisco image before reverse engineering it and finding the flaw, which is against Cisco’s use agreement.

The Cisco propaganda machine is certainly working overtime this week.

The security implications of this are enormous. If companies have the power to censor information about their products they don’t like, then we as consumers have less information with which to make intelligent buying decisions. If companies have the power to squelch vulnerability information about their products, then there’s no incentive for them to improve security. (I’ve written about this in connection to physical keys and locks.) If free speech is subordinate to corporate demands, then we are all much less safe.

Full disclosure is good for society. But because it helps the bad guys as well as the good guys (see my essay on secrecy and security for more discussion of the balance), many of us have championed “responsible disclosure” guidelines that give vendors a head start in fixing vulnerabilities before they’re announced.

The problem is that not all researchers follow these guidelines. And laws limiting free speech do more harm to society than good. (In any case, laws won’t completely fix the problem; we can’t get laws passed in every country where security researchers live.) So the only reasonable course of action for a company is to work with researchers who alert them to vulnerabilities, but also to assume that vulnerability information will sometimes be released without prior warning.

I can’t imagine the discussions inside Cisco that led them to act like thugs. I can’t figure out why they decided to attack Michael Lynn, BlackHat, and ISS rather than turn the situation into a public-relations success. I can’t believe that they thought they could have censored the information by their actions, or even that it was a good idea.

Cisco’s customers want information. They don’t expect perfection, but they want to know the extent of problems and what Cisco is doing about them. They don’t want to know that Cisco tries to stifle the truth:

Joseph Klein, senior security analyst at the aerospace electronic systems division for Honeywell Technology Solutions, said he helped arrange a meeting between government IT professionals and Lynn after the talk. Klein said he was furious that Cisco had been unwilling to disclose the buffer-overflow vulnerability in unpatched routers. “I can see a class-action lawsuit against Cisco coming out of this,” Klein said.

ISS didn’t come out of this looking very good, either:

“A few years ago it was rumored that ISS would hold back on certain things because (they’re in the business of) providing solutions,” [Ali-Reza] Anghaie, [a senior security engineer with an aerospace firm, who was in the audience,] said. “But now you’ve got full public confirmation that they’ll submit to the will of a Cisco or Microsoft, and that’s not fair to their customers…. If they’re willing to back down and leave an employee … out to hang, well what are they going to do for customers?”

Despite their thuggish behavior, this has been a public-relations disaster for Cisco. Now it doesn’t matter what they say—we won’t believe them. We know that the public-relations department handles their security vulnerabilities, and not the engineering department. We know that they think squelching information and muzzling researchers is more important than informing the public. They could have shown that they put their customers first, but instead they demonstrated that short-sighted corporate interests are more important than being a responsible corporate citizen.

And these are the people building the hardware that runs much of our infrastructure? Somehow, I don’t feel very secure right now.

EDITED TO ADD: I am impressed with Lynn’s personal integrity in this matter:

When Mr. Lynn took the stage yesterday, he was introduced as speaking on a different topic, eliciting boos. But those turned to cheers when he asked, “Who wants to hear about Cisco?” As he got started, Mr. Lynn said, “What I just did means I’m about to get sued by Cisco and ISS. Not to put too fine a point on it, but bring it on.”

And this:

Lynn closed his talk by directing the audience to his resume and asking if anyone could give him a job.

“In large part I had to quit to give this presentation because ISS and Cisco would rather the world be at risk, I guess,” Lynn said. “They had to do what’s right for their shareholders; I understand that. But I figured I needed to do what’s right for the country and for the national critical infrastructure.”

There’s a lawsuit against him. I’ll let you know if there’s a legal defense fund.

EDITED TO ADD: The lawsuit has been settled. Some details:

Michael Lynn, a former ISS researcher, and the Black Hat organisers agreed to a permanent injunction barring them from further discussing the presentation Lynn gave on Wednesday. The presentation showed how attackers could take over Cisco routers, a problem that Lynn said could bring the Internet to its knees.

The injunction also requires Lynn to return any materials and disassembled code related to Cisco, according to a copy of the injunction, which was filed in US District Court for the District of Northern California. The injunction was agreed on by attorneys for Lynn, Black Hat, ISS and Cisco.

Lynn is also forbidden to make any further presentations at the Black Hat event, which ended on Thursday, or the following Defcon event. Additionally, Lynn and Black Hat have agreed never to disseminate a video made of Lynn’s presentation and to deliver to Cisco any video recording made of Lynn.

My hope is that Cisco realized that continuing with this would be a public-relations disaster.

EDITED TO ADD: Lynn’s BlackHat presentation is on line.

EDITED TO ADD: The FBI is getting involved.

EDITED TO ADD: The link to the presentation, above, has been replaced with a cease-and-desist letter. A copy of the presentation is now here.

Posted on July 29, 2005 at 4:35 AM

UK Police and Encryption

From The Guardian:

Police last night told Tony Blair that they need sweeping new powers to counter the terrorist threat, including the right to detain a suspect for up to three months without charge instead of the current 14 days….

They also want to make it a criminal offence for suspects to refuse to cooperate in giving the police full access to computer files by refusing to disclose their encryption keys.

On Channel 4 News today, Sir Ian Blair was asked why the police wanted to extend the time they could hold someone without charges from 14 days to 3 months. Part of his answer was that they sometimes needed to access encrypted computer files and 14 days was not enough time for them to break the encryption.

There’s something fishy going on here.

It’s certainly possible that password-guessing programs are more successful with three months to guess. But the Regulation of Investigatory Powers (RIP) Act, which went into effect in 2000, already allows the police to jail people who don’t surrender encryption keys:

If intercepted communications are encrypted (encoded and made secret), the act will force the individual to surrender the keys (pin numbers which allow users to decipher encoded data), on pain of jail sentences of up to two years.
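To put the password-guessing rationale in perspective, here is a rough calculation; the guess rate is an arbitrary assumption, chosen only to make the arithmetic concrete.

```python
# Back-of-the-envelope: how much more guessing does 90 days buy over 14 days?
import math

guesses_per_second = 1e9            # assumed rate; real rates vary enormously
seconds_per_day = 24 * 3600

guesses_14_days = guesses_per_second * 14 * seconds_per_day
guesses_90_days = guesses_per_second * 90 * seconds_per_day

extra_factor = guesses_90_days / guesses_14_days   # ~6.4x
extra_bits = math.log2(extra_factor)               # ~2.7 bits

print(f"90 days allows {extra_factor:.1f}x as many guesses as 14 days")
print(f"That covers only about {extra_bits:.1f} extra bits of password entropy")
```

Against any reasonably strong passphrase, an extra factor of six, less than three bits of entropy, is negligible, which is one more reason the stated justification doesn’t add up.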

Posted on July 27, 2005 at 3:00 PM

Domestic Terrorism (U.S.)

Nice MSNBC piece on domestic terrorism in the U.S.:

The sentencing of Eric Rudolph, who bombed abortion clinics, a gay bar and the Atlanta Olympics, ought to be a milestone in the Global War on Terror. In Birmingham, Ala., on Monday he got life without parole. Next month he’ll stack up a couple more life terms in Georgia, which is the least he deserves. (He escaped the death penalty only because he made a deal to help law-enforcement agents find the explosives he had hidden while on the run in North Carolina.) Rudolph killed two people, but not for want of trying to kill many more. In his 1997 attack on an Atlanta abortion clinic, he set off a second bomb meant to take out bystanders and rescue workers. Unrepentant, of course, Rudolph defended his actions as a moral imperative: “Abortion is murder, and because it is murder I believe deadly force is needed to stop it.” The Birmingham prosecutor declared that Rudolph had “appointed himself judge, jury and executioner.”

Indeed. That’s what all terrorists have in common: the four lunatics in London earlier this month; the 19 men who attacked America on September 11, 2001; Timothy McVeigh in Oklahoma City, and many others. They were all convinced they had noble motives for wreaking their violence. Terrorists are very righteous folks. Which is why the real global war we’re fighting, let’s be absolutely clear, should be one of our shared humanity against the madness of people like these; the rule of man-made laws on the books against the divine law they imagine for themselves. It’s the cause of reason against unreason, of self-criticism against the firm convictions of fanaticism.

David Neiwert has some good commentary on the topic. He also points to this U.S. News and World Report article.

Posted on July 25, 2005 at 9:04 PM

Security Risks of Airplane WiFi

I’ve already written about the stupidity of worrying about cell phones on airplanes. Now the Department of Homeland Security is worried about broadband Internet.

Federal law enforcement officials, fearful that terrorists will exploit emerging in-flight broadband services to remotely activate bombs or coordinate hijackings, are asking regulators for the power to begin eavesdropping on any passenger’s internet use within 10 minutes of obtaining court authorization.

In joint comments filed with the FCC last Tuesday, the Justice Department, the FBI and the Department of Homeland Security warned that a terrorist could use on-board internet access to communicate with confederates on other planes, on the ground or in different sections of the same plane—all from the comfort of an aisle seat.

“There is a short window of opportunity in which action can be taken to thwart a suicidal terrorist hijacking or remedy other crisis situations on board an aircraft, and law enforcement needs to maximize its ability to respond to these potentially lethal situations,” the filing reads.

Terrorists never use SSH, after all. (I suppose that’s the next thing the DHS is going to try to ban.)

Posted on July 14, 2005 at 12:02 PM

Defining "Access" in Cyberspace

I’ve been reading a lot of law journal articles. It’s interesting to read legal analyses of some of the computer security problems I’ve been wrestling with.

This is a fascinating paper on the concepts of “access” and “authorized access” in cyberspace. The abstract:

In the last twenty-five years, the federal government and all fifty states have enacted new criminal laws that prohibit unauthorized access to computers. These new laws attempt to draw a line between criminality and free conduct in cyberspace. No one knows what it means to access a computer, however, nor when access becomes unauthorized. The few courts that have construed these terms have offered divergent interpretations, and no scholars have yet addressed the problem. Recent decisions interpreting the federal statute in civil cases suggest that any breach of contract with a computer owner renders use of that computer an unauthorized access. If applied to criminal cases, this approach would broadly criminalize contract law on the Internet, potentially making millions of Americans criminals for the way they write e-mail and surf the Web.

This Article presents a comprehensive inquiry into the meaning of unauthorized access statutes. It begins by explaining why legislatures enacted unauthorized access statutes, and why early beliefs that such statutes solved the problem of computer misuse have proved remarkably naïve. Next, the Article explains how the courts have construed these statutes in an overly broad way that threatens to criminalize a surprising range of innocuous conduct involving computers. In the final section, the Article offers a normative proposal for interpreting access and authorization. This section argues that courts should reject a contract theory of authorization, and should narrow the scope of unauthorized access statutes to circumvention of code-based restrictions on computer privileges. The section justifies this proposal on several grounds. First, the proposal will best mediate the line between securing privacy and protecting the liberty of Internet users. Second, the proposal mirrors criminal law’s traditional treatment of crimes that contain a consent element. Third, the proposed approach is consistent with the basic theories of punishment. Fourth, the proposed interpretation avoids possible constitutional difficulties that may arise under the broader constructions that courts recently have favored.

It’s a long paper, but I recommend reading it if you’re interested in the legal concepts.

Posted on June 14, 2005 at 7:16 AM

U.S. Medical Privacy Law Gutted

In the U.S., medical privacy is largely governed by a 1996 law called HIPAA. Among many other provisions, HIPAA regulates the privacy and security surrounding electronic medical records. HIPAA specifies civil penalties against companies that don’t comply with the regulations, as well as criminal penalties against individuals and corporations who knowingly steal or misuse patient data.

The civil penalties have long been viewed as irrelevant by the health care industry. Now the criminal penalties have been gutted:

An authoritative new ruling by the Justice Department sharply limits the government’s ability to prosecute people for criminal violations of the law that protects the privacy of medical records.

The criminal penalties, the department said, apply to insurers, doctors, hospitals and other providers—but not necessarily their employees or outsiders who steal personal health data.

In short, the department said, people who work for an entity covered by the federal privacy law are not automatically covered by that law and may not be subject to its criminal penalties, which include a $250,000 fine and 10 years in prison for the most serious violations.

This is a complicated issue. Peter Swire worked extensively on this bill as the President’s Chief Counselor for Privacy, and I am going to quote him extensively. First, a story about someone who was convicted under the criminal part of this statute.

In 2004 the U.S. Attorney in Seattle announced that Richard Gibson was being indicted for violating the HIPAA privacy law. Gibson was a phlebotomist, a lab assistant, in a hospital. While at work he accessed the medical records of a person with a terminal cancer condition. Gibson then got credit cards in the patient’s name and ran up over $9,000 in charges, notably for video game purchases. In a statement to the court, the patient said he “lost a year of life both mentally and physically dealing with the stress” of dealing with collection agencies and other results of Gibson’s actions. Gibson signed a plea agreement and was sentenced to 16 months in jail.

According to this Justice Department ruling, Gibson was wrongly convicted. I presume his attorney is working on the matter, and I hope he can be re-tried under our identity theft laws. But because Gibson (or someone else like him) was working in his official capacity, he cannot be prosecuted under HIPAA. And because Gibson (or someone like him) was doing something not authorized by his employer, the hospital cannot be prosecuted under HIPAA.

The healthcare industry has been opposed to HIPAA from the beginning, because it puts constraints on their business in the name of security and privacy. This ruling comes after intense lobbying by the industry at the Department of Health and Human Services and the Justice Department, and is the result of an HHS request for an opinion.

From Swire’s analysis of the Justice Department ruling:

For a law professor who teaches statutory interpretation, the OLC opinion is terribly frustrating to read. The opinion reads like a brief for one side of an argument. Even worse, it reads like a brief that knows it has the losing side but has to come out with a predetermined answer.

I’ve been to my share of HIPAA security conferences. To the extent that big health is following the HIPAA law—and to a large extent, they’re waiting to see how it’s enforced—they are doing so because of the criminal penalties. They know that the civil penalties aren’t that large, and are a cost of doing business. But the criminal penalties were real. Now that they’re gone, the pressure on big health to protect patient privacy is greatly diminished.

Again Swire:

The simplest explanation for the bad OLC opinion is politics. Parts of the health care industry lobbied hard to cancel HIPAA in 2001. When President Bush decided to keep the privacy rule—quite possibly based on his sincere personal views—the industry efforts shifted direction. Industry pressure has stopped HHS from bringing a single civil case out of the 13,000 complaints. Now, after a U.S. Attorney’s office had the initiative to prosecute Mr. Gibson, senior officials in Washington have clamped down on criminal enforcement. The participation of senior political officials in the interpretation of a statute, rather than relying on staff attorneys, makes this political theory even more convincing.

This kind of thing is bigger than the security of the healthcare data of Americans. Our administration is trying to collect more data in its attempt to fight terrorism. Part of that is convincing people—both Americans and foreigners—that this data will be protected. When we gut privacy protections because they might inconvenience business, we’re telling the world that privacy isn’t one of our core concerns.

If the administration doesn’t believe that we need to follow its medical data privacy rules, what makes you think they’re following the FISA rules?

Posted on June 7, 2005 at 12:15 PM
