Entries Tagged "courts"


ATM Fraud and British Banks

An absolutely great story about phantom ATM withdrawals and British banking from the early 1990s. (The events are from the early 1990s; the story has only just become public.) Read how a very brittle security system, coupled with banks using the legal system to avoid fixing the problem, resulted in lots of innocent people losing money to phantom withdrawals. Read how lucky everyone was that the catastrophic security problem was never discovered by criminals. It’s an amazing story.

See also Ross Anderson’s page on phantom withdrawals.

Oh, and Alistair Kelman assures me that he did not charge 1,750 pounds per hour, only 450 pounds per hour.

Posted on October 24, 2005 at 7:16 AM

Liabilities and Software Vulnerabilities

My fourth column for Wired discusses liability for software vulnerabilities. Howard Schmidt argued that individual programmers should be liable for vulnerabilities in their code. (There’s a Slashdot thread on Schmidt’s comments.) I say that it is the software vendors, not the individual programmers, that should be liable.

Click on the essay for the whole argument, but here’s the critical point:

If end users can sue software manufacturers for product defects, then the cost of those defects to the software manufacturers rises. Manufacturers are now paying the true economic cost for poor software, and not just a piece of it. So when they’re balancing the cost of making their software secure versus the cost of leaving their software insecure, there are more costs on the latter side. This will provide an incentive for them to make their software more secure.

To be sure, making software more secure will cost money, and manufacturers will have to pass those costs on to users in the form of higher prices. But users are already paying extra costs for insecure software: costs of third-party security products, costs of consultants and security-services companies, direct and indirect costs of losses. Making software manufacturers liable moves those costs around, and as a byproduct causes the quality of software to improve.
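
To make the cost-balancing concrete, here is a hypothetical back-of-the-envelope version of the argument; all of the numbers are invented for illustration and are not from the essay.

    # All numbers are invented for illustration.
    fix_cost = 10            # what it costs the vendor to make the software secure
    vendor_defect_cost = 5   # losses the vendor currently bears for insecure software
    user_defect_cost = 50    # losses currently borne by users (the externality)

    def vendor_fixes(liable: bool) -> bool:
        """The vendor fixes the software only when fixing is cheaper than not fixing."""
        cost_of_not_fixing = vendor_defect_cost + (user_defect_cost if liable else 0)
        return fix_cost < cost_of_not_fixing

    print(vendor_fixes(liable=False))  # False: without liability, skipping the fix is cheaper
    print(vendor_fixes(liable=True))   # True: with liability, fixing becomes the cheaper option

Liability does not change the total losses in this toy example; it changes who bears them, and that is what changes the vendor’s decision.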

This is why Schmidt’s idea won’t work. He wants individual software developers to be liable, and not the corporations. This will certainly give pissed-off users someone to sue, but it won’t reduce the externality and it won’t result in more-secure software.

EDITED TO ADD: Dan Farber has a good commentary on my essay. He says I got Schmidt wrong, that Schmidt wants programmers to be accountable but not liable. Be that as it may, I still think that making software vendors liable is a good idea.

There has been some confusion about this in the comments, that somehow this means that software vendors will be expected to achieve perfection and that they will be 100% liable for anything short of that. Clearly that’s ridiculous, and that’s not the way liabilities work. But equally ridiculous is the notion that software vendors should be 0% liable for defects. Somewhere in the middle there is a reasonable amount of liability, and that’s what I want the courts to figure out.

EDITED TO ADD: Howard Schmidt writes: “It is unfortunate that my comments were reported inaccurately; at least Dan Farber has been trying to correct the inaccurate reports with his blog. I do not support PERSONAL LIABILITY for the developers NOR do I support liability against vendors. Vendors are nothing more then people (employees included) and anything against them hurts the very people who need to be given better tools, training and support.”

Howard wrote an essay on the topic.

Posted on October 20, 2005 at 5:19 AM

Judge Roberts, Privacy, and the Future

My second essay for Wired was published today. It’s about the future privacy rulings of the Supreme Court:

Recent advances in technology have already had profound privacy implications, and there’s every reason to believe that this trend will continue into the foreseeable future. Roberts is 50 years old. If confirmed, he could be chief justice for the next 30 years. That’s a lot of future.

Privacy questions will arise from government actions in the “War on Terror”; they will arise from the actions of corporations and individuals. They will include questions of surveillance, profiling and search and seizure. And the decisions of the Supreme Court on these questions will have a profound effect on society.

Posted on September 22, 2005 at 12:28 PM

DUI Cases Thrown Out Due to Closed-Source Breathalyzer

Really:

Hundreds of cases involving breath-alcohol tests have been thrown out by Seminole County judges in the past five months because the test’s manufacturer will not disclose how the machines work.

I think this is huge. (Think of the implications for voting systems, for one.) And it’s the right decision. Throughout history, the government has had to make the choice: prosecute, or keep its investigative methods secret. It couldn’t have both. If it wanted to keep its methods secret, it had to give up on prosecution.

People have the right to confront their accuser. And people have the right to a public trial. This is the correct decision, and we are all safer because of it.

Posted on September 16, 2005 at 6:46 AM

Criminals Learn Forensic Science

Criminals are adapting to advances in forensic science:

There is an increasing trend for criminals to use plastic gloves during break-ins and condoms during rapes to avoid leaving their DNA at the scene. Dostie describes a murder case in which the assailant tried to wash away his DNA using shampoo. Police in Manchester in the UK say that car thieves there have started to dump cigarette butts from bins in stolen cars before they abandon them. “Suddenly the police have 20 potential people in the car,” says Rutty.

The article also talks about forensic-science television shows changing the expectations of jurors.

“Jurors who watch CSI believe that those scenarios, where forensic scientists are always right, are what really happens,” says Peter Bull, a forensic sedimentologist at the University of Oxford. It means that in court, juries are not impressed with evidence presented in cautious scientific terms.

Detective sergeant Paul Dostie, of Mammoth Lakes Police Department, California, found the same thing when he conducted a straw poll of forensic investigators and prosecutors. “They all agree that jurors expect more because of CSI shows,” he says. And the “CSI effect” goes beyond juries, says Jim Fraser, director of the Centre for Forensic Science at the University of Strathclyde, UK. “Oversimplification of interpretations on CSI has led to false expectations, especially about the speed of delivery of forensic evidence,” he says.

Posted on September 9, 2005 at 7:16 AM

The Kutztown 13

Thirteen Pennsylvania high-school kids—the Kutztown 13—are being charged with felonies:

They’re being called the Kutztown 13—a group of high schoolers charged with felonies for bypassing security with school-issued laptops, downloading forbidden internet goodies and using monitoring software to spy on district administrators.

The students, their families and outraged supporters say authorities are overreacting, punishing the kids not for any heinous behavior—no malicious acts are alleged—but rather because they outsmarted the district’s technology workers….

The trouble began last fall after the district issued some 600 Apple iBook laptops to every student at the high school about 50 miles northwest of Philadelphia. The computers were loaded with a filtering program that limited Internet access. They also had software that let administrators see what students were viewing on their screens.

But those barriers proved easily surmountable: The administrative password that allowed students to reconfigure computers and obtain unrestricted Internet access was easy to obtain. A shortened version of the school’s street address, the password was taped to the backs of the computers.

The password got passed around and students began downloading such forbidden programs as the popular iChat instant-messaging tool.

At least one student viewed pornography. Some students also turned off the remote monitoring function and turned the tables on their elders, using it to view administrators’ own computer screens.

There’s more to the story, though. Here’s some good commentary on the issue:

What the parents don’t mention—but the school did in a press release—is that it wasn’t as if the school came down with the Hammer of God out of nowhere.

These kids were caught and punished for doing this stuff, and their parents informed.

Over and over.

Quoth the release:

“Unfortunately, after repeated warnings and disciplinary actions, a few students continued to misuse the school-issued laptops to varying degrees. The disciplinary actions included detentions, in-school suspensions, loss of Internet access, and loss of computer privileges. After each disciplinary action, parents received either written notification or telephone calls.”

What was the parents’ reaction to those disciplinary actions? Some of them complained that—despite signing a document agreeing to the acceptable use policy—the kids should be able to do whatever they wanted to with the free machines.

“We signed it, but we didn’t mean it”?

Yes, the kids should be punished. No, a felony conviction is not the way to punish them.

The problem is that the punishment doesn’t fit the crime. Breaking the rules is what kids do. Society needs to deal with that, yes, but it needs to deal with that in a way that doesn’t ruin lives. Deterrence is critical if we are to ever have a lawful society on the internet, but deterrence has to come from rational prosecution. This simply isn’t rational.

EDITED TO ADD (2 Sep): It seems that charges have been dropped.

Posted on August 22, 2005 at 6:56 AM

E-Mail Interception Decision Reversed

Is e-mail in transit communications or data in storage? Seems like a basic question, but the answer matters a lot to the police. A U.S. federal Appeals Court has ruled that the interception of e-mail in temporary storage violates the federal wiretap act, reversing an earlier court opinion.

The case and associated privacy issues are summarized here. Basically, different privacy laws protect electronic communications in transit and data in storage; the former is protected much more than the latter. E-mail stored by the sender or the recipient is obviously data in storage. But what about e-mail on its way from the sender to the receiver? On the one hand, it’s obviously communications in transit. But the other side argued that it’s actually stored on various computers as it wends its way through the Internet; hence it’s data in storage.

The initial court decision in this case held that e-mail in transit is just data in storage. Judge Lipez wrote an inspired dissent in the original opinion. In the rehearing en banc (more judges), he wrote the opinion for the majority, which overturned the earlier one.

The opinion itself is long, but well worth reading. It’s well reasoned, and reflects extraordinary understanding and attention to detail. And a great last line:

If the issue presented be “garden-variety”… this is a garden in need of a weed killer.

I participated in an Amicus Curiae (“friend of the court”) brief in the case. Here’s another amicus brief by six civil liberties organizations.

There’s a larger issue here, and it’s the same one that the entertainment industry used to greatly expand copyright law in cyberspace. They argued that every time a copyrighted work is moved from computer to computer, or CD-ROM to RAM, or server to client, or disk drive to video card, a “copy” is being made. This ridiculous definition of “copy” has allowed them to exert far greater legal control over how people use copyrighted works.

Posted on August 15, 2005 at 7:59 AM

The MD5 Defense

This is interesting:

A team of Chinese maths enthusiasts have thrown NSW’s speed cameras system into disarray by cracking the technology used to store data about errant motorists.

The NRMA has called for a full audit of the way the state’s 110 enforcement cameras are used after a motorist escaped a conviction by claiming that data was vulnerable to hackers.

A Sydney magistrate, Laurence Lawson, threw out the case because the Roads and Traffic Authority failed to find an expert to testify that its speed camera images were secure.

The motorist’s defence lawyer, Denis Mirabilis, argued successfully that an algorithm known as MD5, which is used to store the time, date, place, numberplate and speed of cars caught on camera, was a discredited piece of technology.

It’s true that MD5 is broken. On the other hand, it’s almost certainly true that the speed cameras were correct. If there’s any lesson here, it’s that theoretical security is important in legal proceedings.

I think that’s a good thing.
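
For readers wondering how a hash fits into such a system, here is a minimal sketch; the record format and values are invented and are not taken from the actual RTA system. The digest stored with each record lets later tampering be detected. The published breaks of MD5 are collision attacks, where an attacker constructs two inputs with the same digest; altering an existing record so that it still matches a stored digest would require a second-preimage attack, which is not known to be practical. That is part of why the readings themselves were almost certainly accurate.

    import hashlib

    # Hypothetical camera record; the real record format is an assumption here.
    record = "2005-06-01 08:41:03|Pacific Hwy|ABC-123|87 km/h"

    # Digest computed and stored alongside the record when it is captured.
    stored_digest = hashlib.md5(record.encode("utf-8")).hexdigest()

    def unchanged(record: str, stored_digest: str) -> bool:
        """Return True if the record still matches the digest stored with it."""
        return hashlib.md5(record.encode("utf-8")).hexdigest() == stored_digest

    print(unchanged(record, stored_digest))                        # True
    print(unchanged(record.replace("87", "60"), stored_digest))    # False: tampering detected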

Posted on August 11, 2005 at 7:52 AM

Cisco Harasses Security Researcher

I’ve written about full disclosure, and how disclosing security vulnerabilities is our best mechanism for improving security—especially in a free-market system. (That essay is also worth reading for a general discussion of the security trade-offs.) I’ve also written about how security companies treat vulnerabilities as public-relations problems first and technical problems second. This week at BlackHat, security researcher Michael Lynn and Cisco demonstrated both points.

Lynn was going to present security flaws in Cisco’s IOS, and Cisco went to inordinate lengths to make sure that information never got into the hands of their consumers, the press, or the public.

Cisco threatened legal action to stop the conference’s organizers from allowing a 24-year-old researcher for a rival tech firm to discuss how he says hackers could seize control of Cisco’s Internet routers, which dominate the market. Cisco also instructed workers to tear 20 pages outlining the presentation from the conference program and ordered 2,000 CDs containing the presentation destroyed.

In the end, the researcher, Michael Lynn, went ahead with a presentation, describing flaws in Cisco’s software that he said could allow hackers to take over corporate and government networks and the Internet, intercepting and misdirecting data communications. Mr. Lynn, wearing a white hat emblazoned with the word “Good,” spoke after quitting his job at Internet Security Systems Inc. Wednesday. Mr. Lynn said he resigned because ISS executives had insisted he strike key portions of his presentation.

Not being able to censor the information, Cisco decided to act as if it were no big deal:

In a release shortly after the presentation, Cisco stated, “It is important to note that the information Lynn presented was not a disclosure of a new vulnerability or a flaw with Cisco IOS software. Lynn’s research explores possible ways to expand exploitations of known security vulnerabilities impacting routers.” And went on to state “Cisco believes that the information Lynn presented at the Blackhat conference today contained proprietary information and was illegally obtained.” The statement also refers to the fact that Lynn stated in his presentation that he used a popular file decompressor to ‘unzip’ the Cisco image before reverse engineering it and finding the flaw, which is against Cisco’s use agreement.

The Cisco propaganda machine is certainly working overtime this week.

The security implications of this are enormous. If companies have the power to censor information about their products they don’t like, then we as consumers have less information with which to make intelligent buying decisions. If companies have the power to squelch vulnerability information about their products, then there’s no incentive for them to improve security. (I’ve written about this in connection to physical keys and locks.) If free speech is subordinate to corporate demands, then we are all much less safe.

Full disclosure is good for society. But because it helps the bad guys as well as the good guys (see my essay on secrecy and security for more discussion of the balance), many of us have championed “responsible disclosure” guidelines that give vendors a head start in fixing vulnerabilities before they’re announced.

The problem is that not all researchers follow these guidelines. And laws limiting free speech do more harm to society than good. (In any case, laws won’t completely fix the problem; we can’t get laws passed in every possible country where security researchers live.) So the only reasonable course of action for a company is to work with researchers who alert them to vulnerabilities, but also assume that vulnerability information will sometimes be released without prior warning.

I can’t imagine the discussions inside Cisco that led them to act like thugs. I can’t figure out why they decided to attack Michael Lynn, BlackHat, and ISS rather than turn the situation into a public-relations success. I can’t believe that they thought they could have censored the information by their actions, or even that it was a good idea.

Cisco’s customers want information. They don’t expect perfection, but they want to know the extent of problems and what Cisco is doing about them. They don’t want to know that Cisco tries to stifle the truth:

Joseph Klein, senior security analyst at the aerospace electronic systems division for Honeywell Technology Solutions, said he helped arrange a meeting between government IT professionals and Lynn after the talk. Klein said he was furious that Cisco had been unwilling to disclose the buffer-overflow vulnerability in unpatched routers. “I can see a class-action lawsuit against Cisco coming out of this,” Klein said.

ISS didn’t come out of this looking very good, either:

“A few years ago it was rumored that ISS would hold back on certain things because (they’re in the business of) providing solutions,” [Ali-Reza] Anghaie, [a senior security engineer with an aerospace firm, who was in the audience,] said. “But now you’ve got full public confirmation that they’ll submit to the will of a Cisco or Microsoft, and that’s not fair to their customers…. If they’re willing to back down and leave an employee … out to hang, well what are they going to do for customers?”

Despite their thuggish behavior, this has been a public-relations disaster for Cisco. Now it doesn’t matter what they say—we won’t believe them. We know that the public-relations department handles their security vulnerabilities, and not the engineering department. We know that they think squelching information and muzzling researchers is more important than informing the public. They could have shown that they put their customers first, but instead they demonstrated that short-sighted corporate interests are more important than being a responsible corporate citizen.

And these are the people building the hardware that runs much of our infrastructure? Somehow, I don’t feel very secure right now.

EDITED TO ADD: I am impressed with Lynn’s personal integrity in this matter:

When Mr. Lynn took the stage yesterday, he was introduced as speaking on a different topic, eliciting boos. But those turned to cheers when he asked, “Who wants to hear about Cisco?” As he got started, Mr. Lynn said, “What I just did means I’m about to get sued by Cisco and ISS. Not to put too fine a point on it, but bring it on.”

And this:

Lynn closed his talk by directing the audience to his resume and asking if anyone could give him a job.

“In large part I had to quit to give this presentation because ISS and Cisco would rather the world be at risk, I guess,” Lynn said. “They had to do what’s right for their shareholders; I understand that. But I figured I needed to do what’s right for the country and for the national critical infrastructure.”

There’s a lawsuit against him. I’ll let you know if there’s a legal defense fund.

EDITED TO ADD: The lawsuit has been settled. Some details:

Michael Lynn, a former ISS researcher, and the Black Hat organisers agreed to a permanent injunction barring them from further discussing the presentation Lynn gave on Wednesday. The presentation showed how attackers could take over Cisco routers, a problem that Lynn said could bring the Internet to its knees.

The injunction also requires Lynn to return any materials and disassembled code related to Cisco, according to a copy of the injunction, which was filed in US District Court for the District of Northern California. The injunction was agreed on by attorneys for Lynn, Black Hat, ISS and Cisco.

Lynn is also forbidden to make any further presentations at the Black Hat event, which ended on Thursday, or the following Defcon event. Additionally, Lynn and Black Hat have agreed never to disseminate a video made of Lynn’s presentation and to deliver to Cisco any video recording made of Lynn.

My hope is that Cisco realized that continuing with this would be a public-relations disaster.

EDITED TO ADD: Lynn’s BlackHat presentation is on line.

EDITED TO ADD: The FBI is getting involved.

EDITED TO ADD: The link to the presentation, above, has been replaced with a cease-and-desist letter. A copy of the presentation is now here.

Posted on July 29, 2005 at 4:35 AM
