Entries Tagged "laws"


Actors Playing New York City Policemen

Did you know you could be arrested for carrying a police uniform in New York City?

With security tighter in the Big Apple since Sept. 11, 2001, the union that represents TV and film actors has begun advising its New York-area members to stop buying police costumes or carrying them to gigs, even if their performances require them.

The Screen Actors Guild said in a statement posted on its Web site on Friday that “an apparent shift in city policy” may put actors at risk of arrest if they are stopped while carrying anything that looks too much like a real police uniform.

The odds that an actor might be stopped and questioned on his or her way to work went up this month when police began conducting random searches of passengers’ bags in New York’s subway system. The guild said two of its members had been detained by security personnel at an airport and a courthouse in recent months for possessing police costumes.

This seems like overkill to me. I understand that a police uniform is an authentication device—not a very good one, but one nonetheless—and we want to make it harder for the bad guys to get one. But there’s no reason to prohibit screen or stage actors from having police uniforms if it’s part of their job. This seems similar to the laws surrounding lockpicks: you can be arrested for carrying them without a good reason, but locksmiths are allowed to own the tools of their trade.

Here’s another bit from the article:

Under police department rules, real officers must be on hand any time an actor dons a police costume during a TV or film production.

I guess that’s to prevent the actor from actually impersonating a policeman. But how often does that actually happen? Is this a good use of police manpower?

Does anyone know how other cities and countries handle this?

Posted on August 25, 2005 at 12:52 PM

Secure Flight News

According to Wired News, the DHS is looking for someone in Congress to sponsor a bill that eliminates congressional oversight over the Secure Flight program.

The bill would allow them to go ahead with the program regardless of GAO’s assessment. (Current law requires them to meet ten criteria set by Congress; the most recent GAO report said that they did not meet nine of them.) The bill would allow them to use commercial data even though they have not demonstrated its effectiveness. (The DHS funding bill passed by both the House and the Senate prohibits them from using commercial data during passenger screening, because there are absolutely no test results showing that it is effective.)

In this new bill, all that would be required to go ahead with Secure Flight would be for Secretary Chertoff to say so:

Additionally, the proposed changes would permit Secure Flight to be rolled out to the nation’s airports after Homeland Security chief Michael Chertoff certifies the program will be effective and not overly invasive. The current bill requires independent congressional investigators to make that determination.

Looks like the DHS, being unable to comply with the law, is trying to change it. This is a rogue program that needs to be stopped.

In other news, the TSA has deleted about three million personal records it used for Secure Flight testing. This seems like a good idea, but it prevents people from knowing what data the government had on them—in violation of the Privacy Act.

Civil liberties activist Bill Scannell says it’s difficult to know whether TSA’s decision to destroy records so swiftly is a housecleaning effort or something else.

“Is the TSA just such an incredibly efficient organization that they’re getting rid of things that are no longer needed?” Scannell said. “Or is this a matter of the destruction of evidence?”

Scannell says it’s a fair question to ask in light of revelations that the TSA already violated the Privacy Act last year when it failed to fully disclose the scope of its testing for Secure Flight and its collection of commercial data on individuals.

My previous essay on Secure Flight is here.

Posted on August 15, 2005 at 9:43 AM

E-Mail Interception Decision Reversed

Is e-mail in transit communications or data in storage? Seems like a basic question, but the answer matters a lot to the police. A U.S. federal appeals court has ruled that the interception of e-mail in temporary storage violates the federal wiretap act, reversing an earlier court opinion.

The case and associated privacy issues are summarized here. Basically, different privacy laws protect electronic communications in transit and data in storage; the former is protected much more than the latter. E-mail stored by the sender or the recipient is obviously data in storage. But what about e-mail on its way from the sender to the receiver? On the one hand, it’s obviously communications in transit. But the other side argued that it’s actually stored on various computers as it wends its way through the Internet; hence it’s data in storage.

The initial court decision in this case held that e-mail in transit is just data in storage. Judge Lipez wrote an inspired dissent in the original opinion. In the rehearing en banc (more judges), he wrote the opinion for the majority which overturned the earlier opinion.

The opinion itself is long, but well worth reading. It’s well reasoned, and reflects extraordinary understanding and attention to detail. And a great last line:

If the issue presented be “garden-variety”… this is a garden in need of a weed killer.

I participated in an Amicus Curiae (“friend of the court”) brief in the case. Here’s another amicus brief by six civil liberties organizations.

There’s a larger issue here, and it’s the same one that the entertainment industry used to greatly expand copyright law in cyberspace. They argued that every time a copyrighted work is moved from computer to computer, or CD-ROM to RAM, or server to client, or disk drive to video card, a “copy” is being made. This ridiculous definition of “copy” has allowed them to exert far greater legal control over how people use copyrighted works.

Posted on August 15, 2005 at 7:59 AM

The MD5 Defense

This is interesting:

A team of Chinese maths enthusiasts have thrown NSW’s speed cameras system into disarray by cracking the technology used to store data about errant motorists.

The NRMA has called for a full audit of the way the state’s 110 enforcement cameras are used after a motorist escaped a conviction by claiming that data was vulnerable to hackers.

A Sydney magistrate, Laurence Lawson, threw out the case because the Roads and Traffic Authority failed to find an expert to testify that its speed camera images were secure.

The motorist’s defence lawyer, Denis Mirabilis, argued successfully that an algorithm known as MD5, which is used to store the time, date, place, numberplate and speed of cars caught on camera, was a discredited piece of technology.

It’s true that MD5 is broken. On the other hand, it’s almost certainly true that the speed cameras were correct. If there’s any lesson here, it’s that theoretical security is important in legal proceedings.

I think that’s a good thing.
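To make the technical point concrete: MD5 is still perfectly good at detecting accidental or casual tampering; what’s broken is collision resistance—an attacker can construct two different inputs with the same digest. Here is a minimal sketch of how a camera record might be fingerprinted and verified (the record format and field names are hypothetical, not the RTA’s actual scheme):

```python
import hashlib

def fingerprint(record: str) -> str:
    """Return the hex MD5 digest of a speed-camera record string."""
    return hashlib.md5(record.encode("utf-8")).hexdigest()

# Hypothetical record: time, date, place, numberplate, speed.
record = "12:52|2005-08-11|Pacific Hwy|ABC-123|87km/h"
digest = fingerprint(record)

# Integrity check: recompute and compare. Any change to the record
# changes the digest, so casual after-the-fact edits are detectable.
assert fingerprint(record) == digest
assert fingerprint(record.replace("87km/h", "97km/h")) != digest
```

The legal argument turned not on whether such a check would catch an edit—it would—but on whether a discredited algorithm can be presented in court as proof of integrity.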

Posted on August 11, 2005 at 7:52 AM

U.S. Crypto Export Controls

Rules on exporting cryptography outside the United States have been renewed:

President Bush this week declared a national emergency based on an “extraordinary threat to the national security.”

This might sound like a code-red, call-out-the-national-guard, we-lost-a-suitcase-nuke type of alarm, but in reality it’s just a bureaucratic way of ensuring that the Feds can continue to control the export of things like computer hardware and encryption products.

And it happens every year or so.

If Bush didn’t sign that “national emergency” paperwork, then the Commerce Department’s Bureau of Industry and Security would lose some of its regulatory power. That’s because Congress never extended the Export Administration Act after it lapsed (it’s complicated).

President Clinton did the same thing. Here’s a longer version of his “national emergency” executive order from 1994.

As a side note, encryption export rules have been dramatically relaxed since the oppressive early days of Janet “Evil PCs” Reno, Al “Clipper Chip” Gore, and Louis “ban crypto” Freeh. But they still exist. Here’s a summary.

To be honest, I don’t know what the rules are these days. I think there is a blanket exemption for mass-market software products, but I’m not sure. I haven’t a clue what the hardware requirements are. But certainly something is working right; we’re seeing more strong encryption in more software—and not just encryption software.

Posted on August 5, 2005 at 7:17 AM

Dog Poop Girl

Here’s the basic story: A woman and her dog are riding the Seoul subway. The dog poops on the floor. The woman refuses to clean it up, despite being told to by other passengers. Someone takes a picture of her, posts it on the Internet, and she is publicly shamed—and the story will live on the Internet forever. Then, the blogosphere debates the notion of the Internet as a social enforcement tool.

The Internet is changing our notions of personal privacy, and how the public enforces social norms.

Daniel Solove writes:

The dog-shit-girl case involves a norm that most people would seemingly agree to—clean up after your dog. Who could argue with that one? But what about when norm enforcement becomes too extreme? Most norm enforcement involves angry scowls or just telling a person off. But having a permanent record of one’s norm violations is upping the sanction to a whole new level. The blogosphere can be a very powerful norm-enforcing tool, allowing bloggers to act as a cyber-posse, tracking down norm violators and branding them with digital scarlet letters.

And that is why the law might be necessary—to modulate the harmful effects when the norm enforcement system gets out of whack. In the United States, privacy law is often the legal tool called in to address the situation. Suppose the dog poop incident occurred in the United States. Should the woman have legal redress under the privacy torts?

If this incident is any guide, then anyone acting outside the accepted norms of whatever segment of humanity surrounds him had better tread lightly. The question we need to answer is: is this the sort of society we want to live in? And if not, what technological or legal controls do we need to put in place to ensure that we don’t?

Solove again:

I believe that, as complicated as it might be, the law must play a role here. The stakes are too important. While entering law into the picture could indeed stifle freedom of discussion on the Internet, allowing excessive norm enforcement can be stifling to freedom as well.

All the more reason why we need to rethink old notions of privacy. Under existing notions, privacy is often thought of in a binary way: something either is private or public. According to the general rule, if something occurs in a public place, it is not private. But a more nuanced view of privacy would suggest that this case involved taking an event that occurred in one context and significantly altering its nature by making it permanent and widespread. The dog-shit-girl would have been just a vague image in a few people’s memory if it hadn’t been for the photo entering cyberspace and spreading around faster than an epidemic. Despite the fact that the event occurred in public, there was no need for her image and identity to be spread across the Internet.

Could the law provide redress? This is a complicated question; certainly under existing doctrine, making a case would have many hurdles. And some will point to practical problems. Bloggers often don’t have deep pockets. But perhaps the possibility of lawsuits might help shape the norms of the Internet. In the end, I strongly doubt that the law alone can address this problem; but its greatest contribution might be to help along the development of blogging norms that will hopefully prevent more cases such as this one from having crappy endings.

Posted on July 29, 2005 at 4:21 PM

Cisco Harasses Security Researcher

I’ve written about full disclosure, and how disclosing security vulnerabilities is our best mechanism for improving security—especially in a free-market system. (That essay is also worth reading for a general discussion of the security trade-offs.) I’ve also written about how security companies treat vulnerabilities as public-relations problems first and technical problems second. This week at BlackHat, security researcher Michael Lynn and Cisco demonstrated both points.

Lynn was going to present security flaws in Cisco’s IOS, and Cisco went to inordinate lengths to make sure that information never got into the hands of their consumers, the press, or the public.

Cisco threatened legal action to stop the conference’s organizers from allowing a 24-year-old researcher for a rival tech firm to discuss how he says hackers could seize control of Cisco’s Internet routers, which dominate the market. Cisco also instructed workers to tear 20 pages outlining the presentation from the conference program and ordered 2,000 CDs containing the presentation destroyed.

In the end, the researcher, Michael Lynn, went ahead with a presentation, describing flaws in Cisco’s software that he said could allow hackers to take over corporate and government networks and the Internet, intercepting and misdirecting data communications. Mr. Lynn, wearing a white hat emblazoned with the word “Good,” spoke after quitting his job at Internet Security Systems Inc. Wednesday. Mr. Lynn said he resigned because ISS executives had insisted he strike key portions of his presentation.

Not being able to censor the information, Cisco decided to act as if it were no big deal:

In a release shortly after the presentation, Cisco stated, “It is important to note that the information Lynn presented was not a disclosure of a new vulnerability or a flaw with Cisco IOS software. Lynn’s research explores possible ways to expand exploitations of known security vulnerabilities impacting routers.” And went on to state “Cisco believes that the information Lynn presented at the Blackhat conference today contained proprietary information and was illegally obtained.” The statement also refers to the fact that Lynn stated in his presentation that he used a popular file decompressor to ‘unzip’ the Cisco image before reverse engineering it and finding the flaw, which is against Cisco’s use agreement.

The Cisco propaganda machine is certainly working overtime this week.

The security implications of this are enormous. If companies have the power to censor information about their products they don’t like, then we as consumers have less information with which to make intelligent buying decisions. If companies have the power to squelch vulnerability information about their products, then there’s no incentive for them to improve security. (I’ve written about this in connection to physical keys and locks.) If free speech is subordinate to corporate demands, then we are all much less safe.

Full disclosure is good for society. But because it helps the bad guys as well as the good guys (see my essay on secrecy and security for more discussion of the balance), many of us have championed “responsible disclosure” guidelines that give vendors a head start in fixing vulnerabilities before they’re announced.

The problem is that not all researchers follow these guidelines. And laws limiting free speech do more harm to society than good. (In any case, laws won’t completely fix the problem; we can’t get laws passed in every country where security researchers live.) So the only reasonable course of action for a company is to work with researchers who alert it to vulnerabilities, but also to assume that vulnerability information will sometimes be released without prior warning.

I can’t imagine the discussions inside Cisco that led them to act like thugs. I can’t figure out why they decided to attack Michael Lynn, BlackHat, and ISS rather than turn the situation into a public-relations success. I can’t believe that they thought they could have censored the information by their actions, or even that it was a good idea.

Cisco’s customers want information. They don’t expect perfection, but they want to know the extent of problems and what Cisco is doing about them. They don’t want to know that Cisco tries to stifle the truth:

Joseph Klein, senior security analyst at the aerospace electronic systems division for Honeywell Technology Solutions, said he helped arrange a meeting between government IT professionals and Lynn after the talk. Klein said he was furious that Cisco had been unwilling to disclose the buffer-overflow vulnerability in unpatched routers. “I can see a class-action lawsuit against Cisco coming out of this,” Klein said.

ISS didn’t come out of this looking very good, either:

“A few years ago it was rumored that ISS would hold back on certain things because (they’re in the business of) providing solutions,” [Ali-Reza] Anghaie, [a senior security engineer with an aerospace firm, who was in the audience,] said. “But now you’ve got full public confirmation that they’ll submit to the will of a Cisco or Microsoft, and that’s not fair to their customers…. If they’re willing to back down and leave an employee … out to hang, well what are they going to do for customers?”

Despite their thuggish behavior, this has been a public-relations disaster for Cisco. Now it doesn’t matter what they say—we won’t believe them. We know that the public-relations department handles their security vulnerabilities, and not the engineering department. We know that they think squelching information and muzzling researchers is more important than informing the public. They could have shown that they put their customers first, but instead they demonstrated that short-sighted corporate interests are more important than being a responsible corporate citizen.

And these are the people building the hardware that runs much of our infrastructure? Somehow, I don’t feel very secure right now.

EDITED TO ADD: I am impressed with Lynn’s personal integrity in this matter:

When Mr. Lynn took the stage yesterday, he was introduced as speaking on a different topic, eliciting boos. But those turned to cheers when he asked, “Who wants to hear about Cisco?” As he got started, Mr. Lynn said, “What I just did means I’m about to get sued by Cisco and ISS. Not to put too fine a point on it, but bring it on.”

And this:

Lynn closed his talk by directing the audience to his resume and asking if anyone could give him a job.

“In large part I had to quit to give this presentation because ISS and Cisco would rather the world be at risk, I guess,” Lynn said. “They had to do what’s right for their shareholders; I understand that. But I figured I needed to do what’s right for the country and for the national critical infrastructure.”

There’s a lawsuit against him. I’ll let you know if there’s a legal defense fund.

EDITED TO ADD: The lawsuit has been settled. Some details:

Michael Lynn, a former ISS researcher, and the Black Hat organisers agreed to a permanent injunction barring them from further discussing the presentation Lynn gave on Wednesday. The presentation showed how attackers could take over Cisco routers, a problem that Lynn said could bring the Internet to its knees.

The injunction also requires Lynn to return any materials and disassembled code related to Cisco, according to a copy of the injunction, which was filed in US District Court for the District of Northern California. The injunction was agreed on by attorneys for Lynn, Black Hat, ISS and Cisco.

Lynn is also forbidden to make any further presentations at the Black Hat event, which ended on Thursday, or the following Defcon event. Additionally, Lynn and Black Hat have agreed never to disseminate a video made of Lynn’s presentation and to deliver to Cisco any video recording made of Lynn.

My hope is that Cisco realized that continuing with this would be a public-relations disaster.

EDITED TO ADD: Lynn’s BlackHat presentation is on line.

EDITED TO ADD: The FBI is getting involved.

EDITED TO ADD: The link to the presentation, above, has been replaced with a cease-and-desist letter. A copy of the presentation is now here.

Posted on July 29, 2005 at 4:35 AM

Automatic Surveillance Via Cell Phone

Your cell phone company knows where you are all the time. (Well, it knows where your phone is whenever it’s on.) Turns out there’s a lot of information to be mined in that data.

Eagle’s Reality Mining project logged 350,000 hours of data over nine months about the location, proximity, activity and communication of volunteers, and was quickly able to guess whether two people were friends or just co-workers….

He and his team were able to create detailed views of life at the Media Lab, by observing how late people stayed at the lab, when they called one another and how much sleep students got.

Given enough data, Eagle’s algorithms were able to predict what people—especially professors and Media Lab employees—would do next and be right up to 85 percent of the time.
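The kind of prediction described above doesn’t require anything exotic. A toy first-order Markov model—which is only an illustration of the idea, not Eagle’s actual algorithms or data—already makes a person with a routine quite predictable:

```python
from collections import Counter, defaultdict

def train(trace):
    """Count observed transitions between successive locations."""
    model = defaultdict(Counter)
    for here, nxt in zip(trace, trace[1:]):
        model[here][nxt] += 1
    return model

def predict(model, here):
    """Predict the most frequently observed next location, or None."""
    counts = model.get(here)
    return counts.most_common(1)[0][0] if counts else None

# Hypothetical location trace of one volunteer's day, repeated for a week.
day = ["home", "cafe", "lab", "cafe", "lab", "home"]
model = train(day * 7)

print(predict(model, "home"))  # routine makes the next stop guessable
```

Anyone holding the location log—carrier, marketer, or government—can run something like this; the 85 percent figure just reflects how routinized most lives are.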

This is worrisome from a number of angles: government surveillance, corporate surveillance for marketing purposes, criminal surveillance. I am not mollified by this comment:

People should not be too concerned about the data trails left by their phone, according to Chris Hoofnagle, associate director of the Electronic Privacy Information Center.

“The location data and billing records is protected by statute, and carriers are under a duty of confidentiality to protect it,” Hoofnagle said.

We’re building an infrastructure of surveillance as a side effect of the convenience of carrying our cell phones everywhere.

Posted on July 28, 2005 at 4:09 PM

UK Police and Encryption

From The Guardian:

Police last night told Tony Blair that they need sweeping new powers to counter the terrorist threat, including the right to detain a suspect for up to three months without charge instead of the current 14 days….

They also want to make it a criminal offence for suspects to refuse to cooperate in giving the police full access to computer files by refusing to disclose their encryption keys.

On Channel 4 News today, Sir Ian Blair was asked why the police wanted to extend the time they could hold someone without charges from 14 days to 3 months. Part of his answer was that they sometimes needed to access encrypted computer files and 14 days was not enough time for them to break the encryption.

There’s something fishy going on here.

It’s certainly possible that password-guessing programs are more successful with three months to guess. But the Regulation of Investigatory Powers (RIP) Act, which went into effect in 2000, already allows the police to jail people who don’t surrender encryption keys:

If intercepted communications are encrypted (encoded and made secret), the act will force the individual to surrender the keys (pin numbers which allow users to decipher encoded data), on pain of jail sentences of up to two years.
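Some back-of-envelope arithmetic shows why the extra detention time buys so little against encryption. At any assumed guessing rate (the 10⁹ guesses per second below is a hypothetical figure, not a claim about police capabilities), brute force either finishes in hours or takes millennia; the band of passwords that fall in three months but not fourteen days is vanishingly narrow:

```python
GUESSES_PER_SEC = 1e9  # assumed cracking rate; purely illustrative

def keyspace(alphabet_size: int, length: int) -> float:
    """Number of possible passwords of the given length."""
    return float(alphabet_size ** length)

def days_to_exhaust(space: float, rate: float = GUESSES_PER_SEC) -> float:
    """Worst-case days to try the entire keyspace at the given rate."""
    return space / rate / 86_400

# An 8-character lowercase password falls in minutes...
weak = days_to_exhaust(keyspace(26, 8))
# ...while a 12-character mixed-case-plus-digits password outlasts
# both 14 days and 3 months by many orders of magnitude.
strong = days_to_exhaust(keyspace(62, 12))

print(f"26^8:  {weak:.4f} days")
print(f"62^12: {strong:,.0f} days")
```

So extending detention from 14 to 90 days multiplies the searchable keyspace by a factor of about six—roughly the same as adding a single extra character to the password.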

Posted on July 27, 2005 at 3:00 PM

Domestic Terrorism (U.S.)

Nice MSNBC piece on domestic terrorism in the U.S.:

The sentencing of Eric Rudolph, who bombed abortion clinics, a gay bar and the Atlanta Olympics, ought to be a milestone in the Global War on Terror. In Birmingham, Ala., on Monday he got life without parole. Next month he’ll stack up a couple more life terms in Georgia, which is the least he deserves. (He escaped the death penalty only because he made a deal to help law-enforcement agents find the explosives he had hidden while on the run in North Carolina.) Rudolph killed two people, but not for want of trying to kill many more. In his 1997 attack on an Atlanta abortion clinic, he set off a second bomb meant to take out bystanders and rescue workers. Unrepentant, of course, Rudolph defended his actions as a moral imperative: “Abortion is murder, and because it is murder I believe deadly force is needed to stop it.” The Birmingham prosecutor declared that Rudolph had “appointed himself judge, jury and executioner.”

Indeed. That’s what all terrorists have in common: the four lunatics in London earlier this month; the 19 men who attacked America on September 11, 2001; Timothy McVeigh in Oklahoma City, and many others. They were all convinced they had noble motives for wreaking their violence. Terrorists are very righteous folks. Which is why the real global war we’re fighting, let’s be absolutely clear, should be one of our shared humanity against the madness of people like these; the rule of man-made laws on the books against the divine law they imagine for themselves. It’s the cause of reason against unreason, of self-criticism against the firm convictions of fanaticism.

David Neiwert has some good commentary on the topic. He also points to this U.S. News and World Report article.

Posted on July 25, 2005 at 9:04 PM

