Entries Tagged "benefit denial"


Cell Phone Kill Switches Mandatory in California

California passed a kill-switch law, meaning that all cell phones sold in California must have the capability to be remotely turned off. It was sold as an antitheft measure. If the phone company could remotely render a cell phone inoperative, there would be less incentive to steal one.

I worry more about the side effects: once the feature is in place, it can be used by all sorts of people for all sorts of reasons.

The law raises concerns about how the switch might be used or abused, because it also provides law enforcement with the authority to use the feature to kill phones. And any feature accessible to consumers and law enforcement could be accessible to hackers, who might use it to randomly kill phones for kicks or revenge, or to perpetrators of crimes who might—depending on how the kill switch is implemented—be able to use it to prevent someone from calling for help.

“It’s great for the consumer, but it invites a lot of mischief,” says Hanni Fakhoury, staff attorney for the Electronic Frontier Foundation, which opposes the law. “You can imagine a domestic violence situation or a stalking context where someone kills [a victim’s] phone and prevents them from calling the police or reporting abuse. It will not be a surprise when you see it being used this way.”

I wrote about this in 2008, more generally:

The possibilities are endless, and very dangerous. Making this work involves building a nearly flawless hierarchical system of authority. That’s a difficult security problem even in its simplest form. Distributing that system among a variety of different devices—computers, phones, PDAs, cameras, recorders—with different firmware and manufacturers, is even more difficult. Not to mention delegating different levels of authority to various agencies, enterprises, industries and individuals, and then enforcing the necessary safeguards.

Once we go down this path—giving one device authority over other devices—the security problems start piling up. Who has the authority to limit functionality of my devices, and how do they get that authority? What prevents them from abusing that power? Do I get the ability to override their limitations? In what circumstances, and how? Can they override my override?
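As a sketch of what even the simplest version of that hierarchy involves, here is a toy kill switch done carefully: the handset honors only kill commands signed by a key provisioned at setup, and a counter rejects replays. This is an illustration only, not how any real carrier or law-enforcement kill switch works; every name in it is hypothetical.

# Toy kill-switch verification on the handset. Hypothetical design,
# not any real carrier, OS, or law-enforcement implementation.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

class Handset:
    def __init__(self, authority_key: Ed25519PublicKey):
        self.authority_key = authority_key  # who may kill this phone?
        self.last_counter = 0               # rejects replayed commands
        self.alive = True

    def receive_kill(self, counter: int, signature: bytes) -> bool:
        message = b"KILL:%d" % counter
        try:
            self.authority_key.verify(signature, message)
        except InvalidSignature:
            return False                    # forged command: ignore
        if counter <= self.last_counter:
            return False                    # replayed command: ignore
        self.last_counter = counter
        self.alive = False                  # render the phone inoperative
        return True

# Only the holder of the matching private key can kill the phone.
owner_key = Ed25519PrivateKey.generate()
phone = Handset(owner_key.public_key())
phone.receive_kill(1, owner_key.sign(b"KILL:1"))
assert not phone.alive

Every question above maps onto a line of this toy: decide who holds the signing key, or add a second authorized key for law enforcement, and you have a different, and much harder, system.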

The law only affects California, but phone manufacturers won’t sell two different phones. So this means that all cell phones will eventually have this capability. And, of course, the procedural controls and limitations written into the California law don’t apply elsewhere.

EDITED TO ADD (9/12): Users can opt out, at least for now: “The bill would authorize an authorized user to affirmatively elect to disable or opt-out of the technological solution at any time.”

How the bill can be used to disrupt protests.

Posted on August 29, 2014 at 12:31 PM

Snowden’s Dead Man’s Switch

Edward Snowden has set up a dead man’s switch. He’s distributed encrypted copies of his document trove to various people, and has set up some sort of automatic system to distribute the key, should something happen to him.

Dead man’s switches have a long history, both for safety (the machinery automatically stops if the operator’s hand goes slack) and security reasons. WikiLeaks did the same thing with the State Department cables.
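The mechanism itself is simple. Here is a toy sketch (an illustration only; the design of Snowden’s actual system is not public): the owner checks in periodically, and a party polling the switch releases the decryption key once a check-in is missed.

# Toy dead man's switch: release a key if the owner stops checking in.
# Illustrative only; Snowden's real arrangement is not publicly known.
import time
from typing import Optional

class DeadMansSwitch:
    def __init__(self, key: bytes, interval_seconds: float):
        self.key = key                    # decrypts the distributed trove
        self.interval = interval_seconds
        self.last_checkin = time.time()

    def check_in(self) -> None:
        """Owner proves they are alive and free; resets the timer."""
        self.last_checkin = time.time()

    def poll(self) -> Optional[bytes]:
        """Run periodically; returns the key once the deadline passes."""
        if time.time() - self.last_checkin > self.interval:
            return self.key               # release to the document holders
        return None

Note that anything that stops the check-ins triggers release, a property that matters for the analysis below.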

“It’s not just a matter of, if he dies, things get released, it’s more nuanced than that,” Glenn Greenwald said. “It’s really just a way to protect himself against extremely rogue behavior on the part of the United States, by which I mean violent actions toward him, designed to end his life, and it’s just a way to ensure that nobody feels incentivized to do that.”

I’m not sure he’s thought this through, though. I would be more worried that someone would kill me in order to get the documents released than I would be that someone would kill me to prevent the documents from being released. Any real-world situation involves multiple adversaries, and it’s important to keep all of them in mind when designing a security system.

Posted on July 18, 2013 at 8:37 AM

Preventing Cell Phone Theft through Benefit Denial

Adding a remote kill switch to cell phones would deter theft.

Here we can see how the rise of the surveillance state permeates everything about computer security. On the face of it, this is a good idea. Assuming it works—that 1) it’s not possible for thieves to resurrect phones in order to resell them, and 2) it’s not possible to turn this system into a denial-of-service attack tool—it would deter crime. The general category of security is “benefit denial,” like ink tags attached to garments in retail stores and car radios that no longer function if removed. But given what we now know, do we trust that the government wouldn’t abuse this system and kill phones for other reasons? Do we trust that media companies won’t kill phones they decide are sharing copyrighted materials? Do we trust that phone companies won’t kill the phones of delinquent customers? What might have been a straightforward security system becomes a dangerous tool of control when you don’t trust those in power.

Posted on June 28, 2013 at 1:37 PM

The Exclusionary Rule and Security

Earlier this month, the Supreme Court ruled that evidence gathered as a result of errors in a police database is admissible in court. Its narrow decision is wrong, and will only ensure that police databases remain error-filled in the future.

The specifics of the case are simple. A computer database said there was a felony arrest warrant pending for Bennie Herring when there actually wasn’t. When the police came to arrest him, they searched his home and found illegal drugs and a gun. The Supreme Court was asked to rule whether the police had the right to arrest him for possessing those items, even though there was no legal basis for the search and arrest in the first place.

What’s at issue here is the exclusionary rule, which basically says that unconstitutionally or illegally collected evidence is inadmissible in court. It might seem like a technicality, but excluding what is called “the fruit of the poisonous tree” is a security system designed to protect us all from police abuse.

We have a number of rules limiting what the police can do: rules governing arrest, search, interrogation, detention, prosecution, and so on. And one of the ways we ensure that the police follow these rules is by forbidding the police to receive any benefit from breaking them. In fact, we design the system so that the police actually harm their own interests by breaking them, because all evidence that stems from breaking the rules is inadmissible.

And that’s what the exclusionary rule does. If the police search your home without a warrant and find drugs, they can’t arrest you for possession. Since the police have better things to do than waste their time, they have an incentive to get a warrant.

The Herring case is more complicated, because the police thought they did have a warrant. The error was not a police error, but a database error. And, in fact, Chief Justice Roberts wrote for the majority: “The exclusionary rule serves to deter deliberate, reckless, or grossly negligent conduct, or in some circumstances recurring or systemic negligence. The error in this case does not rise to that level.”

Unfortunately, Roberts is wrong. Government databases are filled with errors. People often can’t see data about themselves, and have no way to correct the errors if they do learn of any. And more and more databases are trying to exempt themselves from the Privacy Act of 1974, and specifically the provisions that require data accuracy. The legal argument for excluding this evidence was best made by an amicus curiae brief filed by the Electronic Privacy Information Center, but in short, the court should exclude the evidence because it’s the only way to ensure police database accuracy.

We are protected from becoming a police state by limits on police power and authority. This is not a trade-off we make lightly: we deliberately hamper law enforcement’s ability to do its job because we recognize that these limits make us safer. Without the exclusionary rule, your only remedy against an illegal search is to bring legal action against the police—and that can be very difficult. We, the people, would rather have you go free than motivate the police to ignore the rules that limit their power.

By not applying the exclusionary rule in the Herring case, the Supreme Court missed an important opportunity to motivate the police to purge errors from their databases. Constitutional lawyers have written many articles about this ruling, but the most interesting idea comes from George Washington University professor Daniel J. Solove, who proposes this compromise: “If a particular database has reasonable protections and deterrents against errors, then the Fourth Amendment exclusionary rule should not apply. If not, then the exclusionary rule should apply. Such a rule would create an incentive for law enforcement officials to maintain accurate databases, to avoid all errors, and would ensure that there would be a penalty or consequence for errors.”

Increasingly, we are being judged by the trail of data we leave behind us. Increasingly, data accuracy is vital to our personal safety and security. And if errors made by police databases aren’t held to the same legal standard as errors made by policemen, then more and more innocent Americans will find themselves the victims of incorrect data.

This essay originally appeared on the Wall Street Journal website.

EDITED TO ADD (2/1): More on the assault on the exclusionary rule.

EDITED TO ADD (2/9): Here’s another recent court case involving the exclusionary rule, and a thoughtful analysis by Orin Kerr.

Posted on January 28, 2009 at 7:12 AM

Screaming Cell Phones

Cell phone security:

Does it pay to scream if your cell phone is stolen? Synchronica, a mobile device management company, thinks so. If you use the company’s Mobile Manager service and your handset is stolen, the company, once contacted, will remotely lock down your phone, erase all its data and trigger it to emit a blood-curdling scream to scare the bejesus out of the thief.

The general category of this sort of security countermeasure is “benefit denial.” It’s like those dye tags on expensive clothing; if you shoplift the clothing and try to remove the tag, dye spills all over the clothes and makes them unwearable. The effectiveness of this kind of thing relies on the thief knowing that the security measure is there, or is reasonably likely to be there. It’s an effective shoplifting deterrent; my guess is that it will be less effective against cell phone thieves.

Remotely erasing data on stolen cell phones is a good idea regardless, though. And since cell phones are far more often lost than stolen, how about the phone calmly announcing that it is lost and it would like to be returned to its owner?

Posted on September 21, 2006 at 12:12 PM

Getting a Personal Unlock Code for Your O2 Cell Phone

O2 is a UK cell phone network. The company gives you the option of setting up a PIN on your phone. The idea is that if someone steals your phone, they can’t make calls. If they type the PIN incorrectly three times, the phone is blocked. To deal with the problems of phone owners mistyping their PIN—or forgetting it—they can contact O2 and get a Personal Unlock Code (PUK). Presumably, the operator goes through some authentication steps to ensure that the person calling is actually the legitimate owner of the phone.
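The handset-side logic is a small state machine. Here is a sketch based on the description above (real SIMs also cap PUK attempts, typically at ten, before permanently disabling the card; that detail is omitted here):

# Sketch of the SIM PIN/PUK logic described above; simplified, and not
# O2's actual implementation.
class Sim:
    MAX_PIN_TRIES = 3

    def __init__(self, pin: str, puk: str):
        self._pin = pin
        self._puk = puk          # known to the operator, issued on request
        self.tries_left = self.MAX_PIN_TRIES
        self.blocked = False

    def enter_pin(self, attempt: str) -> bool:
        if self.blocked:
            return False
        if attempt == self._pin:
            self.tries_left = self.MAX_PIN_TRIES
            return True
        self.tries_left -= 1
        self.blocked = self.tries_left == 0   # third failure blocks the SIM
        return False

    def unblock_with_puk(self, puk_attempt: str, new_pin: str) -> bool:
        if puk_attempt != self._puk:
            return False
        self._pin = new_pin                   # PUK resets the PIN...
        self.tries_left = self.MAX_PIN_TRIES  # ...and unblocks the phone
        self.blocked = False
        return True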

So far, so good.

But O2 has decided to automate the PUK process. Now anyone on the Internet can visit this website, type in a valid mobile telephone number, and get a valid PUK to reset the PIN—without any authentication whatsoever.

Oops.

EDITED TO ADD (7/4): A representative from O2 sent me the following:

“Yes, it does seem there is a security risk by O2 supplying such a service, but in fact we believe this risk is very small. The risk is when a customer’s phone is lost or stolen. There are two scenarios in that event:

“Scenario 1 – The phone is powered off. A PIN number would be required at next power on. Although the PUK code will indeed allow you to reset the PIN, you need to know the telephone number of the SIM in order to get it – there is no way to determine the telephone number from the SIM or handset itself. Should the telephone number be known the risk is then the same as scenario 2.

“Scenario 2 – The phone remains powered on: Here, the thief can use the phone in any case without having to acquire PUK.

“In both scenarios we have taken the view that the principal security measure is for the customer to report the loss/theft as quickly as possible, so that we can remotely disable both the SIM and also the handset (so that it cannot be used with any other SIM).”

Posted on July 3, 2006 at 2:26 PM

Microsoft Windows Kill Switch

Does Microsoft have the ability to disable Windows remotely? Maybe:

Two weeks ago, I wrote about my serious objections to Microsoft’s latest salvo in the war against unauthorized copies of Windows. Two Windows Genuine Advantage components are being pushed onto users’ machines with insufficient notification and inadequate quality control, and the result is a big mess. (For details, see Microsoft presses the Stupid button.)

Guess what? WGA might be on the verge of getting even messier. In fact, one report claims WGA is about to become a Windows “kill switch,” and when I asked Microsoft for an on-the-record response, they refused to deny it.

And this, supposedly from someone at Microsoft Support:

He told me that “in the fall, having the latest WGA will become mandatory and if it’s not installed, Windows will give a 30 day warning and when the 30 days is up and WGA isn’t installed, Windows will stop working, so you might as well install WGA now.”

The stupidity of this idea is amazing. Not just the inevitability of false positives, but the potential for a hacker to co-opt the controls. I hope this rumor ends up not being true.

Although if they actually do it, the backlash could do more for non-Windows OSs than anything those OSs could do for themselves.

Posted on June 30, 2006 at 11:51 AM

Risks of Losing Portable Devices

Last July I blogged about the risks of storing ever-larger amounts of data in ever-smaller devices.

Last week I wrote my tenth Wired.com column on the topic:

The point is that it’s now amazingly easy to lose an enormous amount of information. Twenty years ago, someone could break into my office and copy every customer file, every piece of correspondence, everything about my professional life. Today, all he has to do is steal my computer. Or my portable backup drive. Or my small stack of DVD backups. Furthermore, he could sneak into my office and copy all this data, and I’d never know it.

This problem isn’t going away anytime soon.

There are two solutions that make sense. The first is to protect the data. Hard-disk encryption programs like PGP Disk allow you to encrypt individual files, folders or entire disk partitions. Several manufacturers market USB thumb drives with built-in encryption. Some PDA manufacturers are starting to add password protection—not as good as encryption, but at least it’s something—to their devices, and there are some aftermarket PDA encryption programs.
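As an illustration of this first solution, here is a sketch of passphrase-based file encryption. It uses the Python cryptography package rather than any of the products named above, so treat it as a demonstration of the idea, not of those tools:

# Sketch: encrypt a file under a passphrase. Uses the Python
# "cryptography" package for illustration, not the tools named above.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def encrypt_file(path: str, passphrase: str) -> None:
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    key = base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))
    with open(path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    with open(path + ".enc", "wb") as f:
        f.write(salt + ciphertext)  # keep the salt; it re-derives the key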

The second solution is to remotely delete the data if the device is lost. This is still a new idea, but I believe it will gain traction in the corporate market. If you give an employee a BlackBerry for business use, you want to be able to wipe the device’s memory if he loses it. And since the device is online all the time, it’s a pretty easy feature to add.
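Here is a sketch of how such a remote-wipe feature can work on an always-connected device. The server, endpoint, and protocol are hypothetical, not BlackBerry’s actual mechanism:

# Toy remote-wipe check-in loop. The server URL and JSON protocol are
# hypothetical, not any real device-management API.
import json
import time
import urllib.request

DEVICE_ID = "handheld-1234"                 # hypothetical identifier
SERVER = "https://mdm.example.com/status"   # placeholder endpoint

def wipe() -> None:
    print("Erasing local data...")          # stand-in for a secure erase

def checkin_loop() -> None:
    while True:
        with urllib.request.urlopen(f"{SERVER}?id={DEVICE_ID}") as resp:
            status = json.load(resp)
        if status.get("wipe"):              # administrator flagged it lost
            wipe()
            return
        time.sleep(300)                     # check in every five minutes

The security of the scheme then reduces to authenticating the wipe order, the same problem the kill switches above raise.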

But until these two solutions become ubiquitous, the best option is to pay attention and erase data. Delete old e-mails from your BlackBerry, SMSs from your cell phone and old data from your address books—regularly. Find that call log and purge it once in a while. Don’t store everything on your laptop, only the files you might actually need.

EDITED TO ADD (2/2): A Dutch army officer lost a memory stick with details of an Afghan mission.

Posted on February 1, 2006 at 10:32 AM
