Entries Tagged "data destruction"


Privacy Problems with AskEraser

Last week, Ask.com announced a feature called AskEraser (good description here), which erases a user’s search history. While it’s great to see companies using privacy features for competitive advantage, EPIC examined the feature and wrote to the company with some problems:

The first problem is that AskEraser uses an opt-out cookie. Cookies are small pieces of data left on a consumer’s computer, used to authenticate the user and to maintain information such as the user’s site preferences.

Usually, people concerned with privacy delete cookies, so creating an opt-out cookie is “counter-intuitive,” the letter states. Once the AskEraser opt-out cookie is deleted, the privacy setting is lost and the consumer’s search activity will be tracked. Why not have an opt-in cookie instead, the letter suggests.

The second problem is that Ask inserts the exact time that the user enables AskEraser and stores it in the cookie, which could make identifying the computer easier and make it easy for third-party tracking if the cookie were transferred to such parties. The letter recommends using a session cookie that expires once the search result is returned.
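The distinction EPIC is drawing is easy to see at the HTTP level. Here is a minimal sketch using Python’s standard http.cookies module; the cookie name and value format are my own invention, not Ask.com’s actual cookie. A persistent opt-out cookie carries an expiry date and, in Ask’s design, embeds the enable-time; a session cookie carries no expiry attribute at all, so the browser discards it when the session ends.

```python
from http.cookies import SimpleCookie

# Hypothetical cookie names and values, for illustration only.
# A persistent opt-out cookie survives until its expiry date and, in
# Ask's design, embeds the exact enable-time EPIC objected to.
persistent = SimpleCookie()
persistent["askeraser"] = "enabled-2007-12-21T11:18:00"
persistent["askeraser"]["expires"] = "Fri, 21 Dec 2012 11:18:00 GMT"

# A session cookie has no expiry attribute, so the browser discards it
# when the session ends; nothing long-lived is left on the machine.
session = SimpleCookie()
session["askeraser"] = "enabled"
```

Delete the persistent cookie and the privacy preference is gone with it; the session cookie never outlives the visit, which is exactly why the letter prefers it.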

Ask’s Frequently Asked Questions page for the feature notes that there may be circumstances when Ask is required to comply with a court order; if so, it will retain the consumer’s search data even while AskEraser appears to be turned on. The letter says Ask should notify consumers when the feature has been disabled, so that people are not misled into thinking their searches aren’t being tracked when they actually are.

Here’s a copy of the letter, signed by eight privacy organizations. Still no word from Ask.com.

While I have your attention, I want to talk about EPIC. This is exactly the sort of thing the Electronic Privacy Information Center does best. Whether it’s search engine privacy, electronic voting, ID cards, or databases and data mining, EPIC is always at the forefront of these sorts of privacy issues. It’s the end of the year, and lots of people are looking for causes worthy of donation. Here’s EPIC’s donation page; they — well, “we” really, as I’m on the board — can use the support.

Posted on December 21, 2007 at 11:18 AM

Mesa Airlines Destroys Evidence

How not to delete evidence. First, do something bad. Then, try to delete the data files that prove it. Finally, blame it on adult content.

Hawaiian alleged that Murnane — who was placed on a 90-day leave by Mesa’s board last week — deleted hundreds of pages of computer records that would have shown that Mesa misappropriated the Hawaiian information.

But Mesa says any deletion was not intentional and they have copies of the deleted files.

“He (Murnane) was cruising on adult Web sites,” said Mesa attorney Max Blecher in a court hearing yesterday. Murnane was just trying to delete the porn sites, he said.

EDITED TO ADD (11/6): In the aftermath, the CFO got fired and Mesa got hit with an $80 million judgment. Ouch.

Posted on October 9, 2007 at 2:02 PM

Teaching Computers How to Forget

I’ve written about the death of ephemeral conversation, the rise of wholesale surveillance, and the electronic audit trail that now follows us through life. Viktor Mayer-Schönberger, a professor in Harvard’s JFK School of Government, has noticed this too, and believes that computers need to forget.

Why would we want our machines to “forget”? Mayer-Schönberger suggests that we are creating a Benthamist panopticon by archiving so many bits of knowledge for so long. The accumulated weight of stored Google searches, thousands of family photographs, millions of books, credit bureau information, air travel reservations, massive government databases, archived e-mail, etc., can actually be a detriment to speech and action, he argues.

“If whatever we do can be held against us years later, if all our impulsive comments are preserved, they can easily be combined into a composite picture of ourselves,” he writes in the paper. “Afraid how our words and actions may be perceived years later and taken out of context, the lack of forgetting may prompt us to speak less freely and openly.”

In other words, it threatens to make us all politicians.

In contrast to omnibus data protection legislation, Mayer-Schönberger proposes a combination of law and software to ensure that most data is “forgotten” by default. A law would decree that “those who create software that collects and stores data build into their code not only the ability to forget with time, but make such forgetting the default.” Essentially, this means that all collected data is tagged with a new piece of metadata that defines when the information should expire.

In practice, this would mean that iTunes could only store buying data for a limited time, a time defined by law. Should customers explicitly want this time extended, that would be fine, but people must be given a choice. Even data created by users (digital pictures, for example) would be tagged by the cameras that create them to expire in a year or two; pictures that people want to keep could simply be given a date 10,000 years in the future.
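The proposal amounts to attaching an expiry timestamp to every record and making deletion the default. A minimal sketch, with field names and the two-year default invented for illustration and a Python dict standing in for a real data store:

```python
import time

# Toy sketch of expiry-tagged storage. The field names and the two-year
# default are invented for illustration; a dict stands in for the store.
DEFAULT_TTL = 2 * 365 * 24 * 3600      # "forget" after two years

def store(db, key, value, ttl=DEFAULT_TTL):
    db[key] = {"value": value, "expires_at": time.time() + ttl}

def lookup(db, key, now=None):
    now = time.time() if now is None else now
    record = db.get(key)
    if record is None or now >= record["expires_at"]:
        db.pop(key, None)              # forgetting is the default
        return None
    return record["value"]

db = {}
store(db, "purchase", "album-123")     # expires on the legal schedule
store(db, "keepsake", "family-photo",  # user explicitly opts to keep it
      ttl=10_000 * 365 * 24 * 3600)
```

The point of the design is that retention, not deletion, is the action that requires an explicit decision.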

Frank Pasquale also comments on the legal implications implicit in this issue. And Paul Ohm wrote a note titled “The Fourth Amendment Right to Delete”:

For years the police have entered homes and offices, hauled away filing cabinets full of records, and searched them back at the police station for evidence. In Fourth Amendment terms, these actions are entry, seizure, and search, respectively, and usually require the police to obtain a warrant. Modern-day police can avoid some of these messy steps with the help of technology: They have tools that duplicate stored records and collect evidence of behavior, all from a distance and without the need for physical entry. These tools generate huge amounts of data that may be searched immediately or stored indefinitely for later analysis. Meanwhile, it is unclear whether the Fourth Amendment’s restrictions apply to these technologies: Are the acts of duplication and collection themselves seizure? Before the data are analyzed, has a search occurred?

EDITED TO ADD (6/14): Interesting presentation earlier this year by Dr. Radia Perlman that represents some work toward this problem. And a counterpoint.

Posted on May 16, 2007 at 6:19 AM

A Security Market for Lemons

More than a year ago, I wrote about the increasing risks of data loss because more and more data fits in smaller and smaller packages. Today I use a 4-GB USB memory stick for backup while I am traveling. I like the convenience, but if I lose the tiny thing I risk all my data.

Encryption is the obvious solution for this problem — I use PGPdisk — but Secustick sounds even better: It automatically erases itself after a set number of bad password attempts. The company makes a bunch of other impressive claims: The product was commissioned, and eventually approved, by the French intelligence service; it is used by many militaries and banks; its technology is revolutionary.

Unfortunately, the only impressive aspect of Secustick is its hubris, which was revealed when Tweakers.net completely broke its security. There’s no data self-destruct feature. The password protection can easily be bypassed. The data isn’t even encrypted. As a secure storage device, Secustick is pretty useless.

On the surface, this is just another snake-oil security story. But there’s a deeper question: Why are there so many bad security products out there? It’s not just that designing good security is hard — although it is — and it’s not just that anyone can design a security product that he himself cannot break. Why do mediocre security products beat the good ones in the marketplace?

In 1970, American economist George Akerlof wrote a paper called “The Market for ‘Lemons’” (abstract and article for pay here), which established asymmetrical information theory. He eventually won a Nobel Prize for his work, which looks at markets where the seller knows a lot more about the product than the buyer.

Akerlof illustrated his ideas with a used car market. A used car market includes both good cars and lousy ones (lemons). The seller knows which is which, but the buyer can’t tell the difference — at least until he’s made his purchase. I’ll spare you the math, but what ends up happening is that the buyer bases his purchase price on the value of a used car of average quality.

This means that the best cars don’t get sold; their prices are too high. Which means that the owners of these best cars don’t put their cars on the market. And then this starts spiraling. The removal of the good cars from the market reduces the average price buyers are willing to pay, and then the very good cars no longer sell, and disappear from the market. And then the good cars, and so on until only the lemons are left.
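The spiral can be seen with a small numeric sketch (all values invented for illustration): buyers offer the average value of the cars still on the market, sellers of above-average cars withdraw, and the average ratchets down.

```python
# A numeric sketch of Akerlof's spiral; all values are invented for
# illustration. Buyers offer the average value of the cars on the
# market; any seller whose car is worth more than the offer withdraws.
cars = [1000, 2000, 3000, 4000, 5000]   # true values, lemon to gem

while True:
    offer = sum(cars) / len(cars)       # buyers pay for average quality
    remaining = [v for v in cars if v <= offer]
    if remaining == cars:               # no seller withdraws: stable
        break
    cars = remaining                    # above-average cars exit

# The market collapses until only the cheapest car (the lemon) is left.
```

Starting from five cars worth $1,000 to $5,000, the first offer of $3,000 drives out the two best cars, the next offer of $2,000 drives out the $3,000 car, and so on until only the $1,000 lemon remains.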

In a market where the seller has more information about the product than the buyer, bad products can drive the good ones out of the market.

The computer security market has many of the same characteristics as Akerlof’s lemons market. Take the market for encrypted USB memory sticks. Several companies make encrypted USB drives — Kingston Technology sent me one in the mail a few days ago — but even I couldn’t tell you whether Kingston’s offering is better than Secustick, or better than any of the other encrypted USB drives. They use the same encryption algorithms. They make the same security claims. And if I can’t tell the difference, most consumers won’t be able to either.

Of course, it’s more expensive to make an actually secure USB drive. Good security design takes time, and necessarily means limiting functionality. Good security testing takes even more time, especially if the product is any good. This means the less-secure product will be cheaper, sooner to market and have more features. In this market, the more-secure USB drive is going to lose out.

I see this kind of thing happening over and over in computer security. In the late 1980s and early 1990s, there were more than a hundred competing firewall products. The few that “won” weren’t the most secure firewalls; they were the ones that were easy to set up, easy to use and didn’t annoy users too much. Because buyers couldn’t base their buying decision on the relative security merits, they based them on these other criteria. The intrusion detection system, or IDS, market evolved the same way, and before that the antivirus market. The few products that succeeded weren’t the most secure, because buyers couldn’t tell the difference.

How do you solve this? You need what economists call a “signal,” a way for buyers to tell the difference. Warranties are a common signal. Alternatively, an independent auto mechanic can tell good cars from lemons, and a buyer can hire that expertise. The Secustick story demonstrates this. If there is a consumer advocate group with the expertise to evaluate different products, the lemons can be exposed.

Secustick, for one, seems to have been withdrawn from sale.

But security testing is both expensive and slow, and it just isn’t possible for an independent lab to test everything. Unfortunately, the exposure of Secustick is an exception. It was a simple product, and easily exposed once someone bothered to look. A complex software product — a firewall, an IDS — is very hard to test well. And, of course, by the time you have tested it, the vendor has a new version on the market.

In reality, we have to rely on a variety of mediocre signals to differentiate the good security products from the bad. Standardization is one signal. The widely used AES encryption standard has reduced, although not eliminated, the number of lousy encryption algorithms on the market. Reputation is a more common signal; we choose security products based on the reputation of the company selling them, the reputation of some security wizard associated with them, magazine reviews, recommendations from colleagues or general buzz in the media.

All these signals have their problems. Even product reviews, which should be as comprehensive as the Tweakers’ Secustick review, rarely are. Many firewall comparison reviews focus on things the reviewers can easily measure, like packets per second, rather than how secure the products are. In IDS comparisons, you can find the same bogus “number of signatures” comparison. Buyers lap that stuff up; in the absence of deep understanding, they happily accept shallow data.

With so many mediocre security products on the market, and the difficulty of coming up with a strong quality signal, vendors don’t have strong incentives to invest in developing good products. And the vendors that do tend to die a quiet and lonely death.

This essay originally appeared in Wired.

EDITED TO ADD (4/22): Slashdot thread.

Posted on April 19, 2007 at 7:59 AM

Screaming Cell Phones

Cell phone security:

Does it pay to scream if your cell phone is stolen? Synchronica, a mobile device management company, thinks so. If you use the company’s Mobile Manager service and your handset is stolen, the company, once contacted, will remotely lockdown your phone, erase all its data and trigger it to emit a blood-curdling scream to scare the bejesus out of the thief.

The general category of this sort of security countermeasure is “benefit denial.” It’s like those dye tags on expensive clothing; if you shoplift the clothing and try to remove the tag, dye spills all over the clothes and makes them unwearable. The effectiveness of this kind of thing relies on the thief knowing that the security measure is there, or is reasonably likely to be there. It’s an effective shoplifting deterrent; my guess is that it will be less effective against cell phone thieves.

Remotely erasing data on stolen cell phones is a good idea regardless, though. And since cell phones are far more often lost than stolen, how about the phone calmly announcing that it is lost and it would like to be returned to its owner?

Posted on September 21, 2006 at 12:12 PM

Media Sanitization and Encryption

Last week NIST released Special Publication 800-88, Guidelines for Media Sanitization.

There is a new paragraph in this document (page 7) that was not in the draft version:

Encryption is not a generally accepted means of sanitization. The increasing power of computers decreases the time needed to crack cipher text and therefore the inability to recover the encrypted data can not be assured.

I have to admit that this doesn’t make any sense to me. If the encryption is done properly, and if the key is properly chosen, then erasing the key — and all copies — is equivalent to erasing the files. And if you’re using full-disk encryption, then erasing the key is equivalent to sanitizing the drive. For that not to be true means that the encryption program isn’t secure.
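The key-destruction argument can be demonstrated with a toy one-time pad built from Python’s standard secrets module. Real disk encryption would use AES, but the principle is the same: destroy every copy of the key, and the ciphertext carries no recoverable information.

```python
import secrets

# Toy "crypto-erase" using a one-time pad (standard library only; real
# full-disk encryption would use AES, but the principle is identical).
def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    key = secrets.token_bytes(len(plaintext))          # random key
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

data = b"search history"
ciphertext, key = encrypt(data)
assert decrypt(ciphertext, key) == data

# Sanitization: destroy every copy of the key. Without it, this
# ciphertext is statistically indistinguishable from random bytes.
key = None
```

With a one-time pad this is information-theoretic, not merely computational: no amount of future computing power recovers the plaintext once the key is gone, which is exactly the property NIST’s “increasing power of computers” objection fails to account for in a well-designed system.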

I think NIST is just confused.

Posted on September 11, 2006 at 11:43 AM

Recovering Data from Cell Phones

People sell, give away, and throw away their cell phones without even thinking about the data still on them:

A company, Trust Digital of McLean, Virginia, bought 10 different phones on eBay this summer to test phone-security tools it sells for businesses. The phones all were fairly sophisticated models capable of working with corporate e-mail systems.

Curious software experts at Trust Digital resurrected information on nearly all the used phones, including the racy exchanges between guarded lovers.

The other phones contained:

  • One company’s plans to win a multimillion-dollar federal transportation contract.
  • E-mails about another firm’s $50,000 payment for a software license.
  • Bank accounts and passwords.
  • Details of prescriptions and receipts for one worker’s utility payments.

The recovered information was equal to 27,000 pages — a stack of printouts 8 feet high.

“We found just a mountain of personal and corporate data,” said Nick Magliato, Trust Digital’s chief executive.

In many cases, this was data that the owners erased.

A popular practice among sellers, resetting the phone, often means sensitive information appears to have been erased. But it can be resurrected using specialized yet inexpensive software found on the Internet.
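Why a reset so often fails to erase can be sketched with a toy storage model; the structure below is entirely hypothetical, but the idea matches how many file systems and phone resets behave. Deleting a record removes only the index entry, so the raw bytes remain until they happen to be overwritten.

```python
# Toy storage model, entirely hypothetical: a "file system" where
# deleting a record removes only its index entry, while the raw bytes
# stay behind until they happen to be overwritten.
class ToyFlash:
    def __init__(self):
        self.raw = bytearray()         # the underlying flash cells
        self.index = {}                # name -> (offset, length)

    def write(self, name, data: bytes):
        self.index[name] = (len(self.raw), len(data))
        self.raw += data

    def delete(self, name):
        del self.index[name]           # the data itself is untouched

    def recover(self, fragment: bytes) -> bool:
        return fragment in self.raw    # what forensic tools exploit

phone = ToyFlash()
phone.write("sms", b"meet me at 8")
phone.delete("sms")                    # a "reset": looks erased...
assert phone.recover(b"meet me at 8")  # ...but the bytes remain
```

Recovery tools simply scan the raw storage for recognizable fragments, ignoring the index entirely.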

More and more, our data is not really under our control. We store it on devices and third-party websites, or on our own computer. We try to erase it, but we really can’t. We try to control its dissemination, but it’s harder and harder.

Posted on September 5, 2006 at 9:38 AM
