Entries Tagged "data breaches"

A Security Market for Lemons

More than a year ago, I wrote about the increasing risks of data loss because more and more data fits in smaller and smaller packages. Today I use a 4-GB USB memory stick for backup while I am traveling. I like the convenience, but if I lose the tiny thing I risk all my data.

Encryption is the obvious solution for this problem—I use PGPdisk—but Secustick sounds even better: It automatically erases itself after a set number of bad password attempts. The company makes a bunch of other impressive claims: The product was commissioned, and eventually approved, by the French intelligence service; it is used by many militaries and banks; its technology is revolutionary.
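Mechanically, the claimed self-destruct feature is trivial: keep a failure counter and wipe the key material when it hits a limit. Here's a toy sketch in Python (my own illustration, not Secustick's design; the attempt limit is made up, and a real device would have to enforce all of this inside the hardware):

```python
import hashlib
import hmac
import os

MAX_ATTEMPTS = 5  # hypothetical limit, for illustration only

class SelfErasingStore:
    """Toy model of a drive that wipes itself after too many bad passwords."""

    def __init__(self, password, data):
        self.salt = os.urandom(16)
        # Store only a slow hash of the password, never the password itself.
        self.digest = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                          self.salt, 100_000)
        self.data = data
        self.failures = 0

    def unlock(self, attempt):
        if self.data is None:
            return None  # already wiped
        guess = hashlib.pbkdf2_hmac("sha256", attempt.encode(),
                                    self.salt, 100_000)
        if hmac.compare_digest(guess, self.digest):
            self.failures = 0
            return self.data
        self.failures += 1
        if self.failures >= MAX_ATTEMPTS:
            self.data = None  # irreversible erase
        return None
```

The enforcement point is the whole game: if the counter lives in software running on the host PC, an attacker can simply decline to run that software.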

Unfortunately, the only impressive aspect of Secustick is its hubris, which was revealed when Tweakers.net completely broke its security. There’s no data self-destruct feature. The password protection can easily be bypassed. The data isn’t even encrypted. As a secure storage device, Secustick is pretty useless.

On the surface, this is just another snake-oil security story. But there’s a deeper question: Why are there so many bad security products out there? It’s not just that designing good security is hard—although it is—and it’s not just that anyone can design a security product that he himself cannot break. Why do mediocre security products beat the good ones in the marketplace?

In 1970, American economist George Akerlof wrote a paper called “The Market for ‘Lemons’,” which established asymmetric information theory. He eventually won a Nobel Prize for his work, which looks at markets where the seller knows a lot more about the product than the buyer.

Akerlof illustrated his ideas with a used car market. A used car market includes both good cars and lousy ones (lemons). The seller knows which is which, but the buyer can’t tell the difference—at least until he’s made his purchase. I’ll spare you the math, but what ends up happening is that the buyer bases his purchase price on the value of a used car of average quality.

This means that the best cars don’t get sold; their prices are too high. Which means that the owners of these best cars don’t put their cars on the market. And then this starts spiraling. The removal of the good cars from the market reduces the average price buyers are willing to pay, and then the very good cars no longer sell, and disappear from the market. And then the good cars, and so on until only the lemons are left.

In a market where the seller has more information about the product than the buyer, bad products can drive the good ones out of the market.
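The spiral is easy to see in a toy simulation (my own numbers, not Akerlof's: each seller will only accept at least his car's quality, and buyers offer the average quality of whatever is still for sale):

```python
# Toy illustration of Akerlof's lemons spiral.
qualities = list(range(0, 101, 10))  # cars of quality 0, 10, ..., 100

for round_ in range(5):
    offer = sum(qualities) / len(qualities)           # buyers pay average quality
    qualities = [q for q in qualities if q <= offer]  # better cars withdraw
    print(f"round {round_}: offer {offer:.1f}, cars left {len(qualities)}")
```

After a few rounds, only the worst car remains on the market.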

The computer security market has many of the same characteristics as Akerlof’s lemons market. Take the market for encrypted USB memory sticks. Several companies make encrypted USB drives—Kingston Technology sent me one in the mail a few days ago—but even I couldn’t tell you if Kingston’s offering is better than Secustick. Or if it’s better than any other encrypted USB drive. They use the same encryption algorithms. They make the same security claims. And if I can’t tell the difference, most consumers won’t be able to either.

Of course, it’s more expensive to make an actually secure USB drive. Good security design takes time, and necessarily means limiting functionality. Good security testing takes even more time, especially if the product is any good. This means the less-secure product will be cheaper, sooner to market and have more features. In this market, the more-secure USB drive is going to lose out.

I see this kind of thing happening over and over in computer security. In the late 1980s and early 1990s, there were more than a hundred competing firewall products. The few that “won” weren’t the most secure firewalls; they were the ones that were easy to set up, easy to use and didn’t annoy users too much. Because buyers couldn’t base their buying decision on the relative security merits, they based them on these other criteria. The intrusion detection system, or IDS, market evolved the same way, and before that the antivirus market. The few products that succeeded weren’t the most secure, because buyers couldn’t tell the difference.

How do you solve this? You need what economists call a “signal,” a way for buyers to tell the difference. Warranties are a common signal. Alternatively, an independent auto mechanic can tell good cars from lemons, and a buyer can hire his expertise. The Secustick story demonstrates this. If there is a consumer advocate group that has the expertise to evaluate different products, then the lemons can be exposed.

Secustick, for one, seems to have been withdrawn from sale.

But security testing is both expensive and slow, and it just isn’t possible for an independent lab to test everything. Unfortunately, the exposure of Secustick is an exception. It was a simple product, and easily exposed once someone bothered to look. A complex software product—a firewall, an IDS—is very hard to test well. And, of course, by the time you have tested it, the vendor has a new version on the market.

In reality, we have to rely on a variety of mediocre signals to differentiate the good security products from the bad. Standardization is one signal. The widely used AES encryption standard has reduced, although not eliminated, the number of lousy encryption algorithms on the market. Reputation is a more common signal; we choose security products based on the reputation of the company selling them, the reputation of some security wizard associated with them, magazine reviews, recommendations from colleagues or general buzz in the media.

All these signals have their problems. Even product reviews, which should be as comprehensive as the Tweakers’ Secustick review, rarely are. Many firewall comparison reviews focus on things the reviewers can easily measure, like packets per second, rather than how secure the products are. In IDS comparisons, you can find the same bogus “number of signatures” comparison. Buyers lap that stuff up; in the absence of deep understanding, they happily accept shallow data.

With so many mediocre security products on the market, and the difficulty of coming up with a strong quality signal, vendors don’t have strong incentives to invest in developing good products. And the vendors that do tend to die a quiet and lonely death.

This essay originally appeared in Wired.

EDITED TO ADD (4/22): Slashdot thread.

Posted on April 19, 2007 at 7:59 AM

Misplacing the Blame in Personal Identity Thefts

Really good article:

In a recent dissection of the connection between gaming and violence, the term “folk devil” was used to describe something that can be labeled dangerous in order to assign blame in a case where the causes are complex and unclear. The new paper suggests that hackers have become the folk devils of computer security, stating that “even though the campaign against hackers has successfully cast them as the primary culprits to blame for insecurity in cyberspace, it is not clear that constructing this target for blame has improved the security of personal digital records.”

Part of this argument is based on the contention that many of the criminal groups that engage in illicit access to records are culturally distinct from the hacker community and that the hacker community proper is composed of a number of subcultures, some of which may access personal data without distributing it.

But, even if a more liberal definition of hacker is allowed, they still account for far less than half of the data losses. The report states that “60 percent of the incidents involve missing or stolen hardware, insider abuse or theft, administrative error, or accidentally exposing data online.”

Those figures come from analyzing the data while eliminating a single event, the compromise of 1.6 billion records at Acxiom. The Acxiom data loss is informative, as it reveals how what could be categorized as a hack involves institutional negligence. The records stolen from the company were taken by an employee who had access to Acxiom servers in order to upload data. That employee gained download access because Acxiom set the same passwords for both types of access.

Posted on March 23, 2007 at 10:29 AM

Stealing Data from Disk Drives in Photocopiers

This is a threat I hadn’t thought of before:

Now, experts are warning that photocopiers could be a culprit as well.

That’s because most digital copiers manufactured in the past five years have disk drives—the same kind of data-storage mechanism found in computers—to reproduce documents.

As a result, the seemingly innocuous machines that are commonly used to spit out copies of tax returns for millions of Americans can retain the data being scanned.

If the data on the copier’s disk aren’t protected with encryption or an overwrite mechanism, and if someone with malicious motives gets access to the machine, industry experts say sensitive information from original documents could get into the wrong hands.
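An overwrite mechanism is straightforward in principle. Here's a minimal sketch (my illustration, in Python; real disk sanitization also has to worry about spare sectors, journaling, and wear leveling, which a simple file overwrite can't reach):

```python
import os

def overwrite_and_delete(path, passes=3):
    """Overwrite a file's contents in place before unlinking it.

    Sketch of the "overwrite mechanism" the article mentions: replace
    the file's bytes with random data several times, force the writes
    to disk, and only then remove the file.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())       # push the write past OS caches
    os.remove(path)
```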

Posted on March 21, 2007 at 12:10 PM

The FBI: Now Losing Fewer Laptops

According to a new report, the FBI has lost 160 laptops, including at least ten with classified information, in the past four years.

But it’s not all bad news:

The results are an improvement on findings in a similar audit in 2002, which reported that 354 weapons and 317 laptops were lost or stolen at the FBI over about two years. They follow the high-profile losses last year of laptops containing personal information from the Veterans Administration and the Internal Revenue Service.

In a statement yesterday, FBI Assistant Director John Miller emphasized that the report showed “significant progress in decreasing the rate of loss for weapons and laptops” at the FBI. The average number of laptops or guns that went missing dropped from about 12 per month to four per month for each category, according to the report.

The FBI: Now losing fewer laptops!

Posted on February 16, 2007 at 12:14 PM

Kansas City Loses IRS Tapes

Second in our series of stupid comments to the press, here’s Kansas City’s assistant city manager commenting on the fact that they lost 26 computer tapes containing personal information:

“It’s not a situation that if you had a laptop you could access,” Noll said. “You would need some specialized equipment and some specialized knowledge in order to read these tapes.”

While you may be concerned the missing tapes contain your personal information, Cindy Richey, a financial planner, said don’t be too alarmed.

“I think people might be surprised at how much of that is already floating around out there,” Richey said.

Got that? Don’t worry because 1) someone would need a tape drive to read those tapes, and 2) your personal information is all over the net anyway.

Posted on January 24, 2007 at 1:04 PM

U.S. Government to Encrypt All Laptops

This is a good idea:

To address the issue of data leaks of the kind we’ve seen so often in the last year because of stolen or missing laptops, writes Saqib Ali, the Feds are planning to use Full Disk Encryption (FDE) on all Government-owned computers.

“On June 23, 2006 a Presidential Mandate was put in place requiring all agency laptops to fully encrypt data on the HDD. The U.S. Government is currently conducting the largest single side-by-side comparison and competition for the selection of a Full Disk Encryption product. The selected product will be deployed on Millions of computers in the U.S. federal government space. This implementation will end up being the largest single implementation ever, and all of the information regarding the competition is in the public domain. The evaluation will come to an end in 90 days. You can view all the vendors competing and list of requirements.”

Certainly, encrypting everything is overkill, but it’s much easier than figuring out what to encrypt and what not to. And I really like that there is an open competition to choose which encryption program to use. It’s certainly a high-stakes competition among the vendors, but one that is likely to improve the security of all products. I’ve long said that one of the best things the government can do to improve computer security is to use its vast purchasing power to pressure vendors to improve their security. I would expect the winner to make a lot of sales outside of the contract, and for the losers to correct their deficiencies so they’ll do better next time.

Side note: Key escrow is a requirement, something that makes sense in a government or corporate application:

Capable of secure escrow and recovery of the symetric [sic] encryption key
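One simple way to picture key escrow is secret splitting: the user keeps one share of the disk key, the organization escrows the other, and the key is recoverable only by combining both. A sketch in Python (my illustration only; the quoted requirement doesn't specify a mechanism):

```python
import secrets

def split_key(key):
    """Split a symmetric key into two XOR shares.

    Either share alone is indistinguishable from random noise;
    only the two together reconstruct the key.
    """
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(a ^ k for a, k in zip(share_a, key))
    return share_a, share_b

def recover_key(share_a, share_b):
    """Recombine the user's share with the escrowed share."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))
```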

I wonder if the NSA is involved in the evaluation at all, and if its analysis will be made public.

Posted on January 3, 2007 at 2:00 PM

Major Privacy Breach at UCLA

Hackers have gained access to a database containing personal information on 800,000 current and former UCLA students.

This is barely worth writing about: yet another database attack exposing personal information. My guess is that everyone in the U.S. has been the victim of at least one of these already. But there was a particular section of the article that caught my eye:

Jim Davis, UCLA’s associate vice chancellor for information technology, described the attack as sophisticated, saying it used a program designed to exploit a flaw in a single software application among the many hundreds used throughout the Westwood campus.

“An attacker found one small vulnerability and was able to exploit it, and then cover their tracks,” Davis said.

It worries me that the associate vice chancellor for information technology doesn’t understand that all attacks work like that.

Posted on December 13, 2006 at 6:43 AM

Insider Identity Theft

Banks are spending millions preventing outsiders from stealing their customers’ identities, but there is a growing insider threat:

Widespread outsourcing of data management and other services has exposed some weaknesses and made it harder to prevent identity theft by insiders.

“There are lots of weak links,” said Oveissi Field. “Back-up tapes are being sent to offsite storage sites or being mailed and getting into the wrong hands or are lost through carelessness.”

In what many regard as the biggest wake-up call in recent memory for financial institutions, thieves disguised as cleaning staff last year nearly stole the equivalent of more than $400 million from the London branch of Sumitomo Mitsui.

Posted on December 8, 2006 at 8:39 AM

Recovering Data from Cell Phones

People sell, give away, and throw away their cell phones without even thinking about the data still on them:

A company, Trust Digital of McLean, Virginia, bought 10 different phones on eBay this summer to test phone-security tools it sells for businesses. The phones all were fairly sophisticated models capable of working with corporate e-mail systems.

Curious software experts at Trust Digital resurrected information on nearly all the used phones, including the racy exchanges between guarded lovers.

The other phones contained:

  • One company’s plans to win a multimillion-dollar federal transportation contract.
  • E-mails about another firm’s $50,000 payment for a software license.
  • Bank accounts and passwords.
  • Details of prescriptions and receipts for one worker’s utility payments.

The recovered information was equal to 27,000 pages—a stack of printouts 8 feet high.

“We found just a mountain of personal and corporate data,” said Nick Magliato, Trust Digital’s chief executive.

In many cases, this was data that the owners thought they had erased.

A popular practice among sellers, resetting the phone, often means sensitive information appears to have been erased. But it can be resurrected using specialized yet inexpensive software found on the Internet.
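The reason a “reset” isn’t an erase is that it typically clears only the index of where files live, not the storage itself. A deliberately simplified model in Python (not any real phone’s filesystem):

```python
class ToyFlash:
    """Toy model of why a factory reset leaves data recoverable."""

    def __init__(self, size=1024):
        self.raw = bytearray(size)  # the flash chip itself
        self.index = {}             # filename -> (offset, length)
        self.cursor = 0

    def write(self, name, data):
        self.raw[self.cursor:self.cursor + len(data)] = data
        self.index[name] = (self.cursor, len(data))
        self.cursor += len(data)

    def factory_reset(self):
        self.index.clear()  # forget the files, but leave the bytes in place

    def carve(self, needle):
        """What recovery software does: scan the raw storage directly."""
        return needle in self.raw
```

After a reset the phone reports no files, but a byte-level scan still finds everything that was written.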

More and more, our data is not really under our control. We store it on devices and third-party websites, or on our own computer. We try to erase it, but we really can’t. We try to control its dissemination, but it’s harder and harder.

Posted on September 5, 2006 at 9:38 AM
