Entries Tagged "encryption"


More on Kish's Encryption Scheme

Back in 2005, I wrote about Laszlo Kish’s encryption scheme, which promises the security of quantum encryption using thermal noise. I found, and continue to find, the research fascinating—although I don’t have the electrical engineering expertise to know whether or not it’s secure.

There have been developments. Kish has a new paper that not only describes a physical demonstration of the scheme, but also addresses many of the criticisms of his earlier work. And Feng Hao has a new paper that claims the scheme is totally insecure.

Again, I don’t have the EE background to know who’s right. But this is exactly the sort of back-and-forth I want to see.

Posted on June 11, 2007 at 6:49 AM

Information Leakage in the Slingbox

Interesting:

…despite the use of encryption, a passive eavesdropper can still learn private information about what someone is watching via their Slingbox Pro.

[…]

First, in order to conserve bandwidth, the Slingbox Pro uses something called variable bitrate (VBR) encoding. VBR is a standard approach for compressing streaming multimedia. At a very abstract level, the idea is to only transmit the differences between frames. This means that if a scene changes rapidly, the Slingbox Pro must still transmit a lot of data. But if the scene changes slowly, the Slingbox Pro will only have to transmit a small amount of data—a great bandwidth saver.

Now notice that different movies have different visual effects (e.g., some movies have frequent and rapid scene changes, others don’t). The use of VBR encodings therefore means that the amount of data transmitted over time can serve as a fingerprint for a movie. And, since encryption alone won’t fully conceal the number of bytes transmitted, this fingerprint can survive encryption!

We experimented with fingerprinting encrypted Slingbox Pro movie transmissions in our lab. We took 26 of our favorite movies (we tried to pick movies from the same director, or multiple movies in a series), and we played them over our Slingbox Pro. Sometimes we streamed them to a laptop attached to a wired network, and sometimes we streamed them to a laptop connected to an 802.11 wireless network. In all cases the laptop was one hop away.

We trained our system on some of those traces. We then took new query traces for these movies and tried to match them to our database. For over half of the movies, we were able to correctly identify the movie over 98% of the time. This is well above the less than 4% accuracy that one would get by random chance.

More details in the paper.
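To make the fingerprinting idea concrete, here is a minimal sketch of the general approach (my own illustration, not the researchers’ actual classifier): bin a packet capture into bytes-per-window counts, normalize the resulting trace, and match a query against stored reference traces by correlation. The one-second window, the correlation metric, and the `reference_traces` structure are all arbitrary choices for the example.

```python
# Minimal sketch of throughput-trace fingerprinting (illustration only; the
# paper's actual feature extraction and matching are more sophisticated).
from typing import Dict, List, Tuple
import math

def bytes_per_window(packets: List[Tuple[float, int]], window: float = 1.0) -> List[float]:
    """Bin (timestamp, size) packet records, sorted by timestamp, into bytes per window."""
    if not packets:
        return []
    start, end = packets[0][0], packets[-1][0]
    bins = [0.0] * (int((end - start) / window) + 1)
    for ts, size in packets:
        bins[int((ts - start) / window)] += size
    return bins

def normalize(trace: List[float]) -> List[float]:
    """Zero-mean, unit-variance normalization so traces of different scale are comparable."""
    if not trace:
        return []
    mean = sum(trace) / len(trace)
    std = math.sqrt(sum((x - mean) ** 2 for x in trace) / len(trace)) or 1.0
    return [(x - mean) / std for x in trace]

def similarity(a: List[float], b: List[float]) -> float:
    """Correlation over the overlapping prefix of two normalized traces."""
    n = min(len(a), len(b))
    return sum(a[i] * b[i] for i in range(n)) / n if n else float("-inf")

def best_match(query: List[Tuple[float, int]],
               reference_traces: Dict[str, List[float]]) -> str:
    """Return the title whose stored trace best matches the query capture."""
    q = normalize(bytes_per_window(query))
    return max(reference_traces, key=lambda title: similarity(q, reference_traces[title]))
```

With 26 movies in the database, guessing at random is right about 1/26 of the time, or just under 4 percent, which is the baseline the researchers quote.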

Posted on June 4, 2007 at 1:24 PM

307-Digit Number Factored

We have a new factoring record: 307 digits. It’s a special number—2^1039 – 1—but the techniques can be generalized:

Is the writing on the wall for 1024-bit encryption? “The answer to that question is an unqualified yes,” says Lenstra. For the moment the standard is still secure, because it is much more difficult to factor a number made up of two huge prime numbers, such as an RSA number, than it is to factor a number like this one that has a special mathematical form. But the clock is definitely ticking. “Last time, it took nine years for us to generalize from a special to a non-special, hard-to-factor number (155 digits). I won’t make predictions, but let’s just say it might be a good idea to stay tuned.”

I would have hoped that RSA applications moved away from 1024-bit security years ago, but for those who haven’t yet: wake up.

EDITED TO ADD (5/21): That’s 1023 bits. (I should have said that.)

Posted on May 21, 2007 at 10:26 AM

Wiretapping in Italy

Encrypted phones are big business in Italy as a defense against wiretapping:

What has spurred encryption sales is not so much the legal wiretapping authorized by Italian magistrates—though information about those calls is also frequently leaked to the press—but the widespread availability of wiretapping technology over the Internet, which has created a growing pool of amateur eavesdroppers. Those snoops have a ready market in the Italian media for filched celebrity conversations.

Posted on May 2, 2007 at 1:02 PM

A Security Market for Lemons

More than a year ago, I wrote about the increasing risks of data loss because more and more data fits in smaller and smaller packages. Today I use a 4-GB USB memory stick for backup while I am traveling. I like the convenience, but if I lose the tiny thing I risk all my data.

Encryption is the obvious solution for this problem—I use PGPdisk—but Secustick sounds even better: It automatically erases itself after a set number of bad password attempts. The company makes a bunch of other impressive claims: The product was commissioned, and eventually approved, by the French intelligence service; it is used by many militaries and banks; its technology is revolutionary.

Unfortunately, the only impressive aspect of Secustick is its hubris, which was revealed when Tweakers.net completely broke its security. There’s no data self-destruct feature. The password protection can easily be bypassed. The data isn’t even encrypted. As a secure storage device, Secustick is pretty useless.

On the surface, this is just another snake-oil security story. But there’s a deeper question: Why are there so many bad security products out there? It’s not just that designing good security is hard—although it is—and it’s not just that anyone can design a security product that he himself cannot break. Why do mediocre security products beat the good ones in the marketplace?

In 1970, American economist George Akerlof wrote a paper called “The Market for ‘Lemons’” (abstract and article for pay here), which established asymmetrical information theory. He eventually won a Nobel Prize for his work, which looks at markets where the seller knows a lot more about the product than the buyer.

Akerlof illustrated his ideas with a used car market. A used car market includes both good cars and lousy ones (lemons). The seller knows which is which, but the buyer can’t tell the difference—at least until he’s made his purchase. I’ll spare you the math, but what ends up happening is that the buyer bases his purchase price on the value of a used car of average quality.

This means that the best cars don’t get sold; their prices are too high. Which means that the owners of these best cars don’t put their cars on the market. And then this starts spiraling. The removal of the good cars from the market reduces the average price buyers are willing to pay, and then the very good cars no longer sell, and disappear from the market. And then the good cars, and so on until only the lemons are left.

In a market where the seller has more information about the product than the buyer, bad products can drive the good ones out of the market.
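The spiral is easy to see in a toy simulation. The sketch below is my own simplification, not Akerlof’s actual model: buyers offer the average quality of whatever is still for sale, and sellers whose cars are worth more than the offer withdraw them. The uniform 1-to-100 quality range is an arbitrary assumption.

```python
# Toy illustration of the "market for lemons" spiral (a simplification, not
# Akerlof's model): buyers offer the average quality of the cars still for
# sale, and sellers of above-average cars withdraw from the market.
def lemons_spiral(qualities, rounds=10):
    market = sorted(qualities)
    for r in range(rounds):
        if not market:
            break
        offer = sum(market) / len(market)               # buyers pay average quality
        remaining = [q for q in market if q <= offer]   # better cars are withdrawn
        print(f"round {r}: offer={offer:.1f}, cars left={len(remaining)}")
        if len(remaining) == len(market):
            break                                       # nothing more leaves; equilibrium
        market = remaining
    return market

# 100 cars with qualities 1..100: roughly the top half leaves each round,
# until only the worst car (the lemon) remains.
lemons_spiral(list(range(1, 101)))
```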

The computer security market has a lot of the same characteristics as Akerlof’s lemons market. Take the market for encrypted USB memory sticks. Several companies make encrypted USB drives—Kingston Technology sent me one in the mail a few days ago—but even I couldn’t tell you if Kingston’s offering is better than Secustick. Or if it’s better than any other encrypted USB drive on the market. They use the same encryption algorithms. They make the same security claims. And if I can’t tell the difference, most consumers won’t be able to either.

Of course, it’s more expensive to make an actually secure USB drive. Good security design takes time, and necessarily means limiting functionality. Good security testing takes even more time, especially if the product is any good. This means the less-secure product will be cheaper, sooner to market and have more features. In this market, the more-secure USB drive is going to lose out.

I see this kind of thing happening over and over in computer security. In the late 1980s and early 1990s, there were more than a hundred competing firewall products. The few that “won” weren’t the most secure firewalls; they were the ones that were easy to set up, easy to use and didn’t annoy users too much. Because buyers couldn’t base their buying decision on the relative security merits, they based them on these other criteria. The intrusion detection system, or IDS, market evolved the same way, and before that the antivirus market. The few products that succeeded weren’t the most secure, because buyers couldn’t tell the difference.

How do you solve this? You need what economists call a “signal,” a way for buyers to tell the difference. Warranties are a common signal. Alternatively, an independent auto mechanic can tell good cars from lemons, and a buyer can hire his expertise. The Secustick story demonstrates this. If there is a consumer advocate group that has the expertise to evaluate different products, then the lemons can be exposed.

Secustick, for one, seems to have been withdrawn from sale.

But security testing is both expensive and slow, and it just isn’t possible for an independent lab to test everything. Unfortunately, the exposure of Secustick is an exception. It was a simple product, and easily exposed once someone bothered to look. A complex software product—a firewall, an IDS—is very hard to test well. And, of course, by the time you have tested it, the vendor has a new version on the market.

In reality, we have to rely on a variety of mediocre signals to differentiate the good security products from the bad. Standardization is one signal. The widely used AES encryption standard has reduced, although not eliminated, the number of lousy encryption algorithms on the market. Reputation is a more common signal; we choose security products based on the reputation of the company selling them, the reputation of some security wizard associated with them, magazine reviews, recommendations from colleagues or general buzz in the media.

All these signals have their problems. Even product reviews, which should be as comprehensive as the Tweakers’ Secustick review, rarely are. Many firewall comparison reviews focus on things the reviewers can easily measure, like packets per second, rather than how secure the products are. In IDS comparisons, you can find the same bogus “number of signatures” comparison. Buyers lap that stuff up; in the absence of deep understanding, they happily accept shallow data.

With so many mediocre security products on the market, and the difficulty of coming up with a strong quality signal, vendors don’t have strong incentives to invest in developing good products. And the vendors that do tend to die a quiet and lonely death.

This essay originally appeared in Wired.

EDITED TO ADD (4/22): Slashdot thread.

Posted on April 19, 2007 at 7:59 AM

Blu-ray Cracked

The Blu-ray DRM system has been broken, although details are scant. It’s the same person who broke the HD DVD system last month. (Both use AACS.)

As I’ve written previously, both of these systems are supposed to be designed in such a way as to recover from hacks like this. We’re going to find out if the recovery feature works.

Blu-ray and HD DVD both allow for decryption keys to be updated in reaction to attacks, for example by making it impossible to play high-definition movies via playback software known to be weak or flawed. So muslix64’s work has effectively sparked off a cat-and-mouse game between hackers and the entertainment industry, where consumers are likely to face compatibility problems while footing the bill for the entertainment industry’s insistence on pushing ultimately flawed DRM technology on an unwilling public.
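The recovery mechanism is essentially broadcast encryption with revocation. Here is a deliberately simplified toy model of the idea; AACS actually uses a subset-difference tree and media key blocks, not per-device key wrapping, and the `cryptography` package is used here only for illustration. Each player model holds a device key, each new disc carries the title key wrapped for every device key that hasn’t been revoked, and a compromised player simply gets no usable wrapping on future discs.

```python
# Toy model of key revocation for new discs (a simplification; AACS actually
# uses subset-difference broadcast encryption, not per-device wrapping).
from cryptography.fernet import Fernet

# Each player model ships with its own long-term device key.
device_keys = {model: Fernet.generate_key()
               for model in ["player_a", "player_b", "weak_software_player"]}

def master_new_disc(revoked):
    """Create a disc: a fresh title key, wrapped under every non-revoked device key."""
    title_key = Fernet.generate_key()
    content = Fernet(title_key).encrypt(b"high-definition movie bits")
    key_block = {model: Fernet(key).encrypt(title_key)
                 for model, key in device_keys.items() if model not in revoked}
    return content, key_block

def play(model, content, key_block):
    """A player recovers the title key from the disc's key block, unless it was revoked."""
    wrapped = key_block.get(model)
    if wrapped is None:
        raise PermissionError(f"{model} has been revoked; new discs won't play")
    title_key = Fernet(device_keys[model]).decrypt(wrapped)
    return Fernet(title_key).decrypt(content)

# Discs pressed after the compromise omit the leaked player's wrapping.
content, key_block = master_new_disc(revoked={"weak_software_player"})
print(play("player_a", content, key_block))        # still decrypts
try:
    play("weak_software_player", content, key_block)
except PermissionError as err:
    print(err)                                      # revoked player is locked out
```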

EDITED TO ADD (1/29): You should read this seven part series on the topic.

Posted on January 26, 2007 at 12:47 PM

U.S. Government to Encrypt All Laptops

This is a good idea:

To address the issue of data leaks of the kind we’ve seen so often in the last year because of stolen or missing laptops, writes Saqib Ali, the Feds are planning to use Full Disk Encryption (FDE) on all Government-owned computers.

“On June 23, 2006 a Presidential Mandate was put in place requiring all agency laptops to fully encrypt data on the HDD. The U.S. Government is currently conducting the largest single side-by-side comparison and competition for the selection of a Full Disk Encryption product. The selected product will be deployed on Millions of computers in the U.S. federal government space. This implementation will end up being the largest single implementation ever, and all of the information regarding the competition is in the public domain. The evaluation will come to an end in 90 days. You can view all the vendors competing and list of requirements.”

Certainly, encrypting everything is overkill, but it’s much easier than figuring out what to encrypt and what not to. And I really like that there is an open competition to choose which encryption program to use. It’s certainly a high-stakes competition among the vendors, but one that is likely to improve the security of all products. I’ve long said that one of the best things the government can do to improve computer security is to use its vast purchasing power to pressure vendors to improve their security. I would expect the winner to make a lot of sales outside of the contract, and for the losers to correct their deficiencies so they’ll do better next time.

Side note: Key escrow is a requirement, something that makes sense in a government or corporate application:

Capable of secure escrow and recovery of the symetric [sic] encryption key
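In full-disk-encryption products, escrow usually amounts to wrapping the volume’s symmetric key under a recovery key that the organization holds. Here is a rough sketch of that shape using RSA-OAEP from the `cryptography` package; it is not drawn from the competition’s requirements document or from any particular vendor’s design.

```python
# Sketch of symmetric-key escrow for disk encryption (a general illustration,
# not the scheme required by the competition or used by any specific product):
# the disk key is wrapped under an organizational recovery (escrow) public key,
# so the private-key holder can recover data if the user's credentials are lost.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Escrow key pair, generated and held by the organization's recovery authority.
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=3072)
escrow_public = escrow_private.public_key()

# The volume's symmetric key (e.g., the AES key actually encrypting the disk).
volume_key = os.urandom(32)

# Wrap the volume key for escrow; this blob is stored alongside the volume header.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
escrow_blob = escrow_public.encrypt(volume_key, oaep)

# Recovery: the escrow authority unwraps the volume key with its private key.
recovered = escrow_private.decrypt(escrow_blob, oaep)
assert recovered == volume_key
```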

I wonder if the NSA is involved in the evaluation at all, and if its analysis will be made public.

Posted on January 3, 2007 at 2:00 PM

More on the Unabomber's Code

Last month I posted about Ted Kaczynski’s pencil-and-paper cryptography. It seems that he invented his own cipher, which the police couldn’t crack until they found a description of the code amongst his personal papers.

The link I found was from KPIX, a CBS affiliate in the San Francisco area. Some time after writing it, I was contacted by the station and asked to comment on some other pieces of the Unabomber’s cryptography for a future story (video online).

There were five new pages of Unabomber evidence that I talked about (1, 2, 3, 4, and 5). All five pages were presented to me as being pages written by the Unabomber, but it seems pretty obvious to me that pages 4 and 5, rather than being Kaczynski’s own key, are notes written by a cryptanalyst trying to break the Unabomber’s code.

In any case, it’s all fascinating.

Posted on January 3, 2007 at 6:59 AM

