Entries Tagged "NIST"


New Attack on Threefish

At FSE 2010 this week, Dmitry Khovratovich and Ivica Nikolic presented a paper where they cryptanalyze ARX algorithms (algorithms that use only addition, rotation, and exclusive-OR operations): “Rotational Cryptanalysis of ARX.” In the paper, they demonstrate their attack against Threefish. Their attack breaks 39 (out of 72) rounds of Threefish-256 with a complexity of 2^252.4, 42 (out of 72) rounds of Threefish-512 with a complexity of 2^507, and 43.5 (out of 80) rounds of Threefish-1024 with a complexity of 2^1014.5. (Yes, that’s over 2^1000. Don’t laugh; it really is a valid attack, even though it—or any of these others—will never be practical.)
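Here’s a minimal sketch, in Python, of the property rotational cryptanalysis exploits (my own illustration, not code from the paper): a rotational pair (x, x<<<r) passes through XOR and rotation for free, but passes through modular addition only with probability roughly 1/4, and it’s that per-addition probability, compounded over rounds, that limits how deep the distinguisher can reach.

```python
# Rotational pairs survive XOR and rotation exactly, but survive 64-bit
# modular addition only probabilistically (roughly 1/4 for random inputs).
import random

MASK = (1 << 64) - 1

def rol(x, r):
    """Rotate a 64-bit word left by r bits."""
    return ((x << r) | (x >> (64 - r))) & MASK

r = 13  # an arbitrary rotation amount for the demonstration
x, y = random.getrandbits(64), random.getrandbits(64)

# XOR always commutes with rotation:
assert rol(x ^ y, r) == rol(x, r) ^ rol(y, r)

# Addition commutes with rotation only some of the time:
trials = 100_000
hits = 0
for _ in range(trials):
    u, v = random.getrandbits(64), random.getrandbits(64)
    if rol((u + v) & MASK, r) == (rol(u, r) + rol(v, r)) & MASK:
        hits += 1
print(hits / trials)  # comes out near 0.25
```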

This is excellent work, and represents the best attacks against Threefish to date. (I suspect that the attacks can be extended a few more rounds with some clever cryptanalytic tricks, but no further.) The security of full Threefish isn’t at risk, of course; there’s still plenty of security margin.

We have always stood by the security of Threefish with any set of non-obviously-bad constants. Still, a trivial modification—changing a single constant in the key schedule—dramatically reduces the number of rounds through which this attack can penetrate. If NIST allows another round of tweaks to the SHA-3 candidate algorithms, we will almost certainly take the opportunity to improve Skein’s security; we’ll change this constant to a value that removes the rotational symmetries that this technique exploits. If they don’t, we’re still confident of the security of Threefish and Skein.

And we’re always pleased to see more cryptanalysis against Threefish and Skein.

Posted on February 7, 2010 at 8:06 AM

Skein News

Skein is one of the 14 SHA-3 candidates chosen by NIST to advance to the second round. As part of the process, NIST allowed the algorithm designers to implement small “tweaks” to their algorithms. We’ve tweaked the rotation constants of Skein. This change does not affect Skein’s performance in any way.

The revised Skein paper contains the new rotation constants, as well as information about how we chose them and why we changed them, the results of some new cryptanalysis, plus new IVs and test vectors. Revised source code is here.

The latest information on Skein is always here.

Tweaks were due today, September 15. Now the SHA-3 process moves into the second round. According to NIST’s timeline, they’ll choose a set of final round candidate algorithms in 2010, and then a single hash algorithm in 2012. Between now and then, it’s up to all of us to evaluate the algorithms and let NIST know what we want. Cryptanalysis is important, of course, but so is performance.

Here’s my 2008 essay on SHA-3. The second-round algorithms are: BLAKE, Blue Midnight Wish, CubeHash, ECHO, Fugue, Grøstl, Hamsi, JH, Keccak, Luffa, Shabal, SHAvite-3, SIMD, and Skein. You can find details on all of them, as well as the current state of their cryptanalysis, here.

In other news, we’re making Skein shirts available to the public. Those of you who attended the First Hash Function Candidate Conference in Leuven, Belgium, earlier this year might have noticed the stylish black Skein polo shirts worn by the Skein team. Anyone who wants one is welcome to buy it, at cost. Details (with photos) are here. All orders must be received before 1 October, and then we’ll have all the shirts made in one batch.

Posted on September 15, 2009 at 6:10 AM

SHA-3 Second Round Candidates Announced

NIST has announced the 14 SHA-3 candidates that have advanced to the second round: BLAKE, Blue Midnight Wish, CubeHash, ECHO, Fugue, Grøstl, Hamsi, JH, Keccak, Luffa, Shabal, SHAvite-3, SIMD, and Skein.

In February, I chose my favorites: Arirang, BLAKE, Blue Midnight Wish, ECHO, Grøstl, Keccak, LANE, Shabal, and Skein. Of the ones NIST eventually chose, I am most surprised to see CubeHash and most surprised not to see LANE.

Here’s my 2008 essay on SHA-3. Here’s NIST’s SHA-3 page. And here’s the page on my own submission, Skein.

Posted on July 24, 2009 at 12:15 PM

MD6 Withdrawn from SHA-3 Competition

In other SHA-3 news, Ron Rivest seems to have withdrawn MD6 from the SHA-3 competition. From an e-mail to a NIST mailing list:

We suggest that MD6 is not yet ready for the next SHA-3 round, and we also provide some suggestions for NIST as the contest moves forward.

Basically, the issue is that for MD6 to be fast enough to be competitive, the designers have to reduce the number of rounds to somewhere in the 30–40 range, and at that reduced strength the algorithm loses its proof of resistance to differential attacks.

Thus, while MD6 appears to be a robust and secure cryptographic hash algorithm, and has much merit for multi-core processors, our inability to provide a proof of security for a reduced-round (and possibly tweaked) version of MD6 against differential attacks suggests that MD6 is not ready for consideration for the next SHA-3 round.

EDITED TO ADD (7/1): This is a very classy withdrawal, as we expect from Ron Rivest—especially given the fact that there are no attacks on it, while other algorithms have been seriously broken and their submitters keep trying to pretend that no one has noticed.

EDITED TO ADD (7/6): From the MD6 website:

We are not withdrawing our submission; NIST is free to select MD6 for further consideration in the next round if it wishes. But at this point MD6 doesn’t meet our own standards for what we believe should be required of a SHA-3 candidate, and we suggest that NIST might do better looking elsewhere. In particular, we feel that a minimum “ticket of admission” for SHA-3 consideration should be a proof of resistance to basic differential attacks, and we don’t know how to make such a proof for a reduced-round MD6.

Posted on July 1, 2009 at 2:27 PM

New Attack on AES

There’s a new cryptanalytic attack on AES that is better than brute force:

Abstract. In this paper we present two related-key attacks on the full AES. For AES-256 we show the first key recovery attack that works for all the keys and has complexity 2^119, while the recent attack by Biryukov-Khovratovich-Nikolic works for a weak key class and has higher complexity. The second attack is the first cryptanalysis of the full AES-192. Both our attacks are boomerang attacks, which are based on the recent idea of finding local collisions in block ciphers and enhanced with the boomerang switching techniques to gain free rounds in the middle.

In an e-mail, the authors wrote:

We also expect that a careful analysis may reduce the complexities. As a preliminary result, we think that the complexity of the attack on AES-256 can be lowered from 2^119 to about 2^110.5 data and time.

We believe that these results may shed a new light on the design of the key-schedules of block ciphers, but they pose no immediate threat for the real world applications that use AES.

Agreed. While this attack is better than brute force—and some cryptographers will describe the algorithm as “broken” because of it—it is still far, far beyond our capabilities of computation. The attack is, and probably forever will be, theoretical. But remember: attacks always get better, they never get worse. Others will continue to improve on these numbers. While there’s no reason to panic, no reason to stop using AES, no reason to insist that NIST choose another encryption standard, this will certainly be a problem for some of the AES-based SHA-3 candidate hash functions.

EDITED TO ADD (7/14): An FAQ.

Posted on July 1, 2009 at 11:49 AM

More SHA-3 News

NIST has published all 51 first-round candidates in its hash algorithm competition. (Presumably the other submissions—we heard they received 64—were rejected because they weren’t complete.) You can download the submission package from the NIST page. The SHA-3 Zoo is still the best source for up-to-date cryptanalysis information.

Various people have been trying to benchmark the performance of the candidates, but—of course—results depend on what metrics you choose.

And there’s news about Skein’s performance. And two Java implementations. (Does anyone want to do an implementation of Threefish?) In general, the Skein website is the place to go for up-to-date Skein information.

Posted on December 11, 2008 at 1:16 PM

The Skein Hash Function

NIST is holding a competition to replace the SHA family of hash functions, which have been increasingly under attack. (I wrote about an early NIST hash workshop here.)

Skein is our submission (myself and seven others: Niels Ferguson, Stefan Lucks, Doug Whiting, Mihir Bellare, Tadayoshi Kohno, Jon Callas, and Jesse Walker). Here’s the paper:

Executive Summary

Skein is a new family of cryptographic hash functions. Its design combines speed, security, simplicity, and a great deal of flexibility in a modular package that is easy to analyze.

Skein is fast. Skein-512—our primary proposal—hashes data at 6.1 clock cycles per byte on a 64-bit CPU. This means that on a 3.1 GHz x64 Core 2 Duo CPU, Skein hashes data at 500 MBytes/second per core—almost twice as fast as SHA-512 and three times faster than SHA-256. An optional hash-tree mode speeds up parallelizable implementations even more. Skein is fast for short messages, too; Skein-512 hashes short messages in about 1000 clock cycles.
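That throughput figure is just the clock rate divided by the cycles-per-byte cost; a quick back-of-the-envelope check:

```python
clock_hz = 3.1e9               # 3.1 GHz clock
cycles_per_byte = 6.1          # Skein-512 on a 64-bit CPU
mbytes_per_second = clock_hz / cycles_per_byte / 1e6
print(mbytes_per_second)       # ~508, i.e. roughly 500 MBytes/second per core
```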

Skein is secure. Its conservative design is based on the Threefish block cipher. Our current best attack on Threefish-512 is on 25 of 72 rounds, for a safety factor of 2.9. For comparison, at a similar stage in the standardization process, the AES encryption algorithm had an attack on 6 of 10 rounds, for a safety factor of only 1.7. Additionally, Skein has a number of provably secure properties, greatly increasing confidence in the algorithm.

Skein is simple. Using only three primitive operations, the Skein compression function can be easily understood and remembered. The rest of the algorithm is a straightforward iteration of this function.
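To make those three operations concrete, here is a short Python sketch of the add-rotate-XOR MIX step at the heart of Threefish; the rotation distance passed in below is just a placeholder, not one of the actual per-round constants from the Skein paper.

```python
MASK = (1 << 64) - 1  # Threefish works on 64-bit words

def rol64(x, r):
    return ((x << r) | (x >> (64 - r))) & MASK

def mix(x0, x1, r):
    """One MIX step: 64-bit addition, a fixed rotation, and XOR."""
    y0 = (x0 + x1) & MASK
    y1 = rol64(x1, r) ^ y0
    return y0, y1

# Example with an arbitrary, illustrative rotation distance:
print(mix(0x0123456789ABCDEF, 0xFEDCBA9876543210, 19))
```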

Skein is flexible. Skein is defined for three different internal state sizes—256 bits, 512 bits, and 1024 bits—and any output size. This allows Skein to be a drop-in replacement for the entire SHA family of hash functions. A completely optional and extendable argument system makes Skein an efficient tool to use for a very large number of functions: a PRNG, a stream cipher, a key derivation function, authentication without the overhead of HMAC, and a personalization capability. All these features can be implemented with very low overhead. Together with the Threefish large-block cipher at Skein’s core, this design provides a full set of symmetric cryptographic primitives suitable for most modern applications.

Skein is efficient on a variety of platforms, both hardware and software. Skein-512 can be implemented in about 200 bytes of state. Small devices, such as 8-bit smart cards, can implement Skein-256 using about 100 bytes of memory. Larger devices can implement the larger versions of Skein to achieve faster speeds.

Skein was designed by a team of highly experienced cryptographic experts from academia and industry, with expertise in cryptography, security analysis, software, chip design, and implementation of real-world cryptographic systems. This breadth of knowledge allowed them to create a balanced design that works well in all environments.

Here’s source code, test vectors, and the like for Skein. Watch the Skein website for any updates—new code, new results, new implementations, the proofs.

NIST’s deadline is Friday. It seems as if everyone—including many amateurs—is working on a hash function, and I predict that NIST will receive at least 80 submissions. (Compare this to the fifteen NIST submissions received for the AES competition in 1998.) I expect people to start posting their submissions over the weekend. (Ron Rivest already presented MD6 at Crypto in August.) Probably the best place to watch for new hash functions is here; I’ll try to keep a listing of the submissions myself.

The selection process will take around four years. I’ve previously called this sort of thing a cryptographic demolition derby—last one left standing wins—but that’s only half true. Certainly all the groups will spend the next couple of years trying to cryptanalyze each other, but in the end there will be a bunch of unbroken algorithms; NIST will select one based on performance and features.

NIST has stated that the goal of this process is not to choose the best standard but to choose a good standard. I think that’s smart of them; in this process, “best” is the enemy of “good.” My advice is this: immediately sort them based on performance and features. Ask the cryptographic community to focus its attention on the top dozen, rather than spread its attention across all 80—although I also expect that most of the amateur submissions will be rejected by NIST for not being “complete and proper.” Otherwise, people will break the easy ones and the better ones will go unanalyzed.

EDITED TO ADD (10/30): Here is a single website for all information, including cryptanalysis, of all the SHA-3 submissions. I spoke to a reporter who told me that, as of yesterday, NIST had received 30 submissions. And three news articles about Skein.

Posted on October 29, 2008 at 6:35 AM

The Strange Story of Dual_EC_DRBG

Random numbers are critical for cryptography: for encryption keys, random authentication challenges, initialization vectors, nonces, key-agreement schemes, generating prime numbers and so on. Break the random-number generator, and most of the time you break the entire security system. Which is why you should worry about a new random-number standard that includes an algorithm that is slow, badly designed and just might contain a backdoor for the National Security Agency.

Generating random numbers isn’t easy, and researchers have discovered lots of problems and attacks over the years. A recent paper found a flaw in the Windows 2000 random-number generator. Another paper found flaws in the Linux random-number generator. Back in 1996, an early version of SSL was broken because of flaws in its random-number generator. With John Kelsey and Niels Ferguson in 1999, I co-authored Yarrow, a random-number generator based on our own cryptanalysis work. I improved this design four years later—and renamed it Fortuna—in the book Practical Cryptography, which I co-authored with Ferguson.

The U.S. government released a new official standard for random-number generators this year, and it will likely be followed by software and hardware developers around the world. Called NIST Special Publication 800-90 (.pdf), the 130-page document contains four different approved techniques, called DRBGs, or “Deterministic Random Bit Generators.” All four are based on existing cryptographic primitives. One is based on hash functions, one on HMAC, one on block ciphers and one on elliptic curves. It’s smart cryptographic design to use only a few well-trusted cryptographic primitives, so building a random-number generator out of existing parts is a good thing.

But one of those generators—the one based on elliptic curves—is not like the others. Called Dual_EC_DRBG, not only is it a mouthful to say, it’s also three orders of magnitude slower than its peers. It’s in the standard only because it’s been championed by the NSA, which first proposed it years ago in a related standardization project at the American National Standards Institute.

The NSA has always been intimately involved in U.S. cryptography standards—it is, after all, expert in making and breaking secret codes. So the agency’s participation in the NIST (the U.S. Commerce Department’s National Institute of Standards and Technology) standard is not sinister in itself. It’s only when you look under the hood at the NSA’s contribution that questions arise.

Problems with Dual_EC_DRBG were first described in early 2006. The math is complicated, but the general point is that the random numbers it produces have a small bias. The problem isn’t large enough to make the algorithm unusable—and Appendix E of the NIST standard describes an optional work-around to avoid the issue—but it’s cause for concern. Cryptographers are a conservative bunch: We don’t like to use algorithms that have even a whiff of a problem.

But today there’s an even bigger stink brewing around Dual_EC_DRBG. In an informal presentation (.pdf) at the CRYPTO 2007 conference in August, Dan Shumow and Niels Ferguson showed that the algorithm contains a weakness that can only be described as a backdoor.

This is how it works: There are a bunch of constants—fixed numbers—in the standard used to define the algorithm’s elliptic curve. These constants are listed in Appendix A of the NIST publication, but nowhere is it explained where they came from.

What Shumow and Ferguson showed is that these numbers have a relationship with a second, secret set of numbers that can act as a kind of skeleton key. If you know the secret numbers, you can predict the output of the random-number generator after collecting just 32 bytes of its output. To put that in real terms, you only need to monitor one TLS internet encryption connection in order to crack the security of that protocol. If you know the secret numbers, you can completely break any instantiation of Dual_EC_DRBG.
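To make that concrete, here is a toy Python sketch of the Shumow-Ferguson observation. Everything in it is made up for illustration (a textbook-sized curve, invented constants, an invented seed), and it skips the output truncation the real standard performs; it only shows how knowing the secret relationship between the two public points turns a single output block into the generator's next internal state.

```python
# Toy illustration of the Dual_EC_DRBG backdoor structure (Shumow-Ferguson).
# The curve, points, secret d, and seed below are all made up for this sketch;
# the real generator uses NIST P-256 and truncates each output block.

p, a, b = 17, 2, 2                    # textbook curve y^2 = x^3 + 2x + 2 over F_17

def inv(x):
    return pow(x, p - 2, p)           # modular inverse (p is prime)

def ec_add(A, B):                     # point addition; None is the point at infinity
    if A is None: return B
    if B is None: return A
    (x1, y1), (x2, y2) = A, B
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if A == B:
        lam = (3 * x1 * x1 + a) * inv(2 * y1) % p
    else:
        lam = (y2 - y1) * inv(x2 - x1) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, A):                     # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = ec_add(R, A)
        A = ec_add(A, A)
        k >>= 1
    return R

def lift(x):                          # recover some curve point with this x-coordinate
    rhs = (x * x * x + a * x + b) % p
    for y in range(p):
        if y * y % p == rhs:
            return (x, y)
    return None

# Public constants: Q is a base point; whoever generated the constants also
# chose a secret d and published P = d*Q.  Knowing d is the skeleton key.
Q = (5, 1)
d = 7                                 # the "secret number"
P = ec_mul(d, Q)

def dual_ec_step(s):                  # one simplified, untruncated DRBG step
    s = ec_mul(s, P)[0]               # state update: s <- x(s*P)
    r = ec_mul(s, Q)[0]               # output block: r  = x(s*Q)
    return s, r

# The victim runs the generator from a secret seed and emits r1, r2, ...
s0 = 6
s1, r1 = dual_ec_step(s0)
s2, r2 = dual_ec_step(s1)

# The attacker sees only r1 but knows d: lift r1 back to a point (this is
# +/- s1*Q), multiply by d to get +/- s1*P, and its x-coordinate is exactly
# the generator's next internal state, so every later output is predictable.
R = lift(r1)
recovered_state = ec_mul(d, R)[0]
predicted_r2 = ec_mul(recovered_state, Q)[0]
print(recovered_state == s2, predicted_r2 == r2)   # True True
```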

The researchers don’t know what the secret numbers are. But because of the way the algorithm works, the person who produced the constants might know; he had the mathematical opportunity to produce the constants and the secret numbers in tandem.

Of course, we have no way of knowing whether the NSA knows the secret numbers that break Dual_EC_DRBG. We have no way of knowing whether an NSA employee working on his own came up with the constants—and has the secret numbers. We don’t know if someone from NIST, or someone in the ANSI working group, has them. Maybe nobody does.

We don’t know where the constants came from in the first place. We only know that whoever came up with them could have the key to this backdoor. And we know there’s no way for NIST—or anyone else—to prove otherwise.

This is scary stuff indeed.

Even if no one knows the secret numbers, the fact that the backdoor is present makes Dual_EC_DRBG very fragile. If someone were to solve just one instance of the algorithm’s elliptic-curve problem, he would effectively have the keys to the kingdom. He could then use it for whatever nefarious purpose he wanted. Or he could publish his result, and render every implementation of the random-number generator completely insecure.

It’s possible to implement Dual_EC_DRBG in such a way as to protect it against this backdoor, by generating new constants with another secure random-number generator and then publishing the seed. This method is even in the NIST document, in Appendix A. But the procedure is optional, and my guess is that most implementations of the Dual_EC_DRBG won’t bother.

If this story leaves you confused, join the club. I don’t understand why the NSA was so insistent about including Dual_EC_DRBG in the standard. It makes no sense as a trap door: It’s public, and rather obvious. It makes no sense from an engineering perspective: It’s too slow for anyone to willingly use it. And it makes no sense from a backwards-compatibility perspective: Swapping one random-number generator for another is easy.

My recommendation, if you’re in need of a random-number generator, is not to use Dual_EC_DRBG under any circumstances. If you have to use something in SP 800-90, use CTR_DRBG or Hash_DRBG.

In the meantime, both NIST and the NSA have some explaining to do.

This essay originally appeared on Wired.com.

Posted on November 15, 2007 at 6:08 AM

A New Secure Hash Standard

The U.S. National Institute of Standards and Technology is having a competition for a new cryptographic hash function.

This matters. The phrase “one-way hash function” might sound arcane and geeky, but hash functions are the workhorses of modern cryptography. They provide web security in SSL. They help with key management in e-mail and voice encryption: PGP, Skype, all the others. They help make it harder to guess passwords. They’re used in virtual private networks, help provide DNS security and ensure that your automatic software updates are legitimate. They provide all sorts of security functions in your operating system. Every time you do something with security on the internet, a hash function is involved somewhere.

Basically, a hash function is a fingerprint function. It takes a variable-length input—anywhere from a single byte to a file terabytes in length—and converts it to a fixed-length string: 20 bytes, for example.
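For example, with Python’s standard hashlib (SHA-1 gives the 20-byte output mentioned above):

```python
import hashlib

for msg in (b"x", b"a much longer message " * 10_000):
    digest = hashlib.sha1(msg).digest()
    print(len(msg), "bytes in ->", len(digest), "bytes out:", digest.hex()[:16] + "...")
# Both inputs, one byte and a couple hundred kilobytes, map to exactly 20 bytes.
```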

One-way hash functions are supposed to have two properties. First, they’re one-way. This means that it is easy to take an input and compute the hash value, but it’s impossible to take a hash value and recreate the original input. By “impossible” I mean “can’t be done in any reasonable amount of time.”

Second, they’re collision-free. This means that even though there are an infinite number of inputs for every hash value, you’re never going to find two of them. Again, “never” is defined as above. The cryptographic reasoning behind these two properties is subtle, but any cryptographic text talks about them.
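To put a rough number on “never” (a standard birthday-bound estimate, assuming the 20-byte/160-bit output size used as the example above):

```python
import math

n = 2.0 ** 160                               # possible 160-bit hash values
for k in (2.0 ** 40, 2.0 ** 60, 2.0 ** 80):  # number of hashes computed
    p = -math.expm1(-k * (k - 1) / (2 * n))  # approximate collision probability
    print(f"2^{int(math.log2(k))} hashes -> collision probability about {p:.1g}")
# Even after 2^60 hashes the chance of an accidental collision is under one in
# a trillion; it only becomes likely somewhere around 2^80 hashes.
```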

The hash function you’re most likely to use routinely is SHA-1. Invented by the National Security Agency, it’s been around since 1995. Recently, though, there have been some pretty impressive cryptanalytic attacks against the algorithm. The best attack is barely on the edge of feasibility, and not effective against all applications of SHA-1. But there’s an old saying inside the NSA: “Attacks always get better; they never get worse.” It’s past time to abandon SHA-1.

There are near-term alternatives—a related algorithm called SHA-256 is the most obvious—but they’re all based on the family of hash functions first developed in 1992. We’ve learned a lot more about the topic in the past 15 years, and can certainly do better.

Why the National Institute of Standards and Technology, or NIST, though? Because it has exactly the experience and reputation we want. We were in the same position with encryption functions in 1997. We needed to replace the Data Encryption Standard, but it wasn’t obvious what should replace it. NIST decided to orchestrate a worldwide competition for a new encryption algorithm. There were 15 submissions from 10 countries—I was part of the group that submitted Twofish—and after four years of analysis and cryptanalysis, NIST chose the algorithm Rijndael to become the Advanced Encryption Standard (.pdf), or AES.

The AES competition was the most fun I’ve ever had in cryptography. Think of it as a giant cryptographic demolition derby: A bunch of us put our best work into the ring, and then we beat on each other until there was only one standing. It was really more academic and structured than that, but the process stimulated a lot of research in block-cipher design and cryptanalysis. I personally learned an enormous amount about those topics from the AES competition, and we as a community benefited immeasurably.

NIST did a great job managing the AES process, so it’s the perfect choice to do the same thing with hash functions. And it’s doing just that (.pdf). Last year and the year before, NIST sponsored two workshops to discuss the requirements for a new hash function, and last month it announced a competition to choose a replacement for SHA-1. Submissions will be due in fall 2008, and a single standard is scheduled to be chosen by the end of 2011.

Yes, this is a reasonable schedule. Designing a secure hash function seems harder than designing a secure encryption algorithm, although we don’t know whether this is inherently true of the mathematics or simply a result of our imperfect knowledge. Producing a new secure hash standard is going to take a while. Luckily, we have an interim solution in SHA-256.

Now, if you’ll excuse me, the Twofish team needs to reconstitute and get to work on an Advanced Hash Standard submission.

This essay originally appeared on Wired.com.

EDITED TO ADD (2/8): Every time I write about one-way hash functions, I get responses from people claiming they can’t possibly be secure because an infinite number of texts hash to the same short (160-bit, in the case of SHA-1) hash value. Yes, of course an infinite number of texts hash to the same value; that’s the way the function works. But the odds of it happening naturally are less than the odds of all the air molecules bunching up in the corner of the room and suffocating you, and you can’t force it to happen either. Right now, several groups are trying to implement Xiaoyun Wang’s attack against SHA-1. I predict one of them will find two texts that hash to the same value this year—it will demonstrate that the hash function is broken and be really big news.

Posted on February 8, 2007 at 9:07 AM
