Entries Tagged "quantum cryptography"


New Lattice Cryptanalytic Technique

A new paper presents a polynomial-time quantum algorithm for solving certain hard lattice problems. This could be a big deal for post-quantum cryptographic algorithms, since many of them base their security on hard lattice problems.
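For context, here is a minimal statement of Learning With Errors (LWE), the lattice problem the paper targets (notation is mine, not the paper’s):

```latex
% Search-LWE: A is a uniformly random m-by-n matrix over Z_q, s is a
% secret vector, and e is a short random error vector. Given (A, b), find s:
\mathbf{b} = \mathbf{A}\mathbf{s} + \mathbf{e} \pmod{q}
% Without the error term e, recovering s is trivial linear algebra; the
% noise is what makes the problem believed to be hard. The "modulus-noise
% ratio" mentioned in the abstract below is roughly q divided by the
% magnitude of e.
```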

A few things to note. One, this paper has not yet been peer reviewed. As this comment points out: “We had already some cases where efficient quantum algorithms for lattice problems were discovered, but they turned out not being correct or only worked for simple special cases.” I expect we’ll learn more about this particular algorithm with time. And, like many of these algorithms, there will be improvements down the road.

Two, this is a quantum algorithm, which means that it cannot be run, let alone tested, on any existing quantum computer. There is a wide gulf between quantum algorithms in theory and in practice. And until we can actually code and test these algorithms, we should be suspicious of their speed and complexity claims.

And three, I am not surprised at all. We don’t have nearly enough analysis of lattice-based cryptosystems to be confident in their security.

EDITED TO ADD (4/20): The paper had a significant error, and has basically been retracted. From the new abstract:

Note: Update on April 18: Step 9 of the algorithm contains a bug, which I don’t know how to fix. See Section 3.5.9 (Page 37) for details. I sincerely thank Hongxun Wu and (independently) Thomas Vidick for finding the bug today. Now the claim of showing a polynomial time quantum algorithm for solving LWE with polynomial modulus-noise ratios does not hold. I leave the rest of the paper as it is (added a clarification of an operation in Step 8) as a hope that ideas like Complex Gaussian and windowed QFT may find other applications in quantum computation, or tackle LWE in other ways.

Posted on April 15, 2024 at 7:04 AM

Side-Channel Attack against CRYSTALS-Kyber

CRYSTALS-Kyber is one of the public-key algorithms currently recommended by NIST as part of its post-quantum cryptography standardization process.

Researchers have just published a side-channel attack—using power consumption—against an implementation of the algorithm that was supposed to be resistant to that sort of attack.

The algorithm itself is not “broken” or “cracked”—despite headlines to the contrary—this is a side-channel attack against one implementation. What makes this work really interesting is that the researchers trained a machine-learning model to exploit the side channel.
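As a rough illustration of how such profiled side-channel attacks are typically structured (a generic sketch of the technique, not the researchers’ actual pipeline): record power traces on a device you control while it processes known secret-dependent values, train a classifier on them, then apply it to traces captured from the victim.

```python
# Generic sketch of a profiled, ML-based power side-channel attack.
# Hypothetical illustration only -- not the published attack's code.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_model(traces: np.ndarray, labels: np.ndarray) -> MLPClassifier:
    """Profiling phase: traces is an (n_samples, trace_length) array of
    power measurements from a device we control; labels are the known
    secret-dependent bits being processed during each trace."""
    model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)
    model.fit(traces, labels)
    return model

def recover_bits(model: MLPClassifier, victim_traces: np.ndarray) -> np.ndarray:
    """Attack phase: classify traces captured from the victim device."""
    return model.predict(victim_traces)
```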

Posted on February 28, 2023 at 7:19 AM

NIST’s Post-Quantum Cryptography Standards

Quantum computing is a completely new paradigm for computers. A quantum computer uses quantum properties such as superposition, which allows a qubit (a quantum bit) to be neither 0 nor 1, but something much more complicated. In theory, such a computer can solve problems too complex for conventional computers.
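Concretely, a qubit’s state is a unit vector in a two-dimensional complex space rather than a single binary value:

```latex
% A qubit is a superposition of the basis states |0> and |1>:
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle,
\qquad \alpha, \beta \in \mathbb{C},
\qquad |\alpha|^2 + |\beta|^2 = 1
% Measuring the qubit yields 0 with probability |alpha|^2 and 1 with
% probability |beta|^2, collapsing the superposition.
```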

Current quantum computers are still toy prototypes, and the engineering advances required to build a functionally useful quantum computer are somewhere between a few years away and impossible. Even so, we already know that such a computer could potentially factor large numbers and compute discrete logs, and break the RSA and Diffie-Hellman public-key algorithms in all of the useful key sizes.

Cryptographers hate being rushed into things, which is why NIST began a competition to create a post-quantum cryptographic standard in 2016. The idea is to standardize on both a public-key encryption algorithm and a digital-signature algorithm that are resistant to quantum computing, well before anyone builds a useful quantum computer.

NIST is an old hand at this competitive process, having previously done this with symmetric algorithms (AES in 2001) and hash functions (SHA-3 in 2015). I participated in both of those competitions, and have likened them to demolition derbies. The idea is that participants put their algorithms into the ring, and then we all spend a few years beating on each other’s submissions. Then, with input from the cryptographic community, NIST crowns a winner. It’s a good process, mostly because NIST is both trusted and trustworthy.

In 2017, NIST received eighty-two post-quantum algorithm submissions from all over the world. Sixty-nine were considered complete enough to be Round 1 candidates. Twenty-six advanced to Round 2 in 2019, and seven (plus another eight alternates) were announced as Round 3 finalists in 2020. NIST was poised to make final algorithm selections in 2022, with a plan to have a draft standard available for public comment in 2023.

Cryptanalysis over the course of the competition was brutal. Twenty-five of the Round 1 algorithms were attacked badly enough to remove them from the competition. Another eight were similarly attacked in Round 2. But here’s the real surprise: new cryptanalysis results were published against at least four of the Round 3 finalists just months ago—moments before NIST was to make its final decision.

One of the most popular algorithms, Rainbow, was found to be completely broken. Not that it could theoretically be broken with a quantum computer, but that it can be broken today—with an off-the-shelf laptop in just over two days. Three other finalists, Kyber, Saber, and Dilithium, were weakened with new techniques that will probably work against some of the other algorithms as well. (Fun fact: Those three algorithms were broken by the Center of Encryption and Information Security, part of the Israel Defense Forces. This represents the first time a national intelligence organization has published a cryptanalysis result in the open literature. And they had a lot of trouble publishing, as the authors wanted to remain anonymous.)

That was a close call, but it demonstrated that the process is working properly. Remember, this is a demolition derby. The goal is to surface these cryptanalytic results before standardization, which is exactly what happened. At this writing, NIST has chosen a single algorithm for general encryption and three digital-signature algorithms. It has not finished choosing public-key encryption algorithms; four more candidates remain under consideration. Check NIST’s webpage on the project for the latest information.

Ian Cassels, British mathematician and World War II cryptanalyst, once said that “cryptography is a mixture of mathematics and muddle, and without the muddle the mathematics can be used against you.” This mixture is particularly difficult to achieve with public-key algorithms, which rely on the mathematics for their security in a way that symmetric algorithms do not. We got lucky with RSA and related algorithms: their mathematics hinge on the problem of factoring, which turned out to be robustly difficult. Post-quantum algorithms rely on other mathematical disciplines and problems—code-based cryptography, hash-based cryptography, lattice-based cryptography, multivariate cryptography, and so on—whose mathematics are both more complicated and less well-understood. We’re seeing these breaks because those core mathematical problems aren’t nearly as well-studied as factoring is.

The moral is the need for cryptographic agility. It’s not enough to implement a single standard; it’s vital that our systems be able to easily swap in new algorithms when required. We’ve learned the hard way how algorithms can get so entrenched in systems that it can take many years to update them: in the transition from DES to AES, and the transition from MD4 and MD5 to SHA, SHA-1, and then SHA-3.
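In code, agility mostly means never letting application logic depend on a concrete algorithm. A minimal sketch of the pattern (a hypothetical interface, not any particular library’s API):

```python
# Hypothetical sketch of cryptographic agility: applications depend on an
# abstract interface and a registry, never on a concrete algorithm.
from abc import ABC, abstractmethod

class KEM(ABC):
    """Key-encapsulation mechanism interface."""
    @abstractmethod
    def generate_keypair(self) -> tuple[bytes, bytes]: ...
    @abstractmethod
    def encapsulate(self, public_key: bytes) -> tuple[bytes, bytes]: ...
    @abstractmethod
    def decapsulate(self, secret_key: bytes, ciphertext: bytes) -> bytes: ...

# Algorithms are registered and negotiated by identifier, so retiring a
# broken algorithm is a configuration change, not a code change.
REGISTRY: dict[str, type[KEM]] = {}

def register(name: str):
    def decorator(cls: type[KEM]) -> type[KEM]:
        REGISTRY[name] = cls
        return cls
    return decorator

def new_kem(name: str) -> KEM:
    return REGISTRY[name]()  # raises KeyError for unknown or retired algorithms
```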

We need to do better. In the coming years we’ll be facing a double uncertainty. The first is quantum computing. When and if quantum computing becomes a practical reality, we will learn a lot about its strengths and limitations. It took a couple of decades to fully understand von Neumann computer architecture; expect the same learning curve with quantum computing. Our current understanding of quantum computing architecture will change, and that could easily result in new cryptanalytic techniques.

The second uncertainty is in the algorithms themselves. As the new cryptanalytic results demonstrate, we’re still learning a lot about how to turn hard mathematical problems into public-key cryptosystems. We have too much math and an inability to add more muddle, and that results in algorithms that are vulnerable to advances in mathematics. More cryptanalytic results are coming, and more algorithms are going to be broken.

We can’t stop the development of quantum computing. Maybe the engineering challenges will turn out to be impossible, but it’s not the way to bet. In the face of all that uncertainty, agility is the only way to maintain security.

This essay originally appeared in IEEE Security & Privacy.

EDITED TO ADD: One of the four public-key encryption algorithms selected for further research, SIKE, was just broken.

Posted on August 8, 2022 at 6:20 AM

Update on NIST’s Post-Quantum Cryptography Program

NIST has posted an update on their post-quantum cryptography program:

After spending more than three years examining new approaches to encryption and data protection that could defeat an assault from a quantum computer, the National Institute of Standards and Technology (NIST) has winnowed the 69 submissions it initially received down to a final group of 15. NIST has now begun the third round of public review. This “selection round” will help the agency decide on the small subset of these algorithms that will form the core of the first post-quantum cryptography standard.

[…]

For this third round, the organizers have taken the novel step of dividing the remaining candidate algorithms into two groups they call tracks. The first track contains the seven algorithms that appear to have the most promise.

“We’re calling these seven the finalists,” Moody said. “For the most part, they’re general-purpose algorithms that we think could find wide application and be ready to go after the third round.”

The eight alternate algorithms in the second track are those that either might need more time to mature or are tailored to more specific applications. The review process will continue after the third round ends, and eventually some of these second-track candidates could become part of the standard. Because all of the candidates still in play are essentially survivors from the initial group of submissions from 2016, there will also be future consideration of more recently developed ideas, Moody said.

“The likely outcome is that at the end of this third round, we will standardize one or two algorithms for encryption and key establishment, and one or two others for digital signatures,” he said. “But by the time we are finished, the review process will have been going on for five or six years, and someone may have had a good idea in the interim. So we’ll find a way to look at newer approaches too.”

Details are here. This is all excellent work, and exemplifies NIST at its best. The quantum-resistant algorithms will be standardized far in advance of any practical quantum computer, which is how we all want this sort of thing to go.

Posted on July 24, 2020 at 6:36 AM

GCHQ on Quantum Key Distribution

The UK’s GCHQ delivers a brutally blunt assessment of quantum key distribution:

QKD protocols address only the problem of agreeing keys for encrypting data. Ubiquitous on-demand modern services (such as verifying identities and data integrity, establishing network sessions, providing access control, and automatic software updates) rely more on authentication and integrity mechanisms—such as digital signatures—than on encryption.

QKD technology cannot replace the flexible authentication mechanisms provided by contemporary public key signatures. QKD also seems unsuitable for some of the grand future challenges such as securing the Internet of Things (IoT), big data, social media, or cloud applications.

I agree with them. It’s a clever idea, but basically useless in practice. Even in a world where quantum computers have broken our traditional public-key algorithms, I don’t think QKD would be anything more than a niche solution.

Read the whole thing. It’s short.

Posted on August 1, 2018 at 2:07 PM

Quantum Tokens for Digital Signatures

This paper wins “best abstract” award: “Quantum Tokens for Digital Signatures,” by Shalev Ben David and Or Sattath:

Abstract: The fisherman caught a quantum fish. “Fisherman, please let me go,” begged the fish, “and I will grant you three wishes.” The fisherman agreed. The fish gave the fisherman a quantum computer, three quantum signing tokens and his classical public key.

The fish explained: “to sign your three wishes, use the tokenized signature scheme on this quantum computer, then show your valid signature to the king, who owes me a favor.”

The fisherman used one of the signing tokens to sign the document “give me a castle!” and rushed to the palace. The king executed the classical verification algorithm using the fish’s public key, and since it was valid, the king complied.

The fisherman’s wife wanted to sign ten wishes using their two remaining signing tokens. The fisherman did not want to cheat, and secretly sailed to meet the fish. “Fish, my wife wants to sign ten more wishes.”

But the fish was not worried: “I have learned quantum cryptography following the previous story (The Fisherman and His Wife by the brothers Grimm). The quantum tokens are consumed during the signing. Your polynomial wife cannot even sign four wishes using the three signing tokens I gave you.”

“How does it work?” wondered the fisherman.

“Have you heard of quantum money? These are quantum states which can be easily verified but are hard to copy. This tokenized quantum signature scheme extends Aaronson and Christiano’s quantum money scheme, which is why the signing tokens cannot be copied.”

“Does your scheme have additional fancy properties?” the fisherman asked.

“Yes, the scheme has other security guarantees: revocability, testability and everlasting security. Furthermore, if you’re at the sea and your quantum phone has only classical reception, you can use this scheme to transfer the value of the quantum money to shore,” said the fish, and swam his way.

Posted on October 6, 2016 at 7:03 AM

NSA Plans for a Post-Quantum World

Quantum computing is a novel way to build computers—one that takes advantage of the quantum properties of particles to perform operations on data in a very different way than traditional computers. In some cases, the algorithm speedups are extraordinary.

Specifically, a quantum computer using something called Shor’s algorithm can efficiently factor numbers, breaking RSA. A variant can break Diffie-Hellman and other discrete log-based cryptosystems, including those that use elliptic curves. This could potentially render all modern public-key algorithms insecure. Before you panic, note that the largest number to date that has been factored by a quantum computer is 143. So while a practical quantum computer is still science fiction, it’s not stupid science fiction.
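The quantum speedup in Shor’s algorithm is confined to one subroutine, period finding; the surrounding number theory is classical. A toy sketch of that classical scaffolding (my own illustration, with the period found by brute force, which is exactly the step a quantum computer makes efficient):

```python
# Toy sketch of the classical parts of Shor's algorithm. The one step a
# quantum computer accelerates -- finding the order r of a mod n -- is
# brute-forced here, so this only works for tiny n.
from math import gcd

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n); the quantum subroutine."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int) -> int:
    if gcd(a, n) != 1:
        return gcd(a, n)  # lucky guess: a already shares a factor with n
    r = find_order(a, n)
    if r % 2 == 1:
        raise ValueError("odd order; retry with a different a")
    factor = gcd(pow(a, r // 2) - 1, n)
    if factor in (1, n):
        raise ValueError("trivial factor; retry with a different a")
    return factor

print(shor_factor(15, 7))  # 7^4 = 1 (mod 15), so gcd(7^2 - 1, 15) = 3
```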

(Note that this is completely different from quantum cryptography, which is a way of passing bits between two parties that relies on physical quantum properties for security. The only thing quantum computation and quantum cryptography have to do with each other is their first words. It is also completely different from the NSA’s QUANTUM program, which is its code name for a packet-injection system that works directly in the Internet backbone.)

Practical quantum computation doesn’t mean the end of cryptography. There are lesser-known public-key algorithms such as McEliece and lattice-based algorithms that, while less efficient than the ones we use, are currently secure against a quantum computer. And quantum computation only speeds up a brute-force keysearch by a factor of a square root, so any symmetric algorithm can be made secure against a quantum computer by doubling the key length.
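The arithmetic behind that last claim, spelled out:

```latex
% Grover's algorithm searches a k-bit keyspace in about sqrt(2^k) steps:
\sqrt{2^{k}} = 2^{k/2}
% So a 128-bit key offers only about 2^{64} security against a quantum
% attacker, while a 256-bit key restores 2^{128}; hence "double the key
% length."
```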

We know from the Snowden documents that the NSA is conducting research on both quantum computation and quantum cryptography. It’s not a lot of money, and few believe that the NSA has made any real advances in theoretical or applied physics in this area. My guess has been that we’ll see a practical quantum computer within 30 to 40 years, but not much sooner than that.

This all means that now is the time to think about what living in a post-quantum world would be like. NIST is doing its part, having hosted a conference on the topic earlier this year. And the NSA announced that it is moving towards quantum-resistant algorithms.

Earlier this week, the NSA’s Information Assurance Directorate updated its list of Suite B cryptographic algorithms. It explicitly talked about the threat of quantum computers:

IAD will initiate a transition to quantum resistant algorithms in the not too distant future. Based on experience in deploying Suite B, we have determined to start planning and communicating early about the upcoming transition to quantum resistant algorithms. Our ultimate goal is to provide cost effective security against a potential quantum computer. We are working with partners across the USG, vendors, and standards bodies to ensure there is a clear plan for getting a new suite of algorithms that are developed in an open and transparent manner that will form the foundation of our next Suite of cryptographic algorithms.

Until this new suite is developed and products are available implementing the quantum resistant suite, we will rely on current algorithms. For those partners and vendors that have not yet made the transition to Suite B elliptic curve algorithms, we recommend not making a significant expenditure to do so at this point but instead to prepare for the upcoming quantum resistant algorithm transition.

Suite B is a family of cryptographic algorithms approved by the NSA. It’s all part of the NSA’s Cryptographic Modernization Program. Traditionally, NSA algorithms were classified and could only be used in specially built hardware modules. Suite B algorithms are public, and can be used in anything. This is not to say that Suite B algorithms are second class, or breakable by the NSA. They’re being used to protect US secrets: “Suite A will be used in applications where Suite B may not be appropriate. Both Suite A and Suite B can be used to protect foreign releasable information, US-Only information, and Sensitive Compartmented Information (SCI).”

The NSA is worried enough about advances in the technology to start transitioning away from algorithms that are vulnerable to a quantum computer. Does this mean that the agency is close to a working prototype in their own classified labs? Unlikely. Does this mean that they envision practical quantum computers sooner than my 30-to-40-year estimate? Certainly.

Unlike most personal and corporate applications, the NSA routinely deals with information it wants kept secret for decades. Even so, we should all follow the NSA’s lead and transition our own systems to quantum-resistant algorithms over the next decade or so—possibly even sooner.

The essay previously appeared on Lawfare.

EDITED TO ADD: The computation that factored 143 also accidentally “factored much larger numbers such as 3599, 11663, and 56153, without the awareness of the authors of that work,” which shows how weird this all is.

EDITED TO ADD: Seems that I need to be clearer: I do not stand by my 30-40-year prediction. The NSA is acting like practical quantum computers will exist long before then, and I am deferring to their expertise.

Posted on August 21, 2015 at 12:36 PM

