Comments

Nicholas Weaver December 4, 2014 9:53 AM

This makes me wonder: Will someone eventually prove that all trapdoor functions are vulnerable to quantum computing?

gmeista December 4, 2014 10:34 AM

Please tell me this wasn’t one of the post-quantum cryptography algorithms that got popped? Will read this in full detail later when I have the time.

Anura December 4, 2014 12:01 PM

Also, this is lattice-based, which is what NTRU is based on (NTRU being one of the most well-known post-quantum algorithms). The paper says that their particular attack doesn’t generalize very well, but it does make me doubt the security of NTRU against future quantum attacks.

bitstrong December 4, 2014 12:32 PM

FYI
The Cloud Security Alliance is forming a working group now, to prepare for QC.

Also, it is commonly stated that QC does not pose a threat to block ciphers, but this may not be true. Very broadly, the reasoning is that because a chaining mode such as CBC uses information that is itself the result of previous operations, there is no simultaneous solution; the cipher remains iterative, and the most QC can do is weaken it by one bit. That reasoning itself may not hold.

Anura December 4, 2014 1:31 PM

@bitstrong

Symmetric ciphers are probably secure from quantum computing because they are not easily representable mathematically – all public key algorithms in use today rely on relatively simple math problems that are hard to solve quickly. However, Grover’s Algorithm can still crack any symmetric cipher with an n-bit key in O(2^(n/2)); this is still exponential time.

Symmetric ciphers with 128-bit keys will still be secure against the first generations of quantum computers, simply because it will cost too much to build machines large enough to break them; 256-bit keys will be secure for the long term. If you are still using Blowfish with a 64-bit key, well, you are not secure today and definitely not secure against quantum computers.
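
To put numbers on that bound, a minimal Python sketch; the figures are just the standard Grover bound, and the gloss on each margin is my own:

```python
# Effective strength of an n-bit symmetric key under Grover's algorithm:
# brute force drops from ~2^n classical steps to ~2^(n/2) quantum steps.
for n in (64, 128, 256):
    print(f"{n}-bit key: classical work 2^{n}, Grover work ~2^{n // 2}")

# 64-bit:  Grover ~2^32  -> broken
# 128-bit: Grover ~2^64  -> marginal in the long term
# 256-bit: Grover ~2^128 -> comfortable margin
```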

Justin December 4, 2014 2:21 PM

There are a number of rather recent public key schemes designed to be quantum-resistant.

One is an implementation of a public key signature algorithm by Bernstein et al. called SPHINCS, which is based on Merkle trees; if I understand right, its security is provable with no hardness assumptions other than the difficulty of breaking the hash function.
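
SPHINCS itself is far more elaborate (a hash tree over many few-time keys), but the core idea of resting on nothing except the hash function can be sketched with a classic Lamport one-time signature. This Python toy is illustrative only, not SPHINCS:

```python
import os
import hashlib

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(msg, sk):
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    # Reveal one secret per message bit; each key pair signs ONE message only.
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(msg, sig, pk):
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits))

sk, pk = keygen()
sig = sign(b"hello", sk)
assert verify(b"hello", sig, pk)
```

The reduction is exactly the one described above: forging a signature means inverting the hash, and the Merkle tree in SPHINCS exists to manage many such one-time keys under a single public key.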

Then there is supersingular isogeny-based cryptography:

An ~O(p^(1/4)) quantum attack against these schemes…

So there is a lot of research lately into post-quantum cryptography, and it’s interesting to me that a lot of the impetus for research into quantum computing is its ability to break public key cryptography. An article in IBM Systems Magazine relates:

Q. What drives this push toward quantum computing?

A. It’s very clear that the biggest thing driving this today is the factoring application and its relationship to cryptography. …

MrC December 4, 2014 6:08 PM

@ Anura:

That was my first thought too — this could turn out to be generalizable and take down all of lattice-based cryptography. While I am certainly no expert, it seems to me that (variants on) McEliece for asymmetric encryption and Merkle signatures sound like the safe, conservative (and resource-intensive) choices that are least likely to suddenly blow up due to a mathematical breakthrough.

What’s really puzzling me on this topic is the apparent lack of urgency in moving over to post-quantum algorithms. It’s not remotely hard to predict that the world’s first practical quantum computer will be built by the US government, and that it will be kept secret when it happens. Nor is it hard to predict, thanks to the uncomfortable truths Mr. Snowden has shown us, that the NSA will aspire to copy every encrypted communication directly off the wires and decrypt it. The only real open question remaining is “when?” I don’t have any special knowledge for answering the “when?” question, but I do recall that most everyone was caught off guard by the revelations that the NSA was already doing things we thought they would never do, and I think we underestimate them again at our peril. Given this situation, I’d think that everyone from IETF to GPG to TOR to OTR would be rushing to get something quantum-proof in place. But that doesn’t seem to be happening.

Ray Dillinger December 4, 2014 8:07 PM

I have not yet really seen any public-key systems that I think can withstand widespread understanding of quantum computing.

The math is hard. I’ll admit that I don’t fully understand it. But the more people understand it, the more activity and thought it will attract, and the more it will advance.

Quantum crypto is still so new that what we understand about how to use it is sort of like what some Cro-Magnon understood about math after thinking for a while about the fact that one rock is somehow different from two rocks and how he could generalize that to know something about three rocks… He was no dummy, but numbers were (at that time) a completely new idea and he was just developing the first ideas about how to apply them.

I’m just saying, it’s early days. We don’t have enough understanding and experience of the math behind quantum computing yet to have any REAL confidence in the security of any public-key scheme in the face of a mature understanding and experience of it.

Thoth December 5, 2014 1:08 AM

@An
OTP would only be secure if a few conditions are met:
– Unpredictable RNGs (not easy to do, but plausible).
– Resistance to traffic analysis on message length: in an OTP the key is as long as the message (|Key| == |Message|), so if every message is padded up to some bound B (say B >= 80), then |Message| >= B and therefore |Key| >= B as well.

Not as easy to use an OTP as you thought. Tossing coins or using a CSPRNG would be a medium-strength option, but ensuring keys are never reused, keeping message/key lengths strong against traffic analysis, transferring keymats securely … a lot to consider.
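
A toy sketch of those conditions in Python. The zero-fill padding and the bound B = 80 are my own illustrative choices, and os.urandom is only CSPRNG-grade, i.e. the "medium strength" option above:

```python
import os

BOUND = 80  # pad every message up to B >= 80, per the condition above

def otp_encrypt(msg: bytes) -> tuple[bytes, bytes]:
    # Pad so the ciphertext length leaks only "length >= BOUND", not the
    # real size. (A real scheme needs unambiguous padding; zero-fill is
    # just for show.)
    padded = msg + b"\x00" * max(0, BOUND - len(msg))
    key = os.urandom(len(padded))  # |Key| == |Message| >= BOUND; never reuse
    return key, bytes(m ^ k for m, k in zip(padded, key))

def otp_decrypt(key: bytes, ct: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ct, key))

key, ct = otp_encrypt(b"attack at dawn")
assert otp_decrypt(key, ct).rstrip(b"\x00") == b"attack at dawn"
```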

Regarding QC, we don’t understand enough to properly generate and vet proofs. We have too few people in the field who claim to know QC and can actually do the job. I am skeptical whether the spooks even know what they are talking about.

Just a note: most military/government agencies still use symmetric crypto rather than asymmetric crypto, since it’s a proven and robust field.

Clive Robinson December 5, 2014 8:10 AM

The real problem with QC is that it goes not after message cryptography but key distribution cryptography.

The thing is, we know that PKcerts are broken badly in our current CA –or for that matter any other hierarchical– system, and have been for getting on for a quarter of a century, if not a lot longer (if the observation about history not teaching the present is anything to go by).

Bruce once observed that by and large we had solved the easier task of secure –symmetrical– algorithms, and we needed to get on with the harder task of key distribution. Since then the progress has been most noticeable by the lack of announcements…

The “only new kid on the block” getting any attention is QKD, which while –supposedly– theoretically secure is practically more like a second-hand pair of string underpants. But even if its practical security problems can be solved, QKD has other problems that are very unlikely to be overcome, those being its lack of useful range and its p2p nature.

I suspect that many at the likes of the NSA, GCHQ et al have breathed a sigh of relief that we have not solved the Key Distribution problem. If, as some are starting to suspect, there will not be a PubKey algorithm that is secure against QC, then the future of many things now taken for granted will most definitely be insecure, not least the Internet’s commercial reason for existing.

Whilst this may horrify some people, it may be no bad thing in one area, in that it will force the likes of banks and other financial institutions to set up their own key negotiation services and take back in house the financial liability they thought they had externalised.

Again it might have other benefits, in the likes of tokens that are both secure and usable (though I might be stretching it a bit on the latter point). It might even see the likes of satellite “numbers stations” producing TRNG-type output that can be used in protocols to act as a time-based “back stop”.

It will be interesting to see if academia does start to get to grips with the KeyDev issue in time to be the overall winner in the race against QC…

@ Thoth,

With regards “key reuse”, there is no system currently known that will provably prevent it. The only solution is to mitigate it to the point where the probability of it occurring is very low by a significant margin. The simplest way is to use CSPRNGs based on one or more stream ciphers driving block ciphers in a cascade formation, where the resulting block size is sufficiently large, as is the overall relevant combined key bit size, which I can see being upwards of a thousand bits depending on the required mitigation margin.
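
A minimal sketch of that cascade idea, assuming the Python `cryptography` package. The specific ciphers (ChaCha20 feeding AES) and the key sizes are my assumptions, not Clive's exact design, and the combined key here is only 512 bits rather than the thousand-plus he suggests:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def cascade_random(n_blocks: int) -> bytes:
    chacha_key, aes_key = os.urandom(32), os.urandom(32)  # independent keys
    nonce = os.urandom(16)
    # Stage 1: stream cipher. Encrypting zeros yields the raw ChaCha20 keystream.
    stream = Cipher(algorithms.ChaCha20(chacha_key, nonce), mode=None)
    keystream = stream.encryptor().update(b"\x00" * (16 * n_blocks))
    # Stage 2: drive a block cipher with it, whitening each 16-byte block
    # under an independent AES key. (ECB is tolerable here only because the
    # input blocks are already pseudorandom.)
    aes = Cipher(algorithms.AES(aes_key), modes.ECB()).encryptor()
    return aes.update(keystream)

print(cascade_random(4).hex())
```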

Thoth December 5, 2014 10:25 AM

@Clive Robinson
It’s really hard to track key reuse. Who would have known there might be a chance that the exact 256-bit key I used, bit for bit, had already been used by someone else? Such a possibility exists. The only way to prevent key reuse on a personal level is to record hashes of used keys for a particular cipher or cascading cipher, but that might have security issues of its own?
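
For scale, a quick birthday-bound estimate in Python; the figure of 2^64 keys ever generated is my own deliberately generous assumption:

```python
# P(any two of k random 256-bit keys collide) ~ k^2 / 2^257 (birthday bound).
k = 2 ** 64                  # assume 2^64 keys ever generated, worldwide
p = (k * k) / (2 ** 257)
print(p)                     # ~1.5e-39: with a sound RNG, reuse is a non-issue
```

So, as Leon notes below, the realistic risk is a weak RNG clumping its outputs, not the size of the key space.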

Regarding key distribution, it’s the hardest part of crypto and has always been that way. EKMS and the like have not stopped key leakage if a person willingly attempts to steal keys during the key distribution phase; EKMS systems only make stealing keys much harder. Same for HSMs: they only make stealing keys a little harder, but not terribly hard for someone willing to put some sacrifice on the table, especially in large organisations.

There are a lot of discrete-log-based PKI algorithms out in the wild, and recently a new PKI algorithm similar to RSA and DH appeared in the IACR ePrint archive. It’s about time research attention moved to QC, although it’s far from clear how long it will be before a properly working QC machine, running discrete-log-breaking algorithms, comes into the picture to kill off the ECC and RSA stuff.

If I remember correctly, some of the proposals involve exchanging some kind of garbled circuits as their interactive method of establishing connections. Not sure how useful that would be.

Nick P December 5, 2014 11:20 AM

@ Clive Robinson

I think the NSA further sees QKD as a potential gold mine of SIGINT in the event of widespread adoption. Subversion during manufacturing would be so much easier given that these are complex black boxes based on esoteric math and/or equipment. NSA’s prime MO is to make the vulnerabilities look like typical accidents, and brand-new security schemes/products usually have plenty.

These points apply to other High Strength Attackers as well, especially competing nation states with plenty of espionage activity.

Nick P December 5, 2014 11:23 AM

@ Clive, Thoth

re key reuse

For reuse to matter, an attacker must be able to notice the pattern in the data indicating reuse. I’ve always mitigated it with cascading algorithms, even with an OTP. It was nice to see Markus do the same in TFC.
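
One way to read "cascading algorithms" is plain superencryption under independent keys, so a reused or weak key in one layer is still covered by the other. A minimal sketch, again assuming the Python `cryptography` package, with cipher choices that are mine rather than Markus's TFC design:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def cascade_encrypt(pt: bytes):
    k1, n1 = os.urandom(32), os.urandom(16)  # layer 1: ChaCha20
    k2, n2 = os.urandom(32), os.urandom(16)  # layer 2: AES-256-CTR, independent key
    layer1 = Cipher(algorithms.ChaCha20(k1, n1), mode=None).encryptor().update(pt)
    layer2 = Cipher(algorithms.AES(k2), modes.CTR(n2)).encryptor().update(layer1)
    # An attacker must break (or know the key of) BOTH layers to recover pt.
    return (k1, n1, k2, n2), layer2
```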

Anura December 5, 2014 3:53 PM

@Rune K. Svendsen

Check out this:

http://sphincs.cr.yp.to/

It’s very new, so it needs much more cryptanalysis before being put into use, but it’s promising for signatures. Of course, this still leaves key exchange, which is arguably the more important piece.

Thoth December 5, 2014 6:25 PM

@Nick P
TFC is a good example of a prototype secure comms system (albeit not perfect). It would probably be a good idea to have widespread adoption of cascading ciphers, as in TFC and Truecrypt, for as much communication crypto and storage crypto as possible. The only thing I worry about is colliding key spaces, whereby someone somehow generates a key that already exists on a computer in another geographic location and may have been intercepted (RNG issues), which may be why the NSA has seemed particularly interested in the RNG part of crypto recently.

I am pretty sure NSA/GCHQ/BND and other HSAs wouldn’t miss a nice chance to twist and corrupt newly maturing crypto like the QKD/QC crypto schemes for their own selfish needs. And who knows whether FBI/CIA/British MI agencies etc. might add a few more kicks to the QC stuff and make it more compatible with virtual civil surveillance and interdiction dominance, in the name of anti-terror again (which can be used against their own civilian agencies one day), like a snake biting and poisoning itself.

Leon Wolfeson December 6, 2014 10:27 AM

@Thoth – Well, there are things you can do on reuse.

Like making sure the random number generator used to generate keys has a decent degree of randomness. Otherwise, well, clumping and higher chances of reuse.

Vatos December 8, 2014 9:23 AM

I wonder whether quantum attacks will ever be possible on, say, a 4096-bit key. Is there any particular reason to think that they will be viable?
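
One way to put a number on "viable", assuming the 4096-bit key is RSA: Beauregard's well-known circuit runs Shor's algorithm on an n-bit modulus with about 2n + 3 logical qubits (error correction multiplies that by orders of magnitude in physical qubits):

```python
# Logical qubits needed for Shor's algorithm under the 2n+3 estimate.
for n in (1024, 2048, 4096):
    print(f"RSA-{n}: ~{2 * n + 3} logical qubits")
# RSA-4096: ~8195 logical qubits, versus a handful demonstrated as of 2014
```

So nothing about 4096 bits is special; doubling the key size only doubles the qubit count, making it purely a question of engineering scale.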

J-F Biasse May 12, 2015 5:03 PM

I am currently undertaking research on similar topics, and I recently learned that the GCHQ attack was not a polynomial-time one (despite what is claimed in their draft).

To make the attack polynomial time, you need to solve what is called the Principal Ideal Problem (finding a generator of a principal ideal). I am working on that problem and I am closing in on a polynomial-time solution (in collaboration with F. Song).

I was originally skeptical about the GCHQ attack, and my feelings were shared by other people in the community. However, it was hard to tell whether the online document was simply lacking details, or whether the research had not been done.

A few weeks ago, I had the opportunity to meet with the authors of the SOLILOQUY draft while I was attending a workshop in Bristol. I learned that the claim that the attack runs in quantum polynomial time is an overstatement that they cannot support. It seems that the approach they chose might actually not be valid, for at least two important reasons which I described here: https://groups.google.com/forum/#!topic/cryptanalytic-algorithms/GdVfp5Kbdb8

I also learned that they misinterpreted recent work from Eisentrager, Hallgren, Kitaev and Song, which led them to think that there was a published proof of the statements they were working on (I was the one who told them this was not the case). Although they deny it, it seems likely that this played a role in convincing them of the weakness of their SOLILOQUY crypto scheme.

Since they are claiming a result on which I am currently working without actually having it, I asked them to modify the claim in the online draft (removing one line about the polynomial run time would suffice). The authors agreed with me, but they later claimed that their supervisors would not let them.

According to them, this whole thing is part of a communication strategy aimed at appearing even more knowledgeable than the scientific community. Indeed, they keep claiming in their draft that they were ahead of the game. Their hope is that this will give them credibility when it comes to making recommendations on cryptography. That seems incompatible with admitting they were wrong and modifying their statement.

This situation is unfortunate because it creates confusion in the community. Indeed, this draft has already been cited, even after one of its authors publicly confirmed they did not know whether their algorithm was polynomial time. Moreover, it interferes with the work of people like me who are trying to get an actual polynomial-time algorithm (future reviewers may reject my paper).

Jean-Francois Biasse

J-F Biasse May 12, 2015 5:08 PM

@Justin

FYI: the paper http://cacr.uwaterloo.ca/techreports/2014/cacr2014-24.pdf that I coauthored (and presented at Indocrypt 2014) is not an attack against the schemes

a) Luca De Feo, David Jao, and Jerome Plut, Towards quantum-resistant cryptosystems from supersingular elliptic curve isogenies

b) David Jao and Vladimir Soukharev, Isogeny-based quantum-resistant undeniable signatures

To break these, you need an isogeny of a given degree, while all we return is some isogeny of arbitrary degree.

However, it is an attack against the isogeny-based Charles-Goren-Lauter hash function.

Jean-François Biasse

Luiz Carvalho May 22, 2015 6:48 PM

The Supersingular Isogeny Key Exchange seems like a good post-quantum replacement for Elliptic Curve Diffie-Hellman. We spent 20 years studying the mathematics of elliptic curves, and the academic community seems to have a lot of confidence in its understanding of their security. The only problem is that elliptic curve Diffie-Hellman doesn’t hold up against a quantum computer. The isogeny key exchange builds on that knowledge base and code base and provides the needed boost to security.

The lattice-based post-quantum schemes seem too unproven to trust for the next 5 to 10 years. They may be secure, but they may fall apart with the next improvement in lattice attacks. At the moment the elliptic curve isogeny approach seems pretty sound.
