NIST Starts Planning for Post-Quantum Cryptography

Last year, the NSA announced its plans for transitioning to cryptography that is resistant to a quantum computer. Now, it’s NIST’s turn. Its just-released report talks about the importance of algorithm agility and quantum resistance. Sometime soon, it’s going to have a competition for quantum-resistant public-key algorithms:

Creating those newer, safer algorithms is the longer-term goal, Moody says. A key part of this effort will be an open collaboration with the public, which will be invited to devise and vet cryptographic methods that—to the best of experts’ knowledge—­will be resistant to quantum attack. NIST plans to launch this collaboration formally sometime in the next few months, but in general, Moody says it will resemble past competitions such as the one for developing the SHA-3 hash algorithm, used in part for authenticating digital messages.

“It will be a long process involving public vetting of quantum-resistant algorithms,” Moody said. “And we’re not expecting to have just one winner. There are several systems in use that could be broken by a quantum computer­—public-key encryption and digital signatures, to take two examples­—and we will need different solutions for each of those systems.”

The report rightly states that we’re okay in the symmetric cryptography world; the key lengths are long enough.

This is an excellent development. NIST has done an excellent job with their previous cryptographic standards, giving us a couple of good, strong, well-reviewed, and patent-free algorithms. I have no doubt this process will be equally excellent. (If NIST is keeping a list, aside from post-quantum public-key algorithms, I would like to see competitions for a larger-block-size block cipher and a super-fast stream cipher as well.)

Two news articles.

Posted on May 9, 2016 at 6:19 AM


W May 9, 2016 6:30 AM

I hope that this will help with the patent encumbrance, either because it’s a requirement for participation or because of NIST buying a liberal license for everybody.

Clive Robinson May 9, 2016 6:54 AM

I really wish NIST would take a step back and think before holding yet another crypto-algorithm contest.

All such previous contests have been flawed in one or more major ways, with the result that the algorithms have not lasted as long as the “expected lifetime” of the equipment they are put in.

Realistically, there is no reason to suppose this problem is going to change any time soon.

Thus NIST needs to think about how you design systems that allow for the obsolescence of these low-level algorithms, and, just as importantly, how you migrate from one algorithm to another without the security flaws that thoughtless upgrading engenders.

I’ve mentioned this several times in the past, but NIST, under NSA guidance, has repeatedly ignored the issue. Everyone has paid the price for this, primarily because NIST has acted improperly under NSA guidance.

Tommy Dohn May 9, 2016 7:18 AM

NIST has done an excellent job with their previous cryptographic standards, giving us a couple of good, strong, well-reviewed, and patent-free algorithms.

Sure, if you ignore Dual_EC.

Mark May 9, 2016 7:28 AM

NIST really are slow.

I did manage to upload volumes 0.0 through 10.0 to iTunes though.

Cryptopocalypse NOW 01 04 2016 Volumes 0.0 through 10.0 are available via :

It’s a fictional biography, so I don’t get sued, but I can still progress with my legal action against NEC, A.G.S.V.A., A.S.I.O., ORACLE, et al.

Updates coming in June.

Enjoy !

I have a few promo codes if you would like a freebie.

DavidFMM May 9, 2016 7:44 AM

Since “quantum computing” is still a fantasy, just exactly how is someone going to test whether an algorithm is resistant to a quantum based computing scenario?

May 9, 2016 7:50 AM

Is quantum computing good for anything but sniffing around? Or why is Google/NSA investing so much in it?

Mark May 9, 2016 8:25 AM

Ohh, to coin a phrase,

I’ve been playing with ‘No More Secrets’ ( ), it does have some amusement, especially for those old enough to remember 1992 and wear sneakers.

banner NSA| nms -a -f green ; banner Back | nms -a -f green ; banner Door | nms -a -f green ; banner Cryptology | nms -a -f green

is an interesting concept.

But you will need to reduce the font size of your terminal program of choice, and place your computer monitor at a 90 degree angle.

If you have a Mac, just record it, and rotate the video 😉

If my permit from D.E.C.O. / A.S.D. is not delivered by June 6, the fourth anniversary of my undergoing robotic surgery for localized prostate cancer, FooCrypt, A Tale of Cynical Cyclical Encryption, WILL be published using ‘No More Secrets’. Of course it will be recorded into H.264 video format, and uploaded into various Volumes on iTunes.

According to my current legal advice, apparently I can supply and publish cryptographic software in a video, whereby the video image of the cryptographic software is obfuscated by a currently unknown / unpublished algorithm that doesn’t require a permit from D.E.C.O. / A.S.D. to use.

Kinda pointless, if you want to encrypt / decrypt something though !

But what a sales model though.

Buy the book, watch the video, follow the instructions in the following chapters in order to run forensics over the video file(s) and extract the software, which has been exported without a PERMIT……;)

It’s nearly as good as the old saying, around monkeys randomly typing out the complete works of William Shakespeare.

I know, it’s silly hour here, but I’ve been trying to leave Australia for about 2 weeks now; unfortunately, I can’t until D.E.C.O. / A.S.D. finalize my permit(s).

I have also been restricted from writing cryptology for anyone out side of Australia until the permit(s) are finalized.

Makes for a great story though..;)

Thoth May 9, 2016 8:50 AM

@Clive Robinson
Finally NIST has started to talk about algorithm agility together with quantum resistance, which you have been nagging them about for a long time 🙂 . Hopefully they implement a properly algorithm-agile PQC cipher suite.

Coz May 9, 2016 9:02 AM

Maybe a stupid (but genuine) question: how can a ‘quantum resistant algorithm’ be developed when quantum computers aren’t properly a thing yet? The capabilities and properties of those computers can’t be fully known yet, and no algorithm can be tested against them. Is it just that they’d be harder to crack in general?

z May 9, 2016 9:36 AM

I’m uneasy about this. I have no problem with holding a competition to get the ball rolling; that’s clearly a good idea. I just think it would be very easy to get it wrong or to subvert it given that far less is known about PQC and it would be impossible to test in the real world. Plus if I was the NSA, I’d jump on this opportunity for two reasons:

A.) the risk of getting caught is lower since it is still a very new field and nobody has the hardware to attack it yet. It would be like holding a block cipher competition in the 1950s.

B.) I’d likely be the first (and only, for a while) attacker in the world with an actual QC to attack them, so it introduces little risk for the time being. The world would still be secure using them, except from us, which is the holy grail of government surveillance.

Would they actually try to do that? I would like to say that the US gov has learned its lesson about backdoors, but it has strangely gotten worse than ever before. I wouldn’t put it past them.

Igor May 9, 2016 10:14 AM

As others here have already indicated, there is still a trust issue here between NIST and the community at large given historical NSA subversion. How do we know this is not going to be a problem in future? Too early to tell imo.

ianf May 9, 2016 10:23 AM

         what you wrote in the last two comments sounds intriguing but also wholly unintelligible and opaque to non-cognoscenti. Perhaps you’d care to summarize what this/ your apparent spectrum of conflicts with the AU laws/ is about. No need to go into detail, just the skinny.

All I can say is that, when trying to view the iBooks item you posted from Continental EU, I get the iBooks store’s titlebar with “Mark A. Lane,” and the Library button, then a blank slate followed by the taskbar at the bottom. When explicitly searching for “Mark A. Lane,” a number of items comes up, but the only two authored by Mark Lane are “Citizen Lane” (2012), and “Legendary Locals of Daytona Beach” (2015), neither of which sounds like it’d be your cryptosumphin’. Nor does editing the URL by removing the country origin make a difference.

Tim March May 9, 2016 10:40 AM

The whole thing is a failure from the start. First, the knowledge base of crypto expertise is meaningless. Imagine a busload of WW II vets getting dropped into a firefight in Fallujah.

Case in point: You’re STILL referring to them as ciphers.

Second, the knowledge base of physicists is meaningless because few understand information theory. If you had attended a single conference on information theoretic security in the past five years you would not be optimistic. Buy the publications. See for yourself they do not have a solution. Most of that stuff was written for the sole purpose of satisfying some part of a PhD. You know, there is already so much conceit and arrogance and contempt between cryptographers and mathematicians. Go back and read some of the stuff on this blog which dissed information theory. This is just a stupid throwback to the 90’s. The handwriting is on the wall, but this is all NIST knows.

If you think QC is still years away, I can tell you with absolute certainty that practical QC resistant solutions are at least twice as far away. All they’re going to do is spew crap no one understands, that’s impossible to vet, that’s a giant pain in the ass to use, and every BS claim they make will eventually be disproven.

My advice? Exclude crypto “experts” because they will bog everything down with new versions of old stuff.

B May 9, 2016 11:37 AM

The post-quantum candidates are going to take a long time to be trusted for anything real. Right now the concrete security level achieved by a given set of parameters is still a research topic for prominent candidates (lattice systems, isogeny-based systems). There is also the complexity — for systems based on LWE there are at least 3 parameters that matter, or up to 5 depending on certain design choices, and if you go “by the book” you need to sample from a discrete Gaussian distribution (not hard, but not common — most programmers have not seen this).
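[To illustrate the sampling step B mentions: below is a deliberately naive sketch of drawing from a discrete Gaussian by rejection sampling, in Python. The function name and tail-cut parameter are made up for the example; real LWE implementations use constant-time samplers (e.g. CDT or Knuth–Yao tables), not this. — Ed.]

```python
import math
import random

def sample_discrete_gaussian(sigma, tail=10):
    """Sample z from a discrete Gaussian over the integers with
    parameter sigma, by rejection: propose z uniformly in
    [-tail*sigma, tail*sigma], accept with probability
    exp(-z^2 / (2*sigma^2))."""
    bound = int(math.ceil(tail * sigma))
    while True:
        z = random.randint(-bound, bound)
        if random.random() < math.exp(-z * z / (2.0 * sigma * sigma)):
            return z
```

This is enough to see why the step trips people up: the sampler is data-dependent (the rejection loop leaks timing), which is exactly the kind of side channel a hardened implementation has to design away.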

Basically there are a ton of ways things can go wrong.

A personal concern of mine is the general lack of expertise. For ECC this is pretty bad — only maybe 100 people in the world actually know how to generate good EC parameters — but for PQ candidates it is even worse. At CRYPTO it is common for half the attendees to leave the room during lattice crypto talks, and even those that stay mostly have trouble following. The mathematical background for lattice or isogeny-based crypto is well beyond the majority of cryptography researchers or engineers. McEliece is somewhat better since it is based on error-correcting codes, but the cryptanalysis knowledge for it is lacking (compared to what we are using today).

Fortunately these problems can be overcome, and even ignoring QC (which I view the same way as cold fusion: always one engineering problem away, regardless of how many engineering problems are solved) there are advantages to some of the PQC candidates. The systems can be computationally faster for encryption / decryption, and side-channel resistance can sometimes be more easily achieved. There are also some interesting “cryptomania” applications that are becoming more prominent (e.g. MPC).

Mark May 9, 2016 11:44 AM


sorry about the EU block of my titles on iTunes due to being listed in Oz.

Should we blame the NSA ?

Now available through iTunes – iBooks Volume 0.0 @
Now available through iTunes – iBooks Volume 1.0 @
Now available through iTunes – iBooks Volume 2.0 @
Now available through iTunes – iBooks Volume 3.0 @
Now available through iTunes – iBooks Volume 4.0 @
Now available through iTunes – iBooks Volume 5.0 @
Now available through iTunes – iBooks Volume 6.0 @
Now available through iTunes – iBooks Volume 7.0 @
Now available through iTunes – iBooks Volume 8.0 @
Now available through iTunes – iBooks Volume 9.0 @
Now available through iTunes – iBooks Volume 10.0 @

As for the last 2 sentences.

The software that I utilize [ cryptology ] is in breach of export controls without a permit from D.E.C.O. / A.S.D. Hence, I can’t take my computers / software / etc outside of Australian borders without a completed permit assessment.

According to my last conversation with D.E.C.O., I also cannot, as an Australian citizen, write, create, or modify cryptology software outside of Australian borders without a permit. Hence, popping overseas and writing something which is covered by D.E.C.O. due to the ‘Wassenaar Arrangement’ [ that’s not just Oz, by the way ] requires a permit.

If you have a few hours, read a volume or read what’s dumped on

Gweihir May 9, 2016 11:47 AM

Since any meaningful way to scale quantum computing up to the sizes required to threaten current or past RSA key lengths has eluded researchers despite considerable effort, I have to wonder whether this is an attempt to get vulnerabilities into public-key algorithms. Sure, they would need something that looks pretty good, and where the backdoor possibility is very much non-obvious. Also, after their backdoored CSPRNG, people will look extra carefully.

But the suspicion is there, as the need for QC-resistant public-key crypto seems very much contrived at this time. I mean, QC seems to scale much worse than quite a few alternative approaches to computing hardware that have failed. QCs are basically almost where they were 20 years ago, and hence may well scale with the log of the years invested in research, i.e. a sort of “inverse” Moore’s law. If so, they will never even factor numbers that my 25-year-old programmable pocket calculator could factor in a minute.

qwertty May 9, 2016 12:32 PM


Not an expert, but as far as I understand it, even though there are no practical quantum computers yet (some qubits here and there, but nothing that gets even remotely close to the power of everyday desktops), the properties a quantum computer would have have been studied quite extensively (which is why people are so willing to invest huge sums of money into making one).
As with regular computers, there will surely be some people who find novel ways of using one if/when it becomes available to the general public, and they might well be able to do things we didn’t think possible. But in the meantime, there are some crypto schemes that are known to be easy to crack on a quantum computer, even though breaking them would take thousands of years on regular computers (like factoring large numbers). So it seems to me that this is more of a “This won’t work anymore” than a “This will hold until the end of times” kind of issue. Though I could be wrong.

Now for a question of my own:
I did hear from one of my professors that an algorithm for factoring large numbers in O(n^6) has already been found (at least in theory). I forgot the names of the people involved, and couldn’t find anything useful on the net (didn’t search for too long, though). Does anybody know anything about this?

Anura May 9, 2016 1:33 PM

Is a larger block cipher really that advantageous? There are performance advantages to a smaller block size, primarily on 32-bit processors. x86 has only seven registers that you can readily make use of; more than that and you have to start spilling to the stack. Of course, you might be able to write the implementation so it is done entirely in the SSE registers. I’d personally like to see another 128-bit block cipher that is both significantly faster than AES and more secure, and that uses only arithmetic operations.

z May 9, 2016 1:42 PM

@ Anura

I was actually just telling someone the other day that another block cipher competition would be advantageous, whether with a wider block size or not. We know much more about side channel attacks now than in the late 90s when Rijndael was being developed, especially timing attacks that have proved devastating to AES on more than one occasion. There are also use cases such as smartphones and IoT devices that were not on the horizon at the time. This might also be a good time to define the standard to use authenticated encryption modes whenever possible.

National Institute of Surveillance Tools May 9, 2016 3:03 PM

NIST has done an excellent job with previous cryptographic standards? No one in their right mind is going to trust them after their complicity (or helpless ineptitude) in clandestine sabotage.

They’ve betrayed the trust of any independent institution or state.

Anura May 9, 2016 3:27 PM


I could see justification for two algorithms, one with a 128-bit block size targeting 32-bit processors and embedded systems, and one with a 256-bit block size targeting 64-bit processors.

I should note that there can be advantages to 256-bit block ciphers (which my previous comment makes it seem like there aren’t) in that they are inherently harder to break, and they can have more throughput on 64-bit processors (while potentially killing performance on 32-bit processors). For example, take a 128-bit ARX cipher defined across four 32-bit words, then redefine it across four 64-bit words and choose new constants. Even if it takes the same number of rounds to be secure, it will encrypt at 2x the speed on a 64-bit processor (assuming you don’t need more rounds, which you might for an ARX cipher, but probably not twice as many; although if your algorithm includes multiplication modulo 2^n, you might even be able to get away with fewer rounds than are necessary on a 32-bit processor).
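[As a toy illustration of the “same design, wider words” idea above — not any real cipher; the step and rotation constant are invented for the example — here is one add-rotate-xor step parameterized by word size, in Python. — Ed.]

```python
def rotl(x, r, w):
    """Rotate the w-bit word x left by r bits."""
    mask = (1 << w) - 1
    return ((x << r) | (x >> (w - r))) & mask

def arx_step(a, b, w=32):
    """One toy ARX (add-rotate-xor) step on two w-bit words.
    Widening w from 32 to 64 doubles the data mixed per operation;
    a real design would re-tune the rotation constants and possibly
    the round count."""
    mask = (1 << w) - 1
    a = (a + b) & mask        # addition modulo 2^w
    b = rotl(b, 7, w) ^ a     # rotate, then mix with xor
    return a, b
```

On a 64-bit CPU, the w=64 variant costs roughly the same number of instructions per step as w=32 while moving twice the data, which is the throughput argument being made here.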

You can also build authentication into the algorithm, making it more efficient than authenticated modes like GCM or CCM.

MrC May 9, 2016 3:37 PM

@ DavidFMM & Coz:

We’ve known what the operational behavior of a quantum computer would be if we could build one since the 1980s; it’s the engineering part that we don’t have figured out. Accordingly, it’s possible to write “programs” for a quantum computer, even though we don’t yet have a machine to run them. Such is the case with Shor’s Algorithm, which is the algorithm that would easily break RSA, ECC, and Diffie-Hellman, if the hardware existed to run it. While we will probably eventually see new attacks that require a quantum computer to execute, those attacks will also depend on new mathematical breakthroughs. All the advances that depend solely on having a working quantum computer have probably already been thought of.

@ Universe:

Can anyone recommend a good plain-English explanation of the McEliece cryptosystem? All the literature I can find assumes the reader already has a pretty deep understanding of linear error correcting codes (and all the literature I can find on linear error correcting codes also assumes the reader already has a pretty deep understanding of linear error correcting codes). I’m looking for something that explains the various terms of art and symbolic notations in plain English, rather than tersely defining them by reference to other terms of art and symbolic notations that are equally obscure.

Clive Robinson May 9, 2016 8:12 PM


    Unfortunately, factoring prime numbers is one of the things…

I wish people would stop saying…

Drone May 9, 2016 8:51 PM

There’s no point in this because the Gubbment is gonna force them to put a back door anyway.

Clive Robinson May 9, 2016 8:57 PM

@ z,

Well to be fair, a QC is probably pretty fast at it. 🙂

Yes it would, if they could ever get a QC working beyond a hand’s worth of digits…

However I’ve had a creepy premonition that I’ve cursed myself over “factoring primes”, and that it is going to come back and haunt me: when saying something like “factoring for primes” I’m going to for-get something 😉

Anura May 9, 2016 9:09 PM

@Clive Robinson

I’m pretty sure if you know your input is prime, you can factor it with only one qubit.

precompute_is_better May 9, 2016 10:35 PM

Well to be fair, a QC is probably pretty fast at it. 🙂

If you want to factor primes even faster than a quantum computer ever could, it’s really quite easy! Simply write “1 × P” on a sticky note, and then affix it somewhere near eye-level at your workstation.

On the other hand, having an O(log^3 n · log log n · log log log n) or better algorithm to determine if a ‘prime’ number is actually prime would really come in handy, assuming that factoring large numbers still proves to be too difficult!
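[Fast primality testing is in fact routine. For concreteness, here is a standard Miller-Rabin probabilistic primality test in Python — a sketch for exposition, not a vetted implementation; Python’s three-argument pow makes each round cheap. Which algorithm matches the exact asymptotic bound quoted above is left to the comment’s author. — Ed.]

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test.
    Returns False for composites; returns True for primes
    (and, with probability <= 4^-rounds, for a composite)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):       # trial-divide small primes
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)   # random witness candidate
        x = pow(a, d, n)                 # modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = (x * x) % n
            if x == n - 1:
                break
        else:
            return False                 # a proves n composite
    return True
```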

RonK May 10, 2016 1:28 AM

While I think that it is good to push research effort in the direction of post-quantum cryptography, I find it puzzling how this competition could work. If I am not mistaken, up to now the algorithms were judged on current hardware, but for this competition we have no idea what architectural details the non-quantum hardware will have at the future time when the work becomes interesting to use, if ever.

Markus Ottela May 10, 2016 5:01 AM

The report rightly states that we’re okay in the symmetric cryptography world; the key lengths are long enough.

Hi Bruce! Could you please give a short comment on choices of key and block sizes for Threefish? 256-bit keys should be long enough, yet Threefish also offers 512-bit and 1024-bit keys.

Cassey Tomczak May 10, 2016 5:19 AM

Thank you for calling the NIST sales helpdesk. Your call is important to us. If you would like us to use your PRNG in our next certified product, please press 1.

Mark May 11, 2016 12:59 PM

@ianf, et al

I listed some details regarding ‘FooCrypt,….’ on my store located @ which is located on

‘FooCrypt, A Tale of Cynical Cyclical Encryption, is a symmetric cryptology solution, which is theoretically capable of a key length up to INFINITY.’

PRE ORDER’s currently being accepted at a reduced price, only for personal use.

D.E.C. ( They dropped the office ), permit to export / supply / publish around June, 2016.

Only to countries not currently under embargo restrictions.

Mark A. Lane

Anura May 11, 2016 1:35 PM

@Markus Ottela

The Skein hash function, for which Threefish was developed, offers several different sizes for the internal state, and for a different size of state it requires a block cipher with a larger block size. Threefish is designed to have the key length equal the block length; from what I can tell, that’s primarily for the sake of simplicity.

Assuming that your cipher remains secure, there is no advantage to a key size larger than 256 bits. However, if a weakness is discovered down the line, a larger key size could potentially offer a higher security margin. That is, you might be able to recover a 256-bit key in 2^48 operations while a 512-bit key still takes 2^96 operations to break; both are considered broken, but only one of those keys can be recovered in practice.

r May 11, 2016 8:13 PM


The reason for a post-quantum crypto event is to design something for today’s commodity hardware that will be resistant to tomorrow’s advances in hardware and revelations.

David Svarrer May 13, 2016 2:19 AM

Here is an encryption algorithm which can likely not be broken:

Simply shut up. If you know something and tell at least one person, then it’s not a secret anymore, anyway. It’s, by the way, more important whom you tell something in the first place. Besides which, if it was supposed to be secret, why are you working so hard to 1. spread it, and 2. use means which prevent spreading it?

Here is another one: Run your tablet, mobile, gadget, laptop, computer, … through a shredder, then use other means of communication.

Yet another encryption method: Viterbi-optimized, caesium random-pattern XOR encryption distorted, pattern-reconstruction prevention. This one is simple, and the intermediary result is truly random, laced with FIPS 140-2 strong encoding, so when decrypting it by brute force one has to deal with true randomization, ..

Or. Stay on the right side of the law, try, just try to live a life without expressing your greed…

MarkH May 13, 2016 8:05 AM

I appreciate B’s comment, especially the comparison of QC to cold fusion.

However, because I am deeply skeptical that cold fusion will ever achieve net energy production, and I consider useful QC to be unlikely but just maybe doable, I would compare usable QC to usable hot (laser/Tokamak) fusion: not likely to happen in my lifetime, but not altogether implausible.

Considering that proving out QC-resistant public-key algorithms is expected to take years (and a very big investment), it makes sense to get the project started now.

I strongly share B’s concern about the complexity of the PK algorithms expected to be resistant to large-scale QC attacks. A terrific virtue of RSA is that it is so very simple to understand and implement.

Maybe with a little luck, someone will devise a new simpler QC-resistant PK algorithm before the first practical QC (if that day ever comes).

In the meantime, anybody with very stringent security concerns (for secrets which must be protected over a long time span) can use existing PK algorithms with painfully large key sizes.

Obviously, scaling QC today is VERY hard, and even with future breakthroughs, getting from about 2000 qubits (needed for 1024 bit RSA, long considered unsafe) to about 16000 qubits (needed for 8192 bit RSA) would likely take a lot of time and investment … if it can ever be done at all.

r May 14, 2016 2:38 PM

Something else, about ECC (error-correcting codes / Goppa, etc.)… I don’t know why, but something about qubits makes me nervous with it. Maybe it’s my lack of understanding of QC, Shor’s, LWE, or McEliece… but something about qubits’ ‘guessing’ just makes ECC seem weak.

jon August 23, 2016 12:26 AM

New quantum-resistant algorithms that come out of the competition will most likely be the “almost best,” if that at all, which will leave the real heavyweight solutions in the skunkworks of the known heavy-hitter nation states!
