Update on NIST's Post-Quantum Cryptography Program

NIST has posted an update on their post-quantum cryptography program:

After spending more than three years examining new approaches to encryption and data protection that could defeat an assault from a quantum computer, the National Institute of Standards and Technology (NIST) has winnowed the 69 submissions it initially received down to a final group of 15. NIST has now begun the third round of public review. This “selection round” will help the agency decide on the small subset of these algorithms that will form the core of the first post-quantum cryptography standard.


For this third round, the organizers have taken the novel step of dividing the remaining candidate algorithms into two groups they call tracks. The first track contains the seven algorithms that appear to have the most promise.

“We’re calling these seven the finalists,” Moody said. “For the most part, they’re general-purpose algorithms that we think could find wide application and be ready to go after the third round.”

The eight alternate algorithms in the second track are those that either might need more time to mature or are tailored to more specific applications. The review process will continue after the third round ends, and eventually some of these second-track candidates could become part of the standard. Because all of the candidates still in play are essentially survivors from the initial group of submissions from 2016, there will also be future consideration of more recently developed ideas, Moody said.

“The likely outcome is that at the end of this third round, we will standardize one or two algorithms for encryption and key establishment, and one or two others for digital signatures,” he said. “But by the time we are finished, the review process will have been going on for five or six years, and someone may have had a good idea in the interim. So we’ll find a way to look at newer approaches too.”

Details are here. This is all excellent work, and exemplifies NIST at its best. The quantum-resistant algorithms will be standardized far in advance of any practical quantum computer, which is how we all want this sort of thing to go.

Posted on July 24, 2020 at 6:36 AM • 34 Comments


Random Commenter July 24, 2020 6:48 AM

Cryptography specifics go way above my head, so I can only hope NIST has learned its lesson regarding the NSA…

Larry the tech wannabe July 24, 2020 8:17 AM

Excuse my ignorance, but why does anyone trust any part of “the government”? NSA, NIST, FBI?
Same as when people call for more regulation. I get that companies have to be held accountable, but as the old question goes, who watches the watchers?

echo July 24, 2020 8:35 AM

I can safely say I know near zero about this topic and am completely lost at the first mention of any algorithm relating to it. Anything I have to say about the topic can be safely scrolled past without missing anything or read for entertainment purposes only.

It’s slightly interesting that Ramanujan graphs are potentially useful in post-quantum cryptography and mock theta functions are used in calculating the entropy of black holes. I probably got something or everything wrong in just this one sentence, but can safely add “weakly holomorphic Jacobi forms” to my list of nonsense to drop at a dinner party, because the odds of anyone knowing enough to quibble are pretty low. As you do. The funny thing is that the absolutely mindboggling amount of energy required to do some things, whether it is cryptanalysis or bending space, is ridiculous. I’d be pretty interested to hear what any mathematician or cosmologist had to say, especially any of the more fun “what ifs”. Just leave off the maths please, because it gives me a headache.

traci July 24, 2020 8:50 AM

NIST’s response to the SHA3 capacity change controversy (ie. restoring it to a security level that’s arguably overkill) suggests they learned their lesson about the NSA. And in response to “who watches the watchers?”: Bruce and a lot of other cryptographers. The proposals have been written and studied by people who are, as far as we know, independent of NIST and the NSA.

echo July 24, 2020 9:47 AM


Looking back at the history of computational power and custom architectures, and the exploitation of weaknesses known (to GCHQ/NSA) in both theory and implementations, I simply don’t have a clue whether this approach can carry forward to quantum cryptography. I don’t know enough to know whether it’s a good question or a stupid question. Any maths at this level bends my brain, and that’s sometimes just trying to follow basic concepts, which I sometimes have to read half a dozen times to grasp the gist of. Even being spoon-fed cosmology, which is a quite open and sometimes not unrelated subject, is difficult enough.

TimH July 24, 2020 9:54 AM

Discourse has lost its nuance, partly due to the presentation of two-party political discussions, so evaluation of NIST seems to be polarised into either “NIST is great!” or “NIST can’t be trusted, because NSA”.

Actually it’s both, and, as traci points out, maybe the trust has gone, but the watchers, including young Bruce, are aware of this.

Lastly, the trust may be suspect in the decision making, but not in the quality of work.

traci July 24, 2020 11:36 AM

Looking back at the history of computational power and custom architectures and exploitation of known (to GCHQ/NSA) weaknesses both in theory and implementations I simply don’t have a clue whether this approach can carry forward to quantum cryptography.

For whatever it’s worth, the general feeling among cryptographers seems to be that the “secret” cryptography world (NSA et al.) is not as far ahead of public cryptography as it was in the ’70s or even the ’90s. As much as Dual_EC_DRBG drew NIST’s actions into question, it didn’t really fool anyone—people said “this looks like a backdoor” (described in some 1997 papers) years before it was published, and again when it was proposed for standardization, and then after it was standardized.

When we found out RSA took $10 million to make it the default, that was more of a “we told you so” moment rather than any great surprise. And then RSA and NSA made some statements that can only be described as comedy: “Art Coviello implied that RSA Security had not seen merit in the 2006 and 2007 research papers that pointed out flaws in Dual_EC_DRBG until NIST issued guidance to stop using the CSPRNG … and blamed NSA for tricking the company”; “With hindsight, NSA should have ceased supporting the Dual EC DRBG algorithm immediately after security researchers discovered the potential for a trapdoor”.

If our adversaries have introduced intentional weaknesses into the post-quantum proposals, they’re doing an unusually good job at hiding it. Some of this is based on decades-old work and I haven’t seen reports of any potential backdoor (unexplained numbers etc.)—even in 1975, people were wondering whether the NSA’s DES S-box tweaks introduced a backdoor (they didn’t, but the reduction in key length was a problem).

just-another-nerd July 24, 2020 11:50 AM

Hi Mr. Schneier,

sorry for OT comment but please consider writing up a “robust” critical review of the Twitter hack event.

We have heard of no post mortems on this, just a New York Times article with “logs and screenshots” presented as supporting material for a comforting story.

What is not comforting is that Twitter’s “admin panel” not only shows questionable filtering means, but apparently allows its user to change email addresses without notifying the account holder.

And today we have news dribbling out again via side channels that up to 1,000 people, ranging from employees to contractors, have access to this “admin panel”.

Now I find all of that rather discomforting.

Thank you!

myliit July 24, 2020 12:23 PM

re: Schneier on Security weekly squid threads


“I don’t get it.

What does this have to do with squid?”

Perhaps you were looking for the current squid page. The squid page involves security topics that may not belong in other Schneier threads. Posts that don’t run afoul of blog posting guidelines sometimes aren’t deleted. [1]

https://www.schneier.com/blog/archives/2020/07/friday_squid_bl_737.html .

If you check back later today, usually by 7 pm et each Friday, a new weekly squid should be available at:


For example, I plan to try to post about Noam Chomsky’s interview on Democracy Now there tonight. Here’s a link, if you are curious.


“ Noam Chomsky on Trump’s Troop Surge to Democratic Cities & Whether He’ll Leave Office if He Loses”

[1] Commenting policy for this blog:


Norio July 24, 2020 1:01 PM

What does this have to do with squid?

From https://ladailypost.com/lanl-atomtronic-device-could-probe-boundary-between-quantum-everyday-worlds/

“A new device that relies on flowing clouds of ultracold atoms promises potential tests of the intersection between the weirdness of the quantum world and the familiarity of the macroscopic world experienced every day.
The atomtronic Superconducting QUantum Interference Device (SQUID) also is potentially useful for ultrasensitive rotation measurements and as a component in quantum computers… “

1&1~=Umm July 24, 2020 4:17 PM


As traci has pointed out, the NSA has lost a lot of ground in terms of “being ahead” of non-secret researchers in academia and industry.

Thus the question arises of how far behind or ahead they might actually be in quantum-computing research.

The problem the NSA has is they cannot pay enough to attract “top flight talent”, and “for flag and country” does not really count for much these days, whereas an international reputation and the respect of your peers does. Thus the talent they are likely to attract are the meek and the timid who lack self-confidence and really want to tick along quietly, with healthcare and dental, to a federal pension etc…

So, anybody want to guess: behind or ahead, and by how far?

I’m guessing behind, playing catch-up by reading other researchers’ papers, so probably by a little under a year.

It may actually be more since, as far as we can tell, funds appear to be directed to “easy win” offensive capabilities, not long-term defensive capabilities.

David Leppik July 24, 2020 4:43 PM

@Larry the tech wannabe:

This is an open process. You don’t have to trust them. However, the winning algorithm will be used by the US government so it needs to be an algorithm that they trust. There are plenty of government agencies, such as the State Department, that have a mix of classified and unclassified communications. It’s simply not feasible for them to have a secret, secure algorithm installed on all their computers along with a public, not-really-secure algorithm. Sooner or later a laptop will be stolen and the secret algorithm would become public.

It’s not unreasonable for an encrypted file to need to stay secure for 50 or more years to protect people, such as young spies, from retaliation. That’s why a quantum secure encryption algorithm is a big deal in the first place.

Because this is an open process, cryptographers from around the world can analyze these algorithms themselves and warn of vulnerabilities. Among the things they will be looking for is a back door that requires secret knowledge.

In fact, this is believed to have happened in the early 2000s with the Dual Elliptic Curve pseudorandom number generator which appears to have been designed to include a private key that allowed the NSA to read encrypted data. Cryptographers were suspicious from the get-go because the algorithm included hard-coded constants for no known reason.

NIST will end up standardizing one or more of these quantum-secure algorithms, and in the process all of these algorithms will get a lot of scrutiny. The government and its partners will be required to use the NIST-approved algorithms, but you will be free to use any of the other algorithms which you consider secure.

echo July 24, 2020 8:08 PM

People forget you only need one edge to get ahead, and that a simple tilt and misdirection, or deep investment in a single area without telling anyone about it, can bend things just enough to stay ahead. I have no idea why people keep going on about the NSA like they are the only game in town. You forget GCHQ exists, and you simply don’t understand the British. Not everyone is motivated by money, and we don’t need to be. Status and comfort zones are arguably a more valuable currency over here than money. It may be that the real work is in cryptanalysis, and nobody is splurging that all over the internet, nor leaving an implementation on every desktop, nor, to the best of my knowledge, publishing much if anything in the way of academic papers.

The open maths I can see is in cosmology, and things like detecting signals in the noise to look through the Milky Way, or even detecting hidden objects which cannot be seen directly, are pretty amazing.

MarkH July 24, 2020 9:11 PM

NSA is a sinister presence in terms of the U.S. constitution, but the superiority of its secret techniques is perhaps not as great as is supposed.

Since mention has been made of the DUAL_EC_DRBG fiasco, I want to offer a perspective I haven’t seen here before.

Why did NSA make a backdoor that was clumsy and simple to detect?

One plausible answer is that NSA, with all its vast resources, failed to find anything better.

They have a heavy budget, and excellent talent. They’re not superhuman … and most of that budget surely is consumed by their sprawling operational staff and infrastructure, not bleeding-edge research.

Whether they could “poison the well” by promoting a supposedly quantum-resistant algorithm with some weakness known only to them, is practically impossible to answer.

It’s surely vital to NSA that any backdoors they create have the “NOBUS” property (NObody But US can open the backdoor). DUAL_EC_DRBG is a NOBUS backdoor, because it’s based on a secret number which is feasible to protect.
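For readers wondering how a secret number gives a NOBUS backdoor, here is a toy analog of the Dual_EC_DRBG trapdoor. It uses modular exponentiation instead of the real elliptic-curve point arithmetic, omits the output truncation, and every constant below is invented for illustration:

```python
# Toy analog of the Dual_EC_DRBG trapdoor (NOT the real construction: the
# actual DRBG uses elliptic-curve points and truncates its outputs; all the
# numbers below are made up for illustration).
p = 2**61 - 1           # a Mersenne prime, chosen only for convenience
Q = 3                   # public constant
d = 123456789           # the designer's secret: P = Q^d mod p
P = pow(Q, d, p)        # the other public constant; its relation to Q is hidden

def step(state):
    """One generator step: returns (next_state, public_output)."""
    return pow(P, state, p), pow(Q, state, p)

s0 = 987654321          # internal state, unknown to observers
s1, out1 = step(s0)

# Whoever knows d can recover the next internal state from the public
# output alone: out1^d = Q^(s0*d) = (Q^d)^s0 = P^s0 = s1 ...
recovered = pow(out1, d, p)

# ... and from that point on can predict every future output.
s2, out2 = step(s1)
s2_pred, out2_pred = step(recovered)
```

Without knowing d, recovering the state from the output is a discrete-log-style problem; with d, it is one exponentiation. That asymmetry is the "feasible to protect" secret.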

In contrast, a flawed algorithm might be known only to NSA (or be so expensive to exploit that others can’t afford to use it) for some period of time, but is a weapon dangerous to the assets they are tasked to protect:

  1. It’s impossible for them to be certain that no adversary knows (or can exploit) the flaw.
  2. If they’re lucky — and only they can use the flaw — it’s impossible for them to be certain that when an adversary eventually breaks through it, they will be aware that their exclusivity has been lost.
  3. When the flaw is eventually usable to adversaries, some incomprehensibly vast set of systems and secrets will be compromised.

Technical sophistication isn’t enough. These ramifying consequences must be taken into account, and are wickedly difficult to navigate.

Drone July 24, 2020 10:35 PM

NIST Rocks! It just needs to be careful to stay out of the stink-pot that holds our now politicized and corrupt NSA, CIA and FBI.

echo July 25, 2020 12:36 AM

You really need to learn about “Perfidious Albion” and how secretive and sneaky the British state is. The UK is different from the USA: status and comfort zones are to some more important than money. The US makes the mistake of thinking it’s always about the money, or size, or huge sprawling monocultures. Things are arranged a little differently over here.

In the UK, for some things, the budget is effectively unlimited and pro-rata magnitudes larger per capita than in the US, and it is spent in very precise ways on select things involving select people. There are no organisations like the US NSA or US Marine Corps. It’s all dispersed, with anything from three to five or more separate entities with sometimes overlapping skillsets. The British are, or have been, very good at repurposing assets, or very simply talking other people into doing their work for them. (The make-do-and-mend philosophy gets very wearing at times, but we are where we are.) But like I say, it’s documented differently, arranged differently, and works differently. That’s probably why most Americans don’t perceive this.

There’s no such thing as an easy enemy to deal with. There’s only hard and less hard.

Clive Robinson July 25, 2020 2:33 AM

@ ALL,

The one thing to remember is that a SigInt agency, no matter how it appears to individuals, is “resource bound”.

Where resources are not just financial, but ability and culture as well. When it comes to research there is also a lot of what some consider “luck” involved. That is, you might try a hundred different approaches and fail, whilst some young graduate has an idea and his PhD paper walks down the right path first time; such is the coincidence in life that triggers stray thoughts.

Being immersed in experts is also an issue. The advantage is that you have a large body of knowledge around you in live form. The disadvantage is that top-flight researchers in quantity can be “like a room full of cats”; worse, by a long way, they tend to suffer from group think that keeps them too close to “the known”, so progress is “by very small incremental steps” rather than unexpected leaps into entirely new ways.

If you look at the “Open Contests” NIST has run for crypto, it shows that NIST was “over reliant” on the NSA in setting up the contests. As I’ve mentioned before, the AES contest was hamstrung by its own rules, and the result was for many years, and still is in some cases, insecure in its implementation.

What people really need to understand is a basic truth in life,

    What may be true in theory, may not be in practice.

This is especially true in security: “theory” is like a microscope, “it gives you great depth but only very little scope”. That is, theory has a lot of “depth”, but only where people have specifically looked. Security in practice is mainly about “breadth”, that is, it’s about the whole system, because it’s the “weak link” in the chain, or the tiny fracture in the corner, that an adversary can drive a wedge into and bust the whole box open.

As our host used to say, you need to think “hinky”, that is, you get a sixth sense about security systems. Often though, security flaws are “long known”, but for various reasons –usually financial– nobody addresses them, and they almost become “forgotten knowledge” as the industry lumbers forwards.

Hardware faults are usually “long known”: the problems behind RowHammer go back five decades and were discussed, but faded due to the idea that unreliable parity circuits would solve the problem (they were kind of “the best we’ve got” solution back then, but the world has moved on, and the 11% of double and other even-bit-flip errors they miss is no longer acceptable). But the problem remains: there is no reliable way to detect errors in memory, because there is more than one valid state.

A similar issue occurred with “cache timing attacks”. When we get taught about caching, we get told about hits as well as misses and the significant time differences between them. We’ve known about the very real issues with time-based attacks since the late ’80s and early 1990s, when secure smart cards were getting broken by them. The AES competition ran from 1997 to 2000, so cache-based timing attacks would have been well known to the NSA, yet they encouraged “speed contests” and open availability of the source code. To get real speed you do things like loop unrolling, which has very real timing-channel issues as well as stack issues. So whilst in theory AES was secure as an algorithm, in practice nearly all implementations were very insecure, down to people using fast code riddled with time-based side channels, causing information about the key to leak well out of the computing system and onto the network…
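A much simpler timing channel than cache behaviour illustrates the same principle: code whose running time depends on secret data leaks that data. A minimal Python sketch (the function names are mine, invented for the example):

```python
import hmac

def leaky_equal(a: bytes, b: bytes) -> bool:
    # Returns at the first mismatching byte, so the running time reveals
    # how long a matching prefix an attacker has guessed so far.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def safer_equal(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest examines every byte regardless of mismatches,
    # so its timing does not depend on the secret contents.
    return hmac.compare_digest(a, b)
```

The same discipline, no secret-dependent branches or table indices, is exactly what the fast loop-unrolled AES implementations of that era failed to observe.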

I’ve little doubt that this NIST competition will end up with some practical implementation issue that leaks plaintext, KeyMat, or both in a way useful to adversaries, because of “the way of things” when it comes to “selection criteria”.

When you optimize a design for one criterion, other areas will be de-optimized; it’s just the way life is, “you don’t get something for nothing”, nature does not allow it. Thus with security you have the very difficult task of balancing optimizations, especially when you have only partial knowledge of where the problem areas are.

echo July 25, 2020 3:10 AM


This is all true enough. I’m constraining myself to the organisational and cultural stuff, not the implementation-specific stuff. It’s kind of: read this, and also read the monograph on Brigadier Tiltman, to get a clue about what I’m saying.

A random thought: groups can only be as clever as the cleverest person in the room, but are as dumb as the average. Blind luck certainly plays its part.

I tend to beat my own path and am studying a completely different subject area this week. It’s technically in scope of security, but standalone. Without going into details, by persistence and paying attention to the random stuff I’ve had to wade through, I’ve managed to curate some real gold. I can’t get my head around your “thinking hinky” term and have been looking for my own way to express it, but certainly some of what you mention about forgetting what people have done in the past is true. I’m revisiting a now very niche area originally carved out by people who very literally made a career out of “thinking hinky”, and plundering it for ideas. I also have some very hard-to-get books, because they go out of print fairly quickly. When even people at the top of the hierarchical tree, or at the top of their game at least in the “retail space”, don’t know the skills and knowledge in these books, I have to wonder what is going on. I may be able to comment on this at a future time, but I’m maintaining some separation and have to wait for the right excuse. Don’t hold your breath.

Necessity is the mother of invention is another one. Being stuck in a room full of rote-learned silo mentalities or moaners is not my idea of fun, so I’m making my own fun. I think people forget that, and it’s another thing organisations can kill stone dead.

Another Mouse July 25, 2020 4:45 AM

They could just kick out the best candidates early on in the contest so they don’t get a considerable amount of cryptanalysis…

Clive Robinson July 25, 2020 4:58 AM

@ echo,

Thinking hinky is when you look at something and get a feeling that something is off.

Call it subconscious learning if you will, where your body tells you by the hairs on your neck or the pit of your stomach.

It’s kind of like a “pre” “fight or flight” response.

I get it when I look at systems: when drawn up in charts or diagrams they should have a certain flow and aesthetic about them. If that is lacking, or worse, disturbed or abruptly cut, then those are the places where you will find the weak links and flaws by which systems are exploited.

Look at it another way: when we look at people, too much symmetry looks robotic, too little is ugly; either way, algorithms in the “monkey brain” give warning that they are “not fit” for continuation in the species, so we tend to avoid such people. We don’t know why we do it, we just do. Well, we also get other feelings about people; we do feel it in the hairs on our neck and the pit of our stomach, only instead of words like “dread” we use words like “butterflies”. Good or bad, that’s a hinky feeling.

What’s important at the end of the day is what we do about that “hinky feeling”.

As for forging your own path in life, society moves forwards by those who do not hide in the herd or kow-tow to conventional wisdom. Too much faith is put into conventional wisdom; mostly it’s a method of exploitative control, but sociologically it’s a “suicide pact by stagnation”. You have to be careful how you use the terms liberal and conservative, because they have been so overloaded with connotations. But broadly, those of a real liberal outlook are “enablers”, whilst those of a conservative outlook are protectors of vested interests and thus are “stagnators”. It’s important to be some of both: the liberal side drives you forwards, whilst the conservative side gives you precaution, and thus “safely enables” some measure of safety derived from precaution. But as annoying as it can sometimes be to be on the receiving end of, I actively encourage “free thinking”, because even when annoying or even wrong, within it there are gems of value. That is, it might be the wrong way on this occasion, but on others it might well be the right way, or parts might be the right way irrespective of the occasion.

echo July 25, 2020 9:15 AM


I have a clue what you mean; I don’t always listen to it, though there are times when I do. I can be a bit of a howling nob like that. I’ve tended to rely on the intuitive and on bursts of overwhelming inspiration. This has somewhat deserted me now. I think confidence, and knowing when something feels wrong, probably matter more now. I’m shifting from more technical to creative stuff, so it’s less aversion and more affinity. I’m not sure how that’s going to work out. That’s the reason why I never bought into something being right because it was “elegant”. Yes, this was a persuasive argument, but I wasn’t convinced, and I learned later that developing a new theory would often cause the same people to view the new arrangement as elegant and the old arrangement as no longer elegant. In some ways it’s a different mode of reasoning, but equivalent to: something is in fashion until it’s not in fashion. But I do know what you mean.

Without labouring points of view and power structures, humour isn’t too different. I really dislike “banter” and the practiced cynicism and aggressive takedowns which pass for humour today. It’s not just rude or crude, it’s not very enlightening in any of the right ways. I’m also not a fan of being “punked”. That said, I need to button my lip sometimes.

metaschima July 25, 2020 10:32 AM

It’s great to see another cryptography competition going on. I think they really motivate cryptographers to come up with new and better algorithms; a lot of great algorithms came out of past competitions, often not the winners. Yes, the NSA is ever-present, and I expect them to intervene right at the end, like they often do, to slightly alter the winning submission to make it “more secure” without any explanation of the changes. Still, a very good competition; I am just always wary of the NSA. On choosing an algorithm to use myself, I do extensive research into each algorithm, and usually I will not choose one of the new ones until plenty of cryptanalysis papers have been released on it. Oh, and I never choose the winner of these competitions, but usually one of the runners-up.

Clive Robinson July 25, 2020 11:32 AM

@ metaschima,

Oh and I never choose the winner of these competitions, but usually one of the runners up.

That might be a bit hard this time around; it looks like after a quarter of a century they’ve wised up.

There won’t be a winner or really “runners up”. It looks like there will be a selection of A-team algorithms and a B-team that is “training up”.

Whilst not quite a “framework” with “plug-in replaceable algorithms”, I suspect that will end up being the net effect, which is good.

What I am kind of more interested in is not the algorithms themselves but what modes they will be suitable for. Because at the end of the day, it’s the modes algorithms get used in that define not just what applications they are good for, but also their overall security.

As the Linux Tux image in codebook (ECB) mode shows, an algorithm, no matter how clever, is at the end of the day a “mapping” or simple “substitution cipher” that leaks information extensively if used incorrectly…
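The Tux effect is easy to reproduce with a toy “block cipher” built from a hash (an assumption purely for illustration, not a real cipher, and `KEY`/`ecb_like`/`ctr_like` are invented names): in codebook mode, equal plaintext blocks give equal ciphertext blocks, while a counter-style mode hides the repetition.

```python
import hashlib

KEY = b"demo-key"   # illustrative key, not a real keying scheme
BLOCK = 16

def ecb_like(blocks):
    # Codebook mode: each block is mapped independently of its position,
    # so identical plaintext blocks always yield identical ciphertext.
    return [hashlib.sha256(KEY + b).digest()[:BLOCK] for b in blocks]

def ctr_like(blocks):
    # Counter mode: the keystream depends on block position, so repeated
    # plaintext blocks encrypt differently.
    out = []
    for i, b in enumerate(blocks):
        ks = hashlib.sha256(KEY + i.to_bytes(8, "big")).digest()[:BLOCK]
        out.append(bytes(x ^ y for x, y in zip(b, ks)))
    return out

# Highly repetitive plaintext, like the flat colour regions of the Tux image.
pt = [b"\x00" * BLOCK] * 4
ecb_ct = ecb_like(pt)   # identical ciphertext blocks: structure leaks
ctr_ct = ctr_like(pt)   # distinct ciphertext blocks: structure hidden
```

The cipher here is as strong as you like per block; the leakage comes entirely from the mode.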

Singapore Noodles July 26, 2020 6:16 AM

Does the mathematics of permutation patterns [1] have an application to cryptography ?

  1. en.m.wikipedia.org/wiki/Permutation_pattern

Clive Robinson July 26, 2020 9:57 AM

@ Singapore Noodles,

Does the mathematics of permutation patterns have an application to cryptography?

Long answer short: our understanding of permutation patterns for practical uses is actually very recent (think this century); however, this has not stopped people trying to use them both in the design of cryptographic systems and in the breaking of them[1], oh and in traffic-analysis systems.

However the question at the end of the day boils down to one of efficiency -v- effectiveness. That is, any simple, even linear, transform can be used as a cryptographic primitive; the question you have to ask is “What do you gain over other transforms? And likewise, what do you lose?”

The simplest way to use most transforms is as part of a “card shuffling algorithm”. That is, you have an array of numbers that are balanced (0 to 2^n−1 being a simple choice) that forms your “deck”, each number a card[2]. You also have a pointer to a base position in the array from which you apply a permutation transform, the simplest being a swap of two “cards”. You actually need a minimum of two transforms: the first to update the pointer, the second to combine card values to provide an output value.

You have to be careful how you generate the output value because it can change the statistical properties of the output[3].

You also have to be careful how you generate the base pointer into the array, so that no part of the array suffers from unequal usage; otherwise bias will very quickly become apparent at the output, and no matter how good the output transform, it only puts off the inevitable observation of bias. The simplest way to do this is to use an incrementing pointer that wraps from 2^n−1 to 0. However this is insufficient, because it is far too predictable. Thus the solution is to use another pointer, usually as an offset to the base pointer, that is not predictable in either the forward or reverse direction. But there is one “gotcha”, which is in effect having a second pointer value of zero with a transform. This is easiest to see with a simple swap: because a swap in place effectively does not change the deck, if this no-change can be predicted in some way, then it is possible to gain information about the positional content of the array.
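The structure described above can be sketched in a few lines of Python. This is a toy showing the moving parts (balanced deck, incrementing base pointer, offset, swap transform, output combination), definitely not a secure generator; readers may recognise the skeleton as close to RC4’s, whose biases are exactly the kind of problem the caveats above warn about.

```python
# Toy "card shuffling" generator: a balanced deck, an incrementing base
# pointer, an offset perturbed by the deck contents, a swap transform, and
# an output combination. Illustrative only -- NOT cryptographically secure.
N = 256

def make_gen(seed: int):
    deck = list(range(N))    # balanced deck: values never change, only positions
    base = 0
    offset = seed % N or 1   # avoid starting in the degenerate zero-offset case

    def next_byte() -> int:
        nonlocal base, offset
        base = (base + 1) % N                # predictable increment, wraps N-1 -> 0
        offset = (offset + deck[base]) % N   # perturbed by the deck contents
        j = (base + offset) % N
        # The swap transform. Note the "gotcha": when j == base the swap
        # is a no-op and the deck is momentarily unchanged.
        deck[base], deck[j] = deck[j], deck[base]
        # Combine two cards to form the output value.
        return deck[(deck[base] + deck[j]) % N]

    return next_byte
```

A generator built this way is deterministic per seed and its outputs stay in the balanced range 0..N−1, but making the offset genuinely unpredictable in both directions is the hard part the paragraph above is pointing at.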

It would normally be at this point that you would get the “any questions” enquiry… However, whilst card shuffling algorithms are an interest of mine, they are part of what falls under “commercial research”. As for questions on “permutation patterns”, there are way better people to ask than myself; you will find a number hanging out in the “maths department” at Queen Mary University of London (QMUL), where the subject is of special interest to a number of people[4].

[1] Have a look at this paper to get not just an idea of what can be done but a useful set of references to explore,


[2] The important thing to remember is these values never change, just their position in the array changes, so the array’s properties, especially its balance, do not change.

[3] A simple example: if you repeatedly sum four to six randomly selected values from a pot of numbers –where the ball drawn goes back in the pot before the next is drawn– then the shape of the probability curve the outputs fall on changes, from the flat distribution of the pot to a series of numbers that fall very, very close to a normal (bell) distribution curve. If this surprises you, as it does quite a few people when first told, you can easily demonstrate it to yourself as a valid graphical proof.
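The effect in footnote [3] is the central limit theorem at work, and a few lines of Python make it visible (a pot of 0–9, four draws with replacement; the seed is arbitrary, chosen only so the run is repeatable):

```python
import random
from collections import Counter

random.seed(42)  # arbitrary seed so the demonstration is repeatable

# Draw four balls (with replacement) from a flat pot of 0..9 and sum them.
sums = [sum(random.randrange(10) for _ in range(4)) for _ in range(100_000)]
counts = Counter(sums)

# Single draws are uniform, but sums pile up near the mean (18) and thin out
# towards the extremes (0 and 36), approximating a bell curve. Printing
# counts[s] for s in 0..36 shows the bell shape directly.
```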

[4] Somebody I used to work with years ago went to QMUL to work on her PhD in that very subject area. She used to think I was the smart one because I could usually “find a rabbit in any old hat, and pull it out” when needed. Whereas the reality of it was just having a good memory for apparently unrelated information and spotting how to repurpose it to the task in hand. As with stage magic, it’s only magical when you don’t know the mechanics of the trick 🙁

Keith Douglas July 28, 2020 2:28 PM

Many years ago some philosophers of physics wrote a “mild critique” of quantum crypto based on the idea that it might not actually prove to be a speed-up: the constant factors hidden in the setup time swamp you (i.e., preventing decoherence is hard). About a year or so ago I attended an election-security conference where I asked (not being a quantum crypto or any sort of crypto guy) what the state of the art on this question was, and the supposed quantum algorithms guy was unable to answer. Can anyone here provide me any references?

name.withheld.for.obvious.reasons July 31, 2020 12:36 AM

I listed a summary of the data released on the candidates and alternate algorithms to the Friday squid:


Curious July 31, 2020 8:02 AM

@Keith Douglas

Armchair physicist here, and not a good one.

I just like reading about such science/physics stuff, and that is basically it. I can’t answer your question, I don’t know much about that stuff, but I can’t help but wonder if the work that goes into preventing said decoherence could be an area for backdooring quantum crypto. How that imagined backdooring might work, or how it makes any sense at all on some technical level, I don’t know, but presumably the people who know this stuff would be able to evaluate such basic questions, even if apparently implausible and seemingly pointless. The way I vaguely imagine it, perhaps “it” could all boil down to statistics based on a single variable, or a range of variables, either monitored, recorded, or simply known beforehand from someone’s quantum crypto device (or engineered into a device). I have this weird idea that, although I don’t personally believe in time reversal, time reversal is apparently a thing or an idea in quantum mechanics. Disregarding what might be learned about quantum mechanics in the far future, I wonder if perhaps the uncertainty associated with measurements can be exploited by baking in “time reversal”, or by relying on this perhaps unintuitive prediction of results, baking in this kind of “spring”. I imagine it could be thought of as some kind of deferred result (maybe something that can be monitored and help decrypt someone’s quantum crypto box), thus being predictable despite any dogmas about the fundamental uncertainty of quantum-mechanical measurement, or so I vaguely imagine.
