You Can’t Rush Post-Quantum-Computing Cryptography Standards

I just read an article complaining that NIST is taking too long in finalizing its post-quantum-computing cryptography standards.

This process has been going on since 2016, and since that time there has been a huge increase in quantum technology and an equally large increase in quantum understanding and interest. Yet seven years later, we have only four algorithms, although last week NIST announced that a number of other candidates are under consideration, a process that is expected to take “several years.”

The delay in developing quantum-resistant algorithms is especially troubling given the time it will take to get those products to market. It generally takes four to six years from the publication of a new standard for a vendor to develop an ASIC that implements it, and it then takes time for the vendor to get the product validated, which itself seems to be taking a troubling amount of time.

Yes, the process will take several years, and you really don’t want to rush it. I wrote this last year:

Ian Cassels, British mathematician and World War II cryptanalyst, once said that “cryptography is a mixture of mathematics and muddle, and without the muddle the mathematics can be used against you.” This mixture is particularly difficult to achieve with public-key algorithms, which rely on the mathematics for their security in a way that symmetric algorithms do not. We got lucky with RSA and related algorithms: their mathematics hinge on the problem of factoring, which turned out to be robustly difficult. Post-quantum algorithms rely on other mathematical disciplines and problems­—code-based cryptography, hash-based cryptography, lattice-based cryptography, multivariate cryptography, and so on­—whose mathematics are both more complicated and less well-understood. We’re seeing these breaks because those core mathematical problems aren’t nearly as well-studied as factoring is.

[…]

As the new cryptanalytic results demonstrate, we’re still learning a lot about how to turn hard mathematical problems into public-key cryptosystems. We have too much math and an inability to add more muddle, and that results in algorithms that are vulnerable to advances in mathematics. More cryptanalytic results are coming, and more algorithms are going to be broken.

As to the long time it takes to get new encryption products to market, work on shortening it:

The moral is the need for cryptographic agility. It’s not enough to implement a single standard; it’s vital that our systems be able to easily swap in new algorithms when required.
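
To make that concrete, here is a minimal sketch of what algorithm agility can look like in code. It is illustrative only; the interface and names (Signer, register, get_signer) are hypothetical and not taken from any particular standard or library.

```python
# Illustrative sketch of cryptographic agility: application code depends on an
# abstract interface, and the concrete algorithm is selected through a registry,
# so a weakened algorithm can be swapped out by configuration rather than rewrite.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass(frozen=True)
class Signer:
    name: str
    sign: Callable[[bytes, bytes], bytes]          # (private_key, message) -> signature
    verify: Callable[[bytes, bytes, bytes], bool]  # (public_key, message, signature) -> ok

_REGISTRY: Dict[str, Signer] = {}

def register(signer: Signer) -> None:
    """Make an algorithm available for selection by name."""
    _REGISTRY[signer.name] = signer

def get_signer(name: str) -> Signer:
    """Single lookup point: retiring an algorithm is a config change, not a code change."""
    try:
        return _REGISTRY[name]
    except KeyError:
        raise ValueError(f"signature algorithm {name!r} is not available") from None

# Application code never hard-codes the algorithm:
#   signer = get_signer(config["signature_algorithm"])
#   sig = signer.sign(private_key, message)
```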

Whatever NIST comes up with, expect that it will get broken sooner than we all want. It’s the nature of these trap-door functions we’re using for public-key cryptography.

Posted on August 8, 2023 at 7:13 AM

Comments

Z.Lozinski August 8, 2023 8:11 AM

People are forgetting how long the original NIST process that created DES took. The first call for proposals was in 1973 and the FIPS standard was published in 1977. Before that there had been a lot of work in banking cryptography dating back to the late 1960s.

There are three implications we all need to get used to

1: Lots of organizations don’t even know all the places they use cryptography. Finding all the dependencies will take time. How many PKIs do you have? Think log4j ..

2: We have thought of public key cryptography as something with a very long life. We may need to get used to the idea we have to update it in the same way military cryptosystems are (ideally) designed to be agile and deal with compromised systems.

3: Replacing all the cryptography used in all our systems is going to take around 10 years, and not all industries have started this process. US Government: yes with NSA CNSA 2.0. Telecoms: yes (the GSMA). Banking: some. EU: some (ENISA). Others: not that I have seen.

flash August 8, 2023 9:15 AM

If you had unlimited resources at your disposal necessary to take care of the problem, what would be the time frame for a crypto overhaul then??

Kemal Bicakci August 8, 2023 9:36 AM

“Whatever NIST comes up with, expect that it will get broken sooner than we all want. It’s the nature of these trap-door functions we’re using for public-key cryptography.”

What you have said is not true for hash-based cryptography. We have been working on cryptographic hash functions for maybe more than 50 years. Of course, it is only useful for digital signatures, though.

Mexaly August 8, 2023 10:50 AM

A certain airplane model was rushed to market and two planeloads of people died.

That’s what happens when you listen to people who say safety takes too long or costs too much.

Yes, lives depend on security.

Clive Robinson August 8, 2023 11:28 AM

Let’s be honest up front and say,

“We are fighting an enemy that does not practically exist, and may not exist at some point if at all.”

NIST in the background to the call for standard candidates said,

“If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use. This would seriously compromise the confidentiality and integrity of digital communications on the Internet and elsewhere.”

However they decided to take the cautious approach,

“While in the past it was less clear that large quantum computers are a physical possibility, many scientists now believe it to be merely a significant engineering challenge. Some engineers even predict that within the next twenty or so years sufficiently large quantum computers will be built to break essentially all public key schemes currently in use.”

And start in, even though there was no likelihood of a “Quantum Computer” (QC) within the foreseeable future, if at all.

People have to remember that QC, like “Artificial Intelligence” (AI), is one of those seemingly eternal dreams that are always “a decade in the future” (though based on NIST’s own timelines they had put it out to more than thirty years).

But we also need to ask what has happened with QC in the past half decade?

Honest answer: the results are slowing down and progress on getting the number of Q-Bits up is not looking at all good [1].

But we also need to be realistic about how long it will take to replace our existing “Public Key” (PK) algorithms.

And I think NIST was well off base with,

“Historically, it has taken almost two decades to deploy our modern public key cryptography infrastructure.”

Because that is a “New view” not a “Replace view”. Look at it this way,

1, Is DES still out there?
2, Is defective AES still out there?

The answer to both is “Yes” and is unlikely to change anytime soon.

As I’ve mentioned in the past, we have to consider “Expected Product Life Time”, some of which are up to half a century for utility meters, industrial control systems, medical implants and similar, where replacement has a high cost.

The fact some people get jittery is a fact of life and there is also the “Chicken Little” syndrome.

My main concern is actually not “when” these things might happen but

“What will happen with ‘Collect it All’ records”

People had an expectation of privacy not just at the time, but forever.

It’s been pointed out that the Founding Fathers could get both simply by walking into the middle of a field and talking quietly. That is no longer true: technology has made eavesdropping easy, thus privacy hard, very hard, and some lunatics want to use this to destroy society as we used to know it.

But the question I don’t hear being asked or addressed is a fairly obvious one,

“Do we want to replace PK with the same?”

The answer is actually “probably not”. PK was a kludge in many ways and it is reliant on assumptions. The PQC equivalents have key sizes that are large, some so large they are bigger than works of literature. You should be suspicious and ask “What hides within?”.

Remember the basis is “One Way Functions with trap doors”; we don’t actually know that “One Way Functions” (OWFs) actually exist, and as for the “trap doors”, that is an interesting philosophical question along the lines of,

“None, One, Some, Endless, Unknown?”

We need “One” to make a PK replacement work; anything else is not going to be what we want.

It’s interesting that the Chinese are looking at other ways than PK to do secure key exchange, but in the West we give the impression of being so manic about PQC we’ve failed to look at alternatives…

I’m keeping my eye rather more on “Quantum Communications Systems”; the version you will hear most about is “Quantum Key Distribution” (QKD). It’s already a practical technology in terms of functionality; the question is,

“Can we make QKD scale in terms of distance and number of users?”

If we can, then PQC to replace PK may well be seen historically as a quaint cul-de-sac.

[1] It’s difficult to measure QC performance, which is why people came up with “Quantum Volume”, which has subsequently been redefined by IBM,

The “latest and greatest” figures,

https://www.quantinuum.com/news/quantinuum-h-series-quantum-computer-accelerates-through-3-more-performance-records-for-quantum-volume-217-218-and-219#

But note the number of Q-Bits is 20. Some years ago the estimate of the number of Q-Bits needed to be the equivalent of a high-end personal computer was 1000… Each extra Q-Bit improvement is almost like doing what has been done again, due to noise, decoherence, and circuit depth.

jackson August 8, 2023 12:40 PM

Listen to RSAConference2023 Panel: Migrating to Post-Quantum Schemes, especially comments by Adi Shamir.

It’s a mess, there’s little agreement on what new algorithms will even achieve.
On top of that, notice that few even discuss the keys.
All key management sucks.

AES was approved years before commercial cloud offerings first appeared (AWS).
Still, all vendors say is we use AES-256.
So what.
When was the last time you heard an enterprise was hacked because the attackers broke their encryption?
And that’s BECAUSE of AES256? No, I don’t think so.
Then other vendors want you to think HSMs are the big solution.
Yeah, if you’re working in your basement on your own stuff, or you’re a commando out in the desert somewhere.

Enterprise is under constant pressure to implement strategies that are tested and well proven, but laxity or whatever, prevails.

So do you think the enterprise is behind that haste to approve PQ encryption?
No, it is vendors who need new stuff to sell.
Several even try to leapfrog by promoting their own product or service they say is PQ.
First, it was because QCs were right around the corner.
Then it was because those pesky Chinese were storing all your private emails, so you better go PQ now, because when the QC is ready, you know the first thing they will do with their 10-billion-dollar QC is break your private emails from 2020.

What a mess. On top of that, let’s not pretend everything will stay the same in every other area of technology all waiting for NIST to do their thing. Enterprise can’t even get up to speed on TLS 1.3. Then the agencies are constantly moving the furniture. Use DPI to catch the hackers! No wait, the NSA says DON’T do DPI.

What a mess.

Roger A. Grimes August 8, 2023 2:17 PM

We need to make everything with crypto…crypto-agile. We need to make it so that when one standard gets sufficiently weakened, we can update our environment and replace it with software patches or upgrades. It shouldn’t take 10 years. Of course, hardware is another issue…but somehow we need to make everything more crypto-agile.

iAPX August 8, 2023 4:18 PM

@Kemal, All

“Whatever NIST comes up with, expect that it will get broken sooner than we all want. It’s the nature of these trap-door functions we’re using for public-key cryptography.”

What you have said is not true for hash-based cryptography. We have been working on cryptographic hash functions for maybe more than 50 years. Of course, it is only useful for digital signatures, though.

Remember SHA, retro-named SHA-0. NIST FIPS PUB 180.
This is the perfect example of a trap-door, or backdoored, function, and no one is really certain that SHA-1 and its successors haven’t been trap-doored too!

iAPX August 8, 2023 4:34 PM

Decades, we need cryptography that could endure decades of attacks.
I prefer people taking time to create real strong encryption (and hashing) algorithms instead of having a bunch of new weak toys every other year!

The problem is not only about cold storage, but also about exchanges between systems over whatever media (Internet, local network, floppy disk, RS232C!).

There are exchanges that contain information or data that will still be relevant decades later. Think about the JFK files…
If these exchanges are captured, they have to endure decades of attacks to decrypt them, thus we don’t need fancy new encryption algorithms for the next few years, but real strong ones that will last for decades.

anon August 8, 2023 5:25 PM

I think we’re looking at the wrong timeline. It doesn’t actually matter how long the standards process takes to approve a new encryption algorithm. What matters is how long it takes to implement, thoroughly test, and globally deploy that algorithm. By that time it will be too late for any data that has already been transmitted anywhere at any time, over any network — unless all of the 5-eyes’ data centers are razed, all of their networks torn out, and all of their satellites landed on Venus.

The only way that’s going to happen is if Section 702 is not renewed but prohibited, the FISA courts are dismantled, and the rule of law is reinstated.

M.Atkinson August 8, 2023 8:51 PM

@ flash,

If you had unlimited resources at your disposal necessary to take care of the problem, what would be the time frame for a crypto overhaul then??

With unlimited resources, I could probably switch any system to use post-quantum RSA in a few days, or by the end of the year anyway. (Assuming we don’t consider “hours in a day” to be a resource limit…) But if my counterparty didn’t have unlimited resources, the gigabyte-to-terabyte keys and multi-day computations might pose a problem. And if other people did have unlimited resources, their brute-force attacks would complete instantly, so what would be the point?

@ Mexaly, Roger A. Grimes,

A certain airplane model was rushed to market and two planeloads of people died. […] Yes, lives depend on security.

Another planeload of people died because of hardened cockpit doors (which seemed reasonable at the time, but in no case is known to have saved any lives). Probably several planeloads of people die every year in separate road accidents, while trying to avoid airport “security”.

Similarly, crypto agility seems like a good idea, and I think actually is. But algorithm negotiation adds complexity which can itself enable attacks; conspiracy theorists have said that’s how the NSA sabotaged IPsec. It’s easy to say we should make it possible to switch, but who decides (and how) when we should actually do that?

@ iAPX,

Remember SHA, retro-named SHA-0. NIST FIPS PUB 180.
This is the perfect example of a trap-door, or backdoored, function

Is it? That would suggest it was an intentional weakness. If so, why would the NSA have published a corrected version before anyone else found the flaw?

Ted August 8, 2023 9:23 PM

Re: FIPS 140-3

I’m not sure that I understand, but there’s a NIST PQC presentation that mentions FIPS 140-3 vs. Common Criteria (CC), and NSS (U.S. DoD / IC) NIAP.

Dr. Saarinen seems to say that Common Criteria protection profiles (like in finance) are already applicable to PQC?

This was in his April 2023 presentation “Intro to Side-Channel Security of NIST PQC Standards” on slide 51, around min 53:25.

https://csrc.nist.gov/Projects/post-quantum-cryptography/workshops-and-timeline/pqc-seminars

Clive Robinson August 8, 2023 10:03 PM

@ anon, ALL,

“By that time it will be too late for any data that has aleady been transmitted anywhere at any time, over any network”

Not quite true.

Whilst PK, as used for asymmetric algorithms, is very susceptible to American mathematician Peter Shor’s 1994 algorithm, it generally does not apply to symmetric algorithms.

However, many but not all symmetric block algorithms are susceptible to Indian-American computer scientist Lov Grover’s 1996 algorithm. It does not produce an exponential speed-up, but in effect is the equivalent of taking the square root of the non-quantum search. This is the equivalent of “halving the bit width” of the block algorithm, so under Grover’s algorithm 256-bit AES would be the same strength as 128-bit AES under current attacks.
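
In symbols, this is just the standard Grover speed-up for searching a k-bit key space:

$$
\text{classical exhaustive search: } O(2^{k}) \quad\longrightarrow\quad \text{Grover: } O\!\left(\sqrt{2^{k}}\right) = O\!\left(2^{k/2}\right),
$$

so a 256-bit key gives roughly $2^{128}$ work against a quantum attacker, about what a 128-bit key gives against a classical one.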

Then there are other cipher systems that quantum computers will have no effect on. These are ones where the unicity distance is the length of the message.

The most well known is the One Time Pad, which though it is fragile to operator errors is an easy to use pencil and paper “hand cipher” that should have the property that all messages of a given length are equiprobable or as is more commonly said have “Perfect Secrecy”.
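
As a minimal sketch, here is the binary (XOR) form of the OTP rather than the pencil-and-paper letter form; the security rests entirely on the pad being truly random, at least as long as the message, kept secret, and never reused.

```python
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    # The pad must be truly random, at least as long as the message, and used once.
    if len(pad) < len(plaintext):
        raise ValueError("pad must be at least as long as the plaintext")
    return bytes(p ^ k for p, k in zip(plaintext, pad))

otp_decrypt = otp_encrypt  # XOR is its own inverse

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))
ciphertext = otp_encrypt(message, pad)
assert otp_decrypt(ciphertext, pad) == message
```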

@ ALL,

As I’ve mentioned above, Post Quantum Crypto algorithms may not be a worthwhile investment in effort for key exchange protocols, and the Chinese are looking at extending both the range and capacity of “Quantum Key Distribution” (QKD) systems.

Others are looking in other areas. One such is to extend the notion of “perfect secrecy”. Earlier this year a paper popped up on arXiv,

https://arxiv.org/abs/2302.07671

Shannon Perfect Secrecy in a Discrete Hilbert Space

15 Feb 2023

Randy Kuang[1], Nicolas Bettenburg

“The One-time-pad (OTP) was mathematically proven to be perfectly secure by Shannon in 1949. We propose to extend the classical OTP from an n-bit finite field to the entire symmetric group over the finite field. Within this context the symmetric group can be represented by a discrete Hilbert sphere (DHS) over an n-bit computational basis. Unlike the continuous Hilbert space defined over a complex field in quantum computing, a DHS is defined over the finite field GF(2). Within this DHS, the entire symmetric group can be completely described by the complete set of n-bit binary permutation matrices. Encoding of a plaintext can be done by randomly selecting a permutation matrix from the symmetric group to multiply with the computational basis vector associated with the state corresponding to the data to be encoded. Then, the resulting vector is converted to an output state as the ciphertext. The decoding is the same procedure but with the transpose of the pre-shared permutation matrix. We demonstrate that under this extension, the 1-to-1 mapping in the classical OTP is equally likely decoupled in Discrete Hilbert Space. The uncertainty relationship between permutation matrices protects the selected pad, consisting of M permutation matrices (also called Quantum permutation pad, or QPP). QPP not only maintains the perfect secrecy feature of the classical formulation but is also reusable without invalidating the perfect secrecy property. The extended Shannon perfect secrecy is then stated such that the ciphertext C gives absolutely no information about the plaintext P and the pad.”
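
As a toy reading of the encode/decode step described in that abstract (my illustration only, not the paper’s construction): the shared “pad” is a secret permutation of all 2^n values of an n-bit block, encoding applies it to the plaintext value, and decoding applies the inverse (the transpose of the permutation matrix). The real QPP uses a pad of M such permutations.

```python
import secrets

def random_permutation(n_bits: int) -> list:
    """Secret permutation of all 2**n_bits values (Fisher-Yates with a crypto RNG)."""
    values = list(range(2 ** n_bits))
    for i in range(len(values) - 1, 0, -1):
        j = secrets.randbelow(i + 1)
        values[i], values[j] = values[j], values[i]
    return values

n = 8
perm = random_permutation(n)        # shared secret "pad" (one permutation matrix)
inverse = [0] * len(perm)
for idx, val in enumerate(perm):
    inverse[val] = idx              # the transpose/inverse used for decoding

plaintext = 0x5A                    # one n-bit block
ciphertext = perm[plaintext]        # apply the permutation to the basis state
assert inverse[ciphertext] == plaintext
```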

And before anyone asks, no I’ve not looked at it in any depth, so I’ve no idea if it is secure or not.

[1] https://www.researchgate.net/lab/Lab-of-Quantum-Encryption-Randy-Kuang

Erdem Memisyazici August 9, 2023 1:26 AM

I agree. In the end you’ll have to get back to the pigeon idea of the packet and its route being the secret. The best force for this, in my opinion, seems to be neutrinos. Compared to gravitational waves, they can for the most part pass through the Earth, except directly through the core. Intercept that.

You can also go the symmetric everything route but you must trust one source fully like a VPN.

That being said supply chain attacks and a whole industry profiting from breaking your devices while nobody is going to jail is going to ruin it all mobster style as always.

But one day we may just do the right thing and sell people devices that work and provide them the privacy with it. In that case maybe we can have a citizen’s VPN sort of deal where the military protects the keys and issues them and anyone who tampers with the production and delivery of the keys to each citizen goes to jail.

The process will be such that even the producers won’t know the keys. You then allocate a few hundred billion dollars to openly protect the area.

That or we play baseball with bosons, why not both? It will be a while until we can make any of this happen though. I imagine some basic building codes will be necessary for each city where all houses must include shielding and soundproofing etc. Then add the devices and most people would enjoy a decent level of privacy in their personal spaces whether that is a dorm, apartment or a house. It’s too easy to spy on people today. Nobody is prepared for it in any decent measure.

So far we have been relying on people not ruining each other’s privacy because they are decent. That went out the door when it all became a billion-dollar industry back in 2010. 13 years later, nothing has changed. Except maybe Snowden said, “Hey, you guys have no privacy. Bye.” in between, but it hardly made a difference.

Phillip August 9, 2023 2:25 AM

As a general rule, SSL/TLS will not work with UDP. I wish to keep my comments much more simple. I heard a “might-be-mercy-me.” The only muddling with UDP is choosing whether you choose to believe me. Okay, go ahead, look around. Who is your answer?

Clive Robinson August 9, 2023 4:29 AM

@ Dave Taylor, ALL,

Ahh, the iconoclastic Lisa gets in the act 😉

We don’t hear as much from Peter as we should. His diary on the Auckland blackouts and what was found is a rather interesting account of just how fragile modern cities are when run by corporates.

Winter August 9, 2023 4:37 AM

@Clive

just how fragile modern cities are when run by corporates.

We all tend to look the other way but efficiency and robustness are mutually exclusive.

Look also at the Texas Freeze and the Western U.S. energy crisis.

Who? August 9, 2023 6:23 AM

After years of work and refinement, the OpenBSD project finally enabled, in April last year (in OpenSSH release 9.0), a key exchange (KEX) algorithm believed to resist attacks enabled by cryptographically-relevant quantum computers, coupled with the classical X25519 ECDH KEX (this pair being named “sntrup761x25519-sha512@openssh.com”). This pairing ensures a reasonable degree of protection against quantum computers while ensuring that security is not worse than it was previously in case of a catastrophic failure of the quantum-resistant counterpart.
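
For reference, that hybrid KEX can also be pinned explicitly in ssh_config; on OpenSSH 9.0 and later it is already preferred by default, so this is only needed if you want to rule out a downgrade (a sketch, adjust for your own setup):

```
# ~/.ssh/config -- prefer the hybrid post-quantum/classical key exchange
Host *
    KexAlgorithms sntrup761x25519-sha512@openssh.com,curve25519-sha256
```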

WireGuard provides some sort of quantum computing resistance in the form of a pre-shared secret key too.

The real question we should ask ourselves is: can we trust NIST to provide a list of quantum-computing-resistant algorithms to be widely used? Let me be clear here: it is known that NIST working groups have been contaminated for decades by US intelligence agencies, so there is no way they will allow an algorithm that has not been backdoored by three-letter agencies.

If we want real protection against cryptographically-relevant quantum computers we need to write the software and deploy it ourselves, instead of listening to technical committees.

Clive Robinson August 9, 2023 7:52 AM

@ Winter, ALL,

Re : Sweet spots exist.

“We all tend to look the other way but efficiency and robustness are mutually exclusive.”

Efficiency and robustness are not mutually exclusive any more than efficiency and fragility are mutually inclusive.

I’ve a mantra about,

“Security -v- Efficiency”

Being the norm, but it does not have to be. All real world engineering has trade offs, and thus “sweet spots” exist.

Back in the 1950s US cars were death traps, because the way they were designed was on a production-cost basis that had entered a downward spiral, and no single manufacturer could “jump the groove”.

A change in legislation forced all manufacturers to change their behaviour. The groove got closed and engineers got freed up from the limitations. The result was that not only did safety go up, but manufacturing costs dropped, and as a consequence of lighter vehicles fuel usage went down [1].

Most ICT designers don’t even go for “efficiency” but for what marketing sees as “performance” via rather dubious “specmanship”. It’s why we have the “Xmas Gift that Keeps Giving” that came into view with “Meltdown” and, as we’ve just seen with AMD, opens up “side channels” through which information can be extracted.

For years I’ve pointed out that information leakage depends on the energy in and leaking out as well as the bandwidth of the side channel. Reduce any of them and information leakage goes down.

Thus the trick is to make those reductions without overly affecting performance. One way is by the use of parallel path systems and segregation.

[1] What many would see as things moving in the right direction got halted by the SUV “carve out”. If the US wants to decrease its frankly horrifying road death numbers, they have two basic ways.

1.1 Cut speed limits on all roads, down to 20 mph on city, suburban, and urban roads.
1.2 Get the weight and size of vehicles down and change body shapes.

They can also make all bodywork “impact absorbing” that is effectively energy absorbing crumple zones. Such bodywork is actually less costly to make than what is currently made.

Wannabe techguy August 9, 2023 7:55 AM

@who?
Regarding your comments about NIST, I’ve wondered and even asked here if they can be trusted and was answered that they can be. I’m not a tech pro at all, so I figured I must be wrong. It’s good to see someone else talk about it.

Winter August 9, 2023 8:01 AM

@Clive

Efficiency and robustness are not mutually exclusive any more than efficiency and fragility are mutually inclusive.

Robustness requires excess capacity. Efficiency requires no excess capacity. These do not go well together.

There might be corner cases, but I cannot think of one.

Phillip August 11, 2023 12:39 AM

Probability is defined differently in classical and quantum. If we understand it, we can probably defeat it. Now if, I might get my hands on one of these quantum gizmos.

Anyway, how many different entities are gatekeeping which truth is admissible? Maybe the slowness in finding out is defining a security end-run, yet again. We need to be immediately jaded, or we are risking it.

Why would this principle only change, with how many of us are watching? We are washing it, behind. Washing it behind is watching you stand up. I wash my laundry with Tide.

modem phonemes August 12, 2023 10:55 AM

Carver Mead from 2013 [1] –

“ Mead is also directing his energy into developing a unified framework to explain both electromagnetic and quantum systems. This is summarized in his book Collective Electrodynamics. Mead is skeptical, yet supportive, of current quantum computing projects.

“We don’t know what a new electronic device is going to be. But there’s very little quantum about transistors,” he says. “I’m not close to it, but I’m generally supportive of these people doing what they call quantum computing. People have got into trying to build real things based on quantum coupling, and any time people try to build stuff that actually works, they’re going to learn a hell of a lot. That’s where new science really comes from.”

Perhaps a different understanding of quantum phenomena will be needed [2] –

“In our work, we have discovered that wave function collapse, at least in a simple case, is implicit in the existing formalism,” he said, “as long as one allows the use of advanced as well as retarded electromagnetic potentials.”

“In other words, the explanation requires accepting the possibility that time can flow backward as well as forward. “

Mead’s approach to QM is outlined in his book Collective Electrodynamics [3].

  1. https://www.hpcwire.com/2013/11/25/carver-mead-quantum-computing-neuromorphic-design/
  2. https://www.geekwire.com/2020/physics-professor-tackles-one-mystery-quantum-mechanics-times-flow/
  3. Mead, C. Collective Electrodynamics (2002), MIT Press.

Winter August 12, 2023 11:36 AM

@modem

But there’s very little quantum about transistors

Strange quote. The principle of N- and P-doped silicon interfaces is all about “quantum”. I assume “quantum computing” is meant.

In our work, we have discovered that wave function collapse, at least in a simple case, is implicit in the existing formalism,

The real problem with the wave function collapse is that it is not part of quantum theory and is not understood at all. This is part of what Sabine Hossenfelder calls the measurement problem.

In other words, the explanation requires accepting the possibility that time can flow backward as well as forward.

Look up super-determinism. It is connected with the measurement problem. [1]

This all goes well over my head, but it makes quantum computing look simple. But it does not question quantum mechanics, just our interpretations.

[1] ‘https://arxiv.org/pdf/2010.01324.pdf

modem phonemes August 12, 2023 1:41 PM

@ Winter

principle of N and P doped silicium interfaces is all about “quantum”

I think Mead is saying only the most elementary aspects of quantum mechanics, such as tunneling, are used.

More on Cramer and Mead’s approach to wave function collapse [1].

  1. https://arxiv.org/pdf/2006.11365v2.pdf

modem phonemes August 12, 2023 3:13 PM

@ Winter

Some extracts from Cramer And Mead’s paper that seem to speak to Hossenfelder’s paper –

“The TI leans heavily on the standard formalism of Schrödinger wave mechanics. However, that formalism is conventionally regarded as not containing any mathematics that explicitly accounts for wave function collapse (which the TI interprets as transaction formation). Here we show that this is incorrect, and that the Schrödinger formalism with the inclusion of retarded and advanced 4-potentials can provide a detailed mathematical description of a “quantum-jump” in which what seems to be a photon is emitted by one hydrogen atom in an excited state and excites another hydrogen atom initially in its ground state. Thus, the mysterious process of wave function collapse becomes just a phenomenon involving an exchange of waves that is actually a part of the Schrödinger formalism.

The origin of statistical behavior and “quantum randomness” can be understood in terms of the random distribution of wave-function amplitudes and phases provided by the perturbations of the many other potential recipient atoms; no “hidden variables” are required. These findings might be viewed as a first step towards a physical understanding of the process of quantum energy transfer.“

Winter August 14, 2023 3:17 AM

@modem

Some extracts from Cramer And Mead’s paper that seem to speak to Hossenfelder’s paper –

The solutions offered to the measurement problem have always felt like kludges. QM is fundamentally a linear dynamic system with unitary evolution, i.e., it never loses information. Measuring a quantum parameter is neither linear nor unitary. After a measurement, information about the wave function is lost.

But the new solutions have the problem that they do not offer any new physics. There is no observable difference (yet) between, e.g., super-determinism and “collapse of the wave function”. So this all remains pure speculation.

Clive Robinson August 14, 2023 4:37 AM

@ Winter,

“Robustness requires excess capacity. Efficiency required no excess capacity.”

Err, neither statement is true; all you’ve defined is “a point on a line”.

Robustness requires “only sufficient” capacity to function continuously at maximum load.

Likewise,

Efficiency requires “only sufficient” capacity to function continuously at maximum load.

Thus the “capacity” will be the same, that is provide the robustness required to meet maximum efficiency.

You see this in power electronics design where you “Design to the load line”. Where you select “a load” based on the device manufacturers maximum power ratings curve supplied in their data sheet.

However I’ve often designed significantly beyond the load line in the “storage quadrants”, using both the maximum voltage and current ratings combined. The trick is to use them in such a way that maximum current occurs at minimum voltage and the reverse. That is, you use the device as a “true switch” in a digital, not analog, class, like Class E. What you have to ensure is that the device dissipation over time stays inside the manufacturer’s maximum power curve.

Winter August 14, 2023 5:56 AM

@Clive

Robustness requires “only sufficient” capacity to function continuously at maximum load.

Robustness requires capacity to function continuously under a predefined range of conditions that cover expected and unexpected outliers

Efficiency requires “only sufficient” capacity to function continuously at maximum load.

Efficiency requires minimal capacity to function continuously under a small range of expected conditions.

Robust systems are optimized for best worst case functioning. Efficient systems are optimized for best average functioning.

You cannot optimize for best worst case and average functioning at the same time.

I do not see the relevance of your power example. Did you optimize? And to what outcome did you optimize?

Paul Koning September 12, 2023 2:39 PM

Mr. Robinson’s comment on “may never exist” is a good one in my view.
We know there are algorithms for breaking earlier systems such as RSA, given a suitably large quantum computer (Shor’s algorithm). But the devil is in “suitably large”.
As I recall, Shor’s algorithm requires 2 qubits per bit to be factored, so for a decent-size RSA key that’s 4-8k qubits. But those are error-corrected qubits. Now we’re getting past the theory I understand more than very vaguely. Quantum ECC is not like classical ECC, which has a small fractional overhead — it’s factors of 9 or 100 or other large numbers. How large? I haven’t seen clear answers. If calculating with 8k qubits requires an ECC factor of 100, that’s almost a megaqubit. Is that plausible? If yes, when?
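
As a rough back-of-the-envelope, using the figures in this comment (2 logical qubits per modulus bit, and the factor-of-100 error-correction overhead as one of the quoted possibilities):

$$
2 \times 4096 = 8192 \ \text{logical qubits}, \qquad 8192 \times 100 \approx 8.2 \times 10^{5} \ \text{physical qubits},
$$

which is indeed approaching a megaqubit; a factor of 9 instead gives roughly $7 \times 10^{4}$.
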
Somewhat related: is there a scale point beyond which the coherence required for quantum computers is no longer achievable? For example, is there some sort of QC Heisenberg principle that makes scale beyond a certain point not possible?

Winter September 13, 2023 1:28 AM

@Paul Koning

Quantum ECC is not like classical ECC in that it has a small fractional overhead — it’s factors of 9 or 100 or other large numbers.

Work is done to improve precision and reduce the ECC requirements.

is there a scale point beyond which the coherence required for quantum computers is no longer achievable?

If you look at the complexity of the quantum channeling inside the photosynthesis center in green plants, I would not bet on such a scale cutoff.

A Bose-Einstein or Ising spin system with macroscopic dimensions is a single coherent evolving quantum mechanical system without a maximum size. These are not computers, but there is nothing in QM that indicates that there is a super-scale limit to coherent quantum evolution (which is what computing is).

Clive Robinson September 13, 2023 8:18 AM

@ Winter, Paul Koning,

Re : QC ECC

I’m not sure you two guys are talking about the same issue.

The thing about the plants is each part is independent of each other.

In the QC however all the parts are interdependent.

Winter September 13, 2023 8:58 AM

@Clive

The thing about the plants is each part is independent of each other.

But the light harvesting complex is one big multi-part quantum machine to collect photons and channel the energy to the photosynthesis reaction center.

Have a look:

‘https://www.researchgate.net/figure/X-ray-crystallographic-structure-of-photosystem-II-PS-II-from-T-vulcanus-A-View-of_fig8_333710396

In the QC however all the parts are interdependent.

So are the parts in a Bose Einstein condensate. People are already looking for ways to use them as a sort of quantum computer.
‘https://nyu.timbyrnes.net/research/quantum-information-using-bose-einstein-condensates/

Clive Robinson September 13, 2023 12:24 PM

@ Winter,

“But the light harvesting complex is one big multi-part quantum machine…”

In each cell where light harvesting goes on, its function is independent of all the other cells.

The commonality is nutrients in and what are effectively sugars out, both of which are very much non-quantum systems of chemistry and basic physics that you can build and demonstrate at home as a school science project.

The potentially 10^6 quantum logic gates / Qbits in a useful QC have to work together and in synchronisation without losing coherence for a significant period of time. So the quantum gates / Qbits are not independent of each other.

Further, with a QC there is the question of coherence distance, that is d = tC/e, where “C” is the speed of light, “e” covers the system’s non-free-space effects, and “t” is the time coherence can be usefully maintained for the Qbits in question. Depending on the Qbit, t ranges from just a few picoseconds to just a few microseconds in systems people have considered. The Qbit traps are generally many thousands of times bigger than the Qbit itself, which is quite a lot of effective real estate even for 1/1000th of the number of Qbits required.
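
Taking that formula at face value with e = 1 (an idealization), the numbers are roughly:

$$
t = 1\ \mu\text{s}: \; d \approx (3\times10^{8}\ \text{m/s})(10^{-6}\ \text{s}) = 300\ \text{m}, \qquad t = 1\ \text{ps}: \; d \approx 0.3\ \text{mm}.
$$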

So pardon me if, based on the information we have, I’ve my doubts on the viability of the sort of QC needed to gain benefit when applied to modern RSA key sizes, or the larger AES “extended” key sizes.

But as I’ve also indicated, other conventional logic is catching up fast. Logic gates clocking at 10 GHz are an accepted thing, but not generally useful in the design of “general purpose CPUs”, which is where the use of “hardware algorithms” using FPGAs and “chain algorithms” without feedback comes into play. It’s not hard to envisage a 10,000-times speed improvement for some algorithms over what can be achieved with a general purpose CPU.

So with the QC achievement curve being both shallow and slow, and the hardware algorithms being far from shallow or slow… The point at which QC is “going to pay the rent” is moving away faster than QC achievements are being established.

Hence my original comment that @Paul Koning referenced above.

I’ll tell you what: there are at least a couple of people who read and post to this blog whose work dependency is way closer to QC than mine; I’ll leave it to them to express their viewpoints.

Winter September 13, 2023 12:49 PM

@Clive

Each cell where light harvesting goes on, it’s function is independent of all the other cells.

If you mean “without a quantum bridge” I agree. But a molecular structure with a dozen quantum “dots” within “reach”, working at room temperature does show what can be done in the real world. And then we have no idea yet what can be done with Bose Einstein condensates and Ising spin glasses.

So pardon me if, based on the information we have, I’ve my doubts on the viability of the sort of QC needed to gain benefit when applied to modern RSA key sizes, or the larger AES “extended” key sizes.

I won’t hold my breath either. But I will not bet against it happening. People who said quantum technology X is impossible/impractical, have too often been forced to eat their words.

But really, breaking RSA looks to me as the least interesting application of QC.
