NIST Announces First Four Quantum-Resistant Cryptographic Algorithms

NIST’s post-quantum cryptography standardization process is entering its final phases. NIST has announced the first four selected algorithms:

For general encryption, used when we access secure websites, NIST has selected the CRYSTALS-Kyber algorithm. Among its advantages are comparatively small encryption keys that two parties can exchange easily, as well as its speed of operation.

For digital signatures, often used when we need to verify identities during a digital transaction or to sign a document remotely, NIST has selected the three algorithms CRYSTALS-Dilithium, FALCON and SPHINCS+ (read as “Sphincs plus”). Reviewers noted the high efficiency of the first two, and NIST recommends CRYSTALS-Dilithium as the primary algorithm, with FALCON for applications that need smaller signatures than Dilithium can provide. The third, SPHINCS+, is somewhat larger and slower than the other two, but it is valuable as a backup for one chief reason: It is based on a different math approach than all three of NIST’s other selections.
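The signature algorithms all expose the familiar keypair/sign/verify pattern. As a rough sketch of that pattern only, here it is using Ed25519 from the Python cryptography package as a classical stand-in; Dilithium, FALCON, and SPHINCS+ present the same interface shape, generally with larger keys and signatures:

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    signing_key = Ed25519PrivateKey.generate()
    verify_key = signing_key.public_key()

    document = b"contract rev 3, signed remotely"
    signature = signing_key.sign(document)

    try:
        verify_key.verify(signature, document)   # raises if the signature is bad
        print("signature valid")
    except InvalidSignature:
        print("signature invalid")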

NIST has not yet finished choosing public-key encryption/key-establishment standards. The remaining candidates under consideration are BIKE, Classic McEliece, HQC, and SIKE.

I have a lot to say on this process, and have written an essay for IEEE Security & Privacy about it. It will be published in a month or so.

Posted on July 6, 2022 at 11:49 AM • 38 Comments

Comments

Jedward July 6, 2022 3:50 PM

Why do we need a quantum resistant algorithm? I thought it was said somewhere that existing algorithms were resistant to quantum decryption for any reasonable time frame. Or something like that. What is this expected to solve?

John July 6, 2022 4:12 PM

hmm….

I have a hard enough time understanding what I have written in plain text!!

John

Celos July 6, 2022 5:00 PM

While I am looking forward to your analysis of the process, what I am really interested in is how much potential uncertainty we have in the respective mathematics of the different algorithms, i.e. how solid the assumptions needed for security are. Is this pretty well established among the respective experts, or should we wait for 10 years of targeted cryptanalytic interest before trusting these algorithms? I am not a mathematician and it is really hard to evaluate where the state of things is in the techniques used.

SpaceLifeForm July 6, 2022 5:56 PM

What’s that smell? Swamp gas?

Smells like a Patent Trap coming thru the Backdoor.

See @hashbreaker as he does not tweet much.

‘https://blog.cr.yp.to/20220129-plagiarism.html

“If the New Hope developers fell into the trap of optimizing a post-quantum system threatened by a patent, and Google fell into the trap of deploying a post-quantum system threatened by a patent, then how are we supposed to avoid the same trap?”

SpaceLifeForm July 6, 2022 6:40 PM

Can you smell that smell?

Intentionally allowing this to be turned into a clickable link because your copy-pasta would not be al dente.

https://www.twitter.com/hashbreaker/status/1544456469038645248?cxt=HHwWgIDSgbm-ge8qAAAA

Looks like NIST didn’t actually nail down the patent buyouts before announcing Kyber’s selection, so now the patent holders have even more power. But, wait, NIST’s expert negotiators say that they “may consider” switching to NTRU if agreements aren’t signed “by the end of 2022”.

SpaceLifeForm July 6, 2022 8:15 PM

@ Jedward

re: What is this expected to solve?

Nothing that you will ever care about.

I am confident that this is about chasing the Quantum Crypto Ghost thru the swamp.

Stick to modern Cryptography. Use a good SafeCurve if possible, and deprecate use of old protocols that have been attacked. Try to use ECC and TLS1.3 as soon as you can.

Do not be scared of the Quantum Crypto Ghost. It is just swamp gas.

https://www.schneier.com/blog/archives/2022/05/the-nsa-says-that-there-are-no-known-flaws-in-nists-quantum-resistant-algorithms.html

Clive Robinson July 7, 2022 2:03 AM

@ Jedward, SpaceLifeForm, ALL,

Why do we need a quantum resistant algorithm?

It is “algorithms” plural not singular.

But importantly, it’s ONLY to replace those algorithms that are based on what are currently ASSUMED to be hard mathematical problems: factoring large numbers and finding discrete logarithms. Some of these affected algorithms are DH, RSA, DSA, and ECDH, which often get lumped under “asymmetric algorithms” and are also used in “Public Key” systems.

These are used for “key negotiation” or “key transfer” for traditional “block ciphers” like AES, and for “Digital Signatures”, whose most common but mostly unseen use is “code signing” of patches and updates to fix security vulnerabilities in existing code bases, and increasingly to “Secure the Digital Supply Chain”, which is rife with proven classes of critical vulnerabilities.

The problem is that a new assumption came along in the mid 1980’s, when it was theoretically shown that there were “Quantum Algorithms” in existence that would give an exponential speed-up for certain types of computation. This was done by physicist David Deutsch FRS, of Oxford University in the UK.

His pioneering work founded the field of quantum computation: he formulated not just a description of a quantum Turing machine, but also an algorithm designed to run on a quantum computer that would deliver that exponential speed-up.

So just under forty years ago an assumption became “theoretically possible”, with the only assumption left being “Could it be made practically possible?”.

But… there was an issue then that still exists today, which is that “Quantum Computers” (QC) are no faster than “Classical Computers” (CC) for many general-purpose algorithms.

Which begs the question,

“So why even build Quantum Computers?”

Well, you could say it was Netscape’s fault. They came along, took the ideas of the RSA public key algorithm, and built the prototype “Secure HTTP” which we now know as “https://…” or SSL. That enabled the nascent Internet, and the curiosity upon it that was the “World Wide Web” (W3), to handle financial transactions securely. It turned W3 into a “Market Place” for every crook, shyster, banker, financier, and psychopath with any ability, to fill the “old bottles” of existing crimes with the “new wine” of fast flowing money across international borders and out of any legal jurisdiction, thus making them not prosecutable. Yup, “Home Banking” in the late 1990’s was “crime central”… Later, traditional purchasing markets developed, with the likes of Amazon getting in to take the “first to market takes all” prize.

SSL was now an integral part of billions of transactions, and it was theoretically very vulnerable to “Shor’s Algorithm” in its “Key Exchange/Transfer/Negotiation” mechanisms, but not in the data encryption algorithms, which were only weakened by “Grover’s Algorithm”.

So the idea of QC for attacks against RSA and later “Elliptic Curve” (EC) became very very attractive to crooks, criminals and their bedfellows the SigInt and other “Intelligence Community” (IC).

But also to physicists and even biologists looking for Nobel prizes, and mathematicians looking for their equivalent…

So… if QC becomes practical then nearly all “consumer” crypto becomes vulnerable (but traditional Diplomatic and Military crypto does not).

So people finally started getting twitchy and the idea of “Post-Quantum Cryptography” (PQC) came about. That is, to find replacement algorithms to do “Key Exchange” and “Digital Signing” without the underlying mathematical problems of “factorization” and “discrete logarithms”.

Those problems are ASSUMED to be easy for future quantum computers to solve because of Shor’s algorithm, which would render existing web usage and code signing unsafe, since the RSA, DSA, DH, and ECDH algorithms within them would be broken should quantum computing become a practical reality.

All because we’ve not been able to solve “the secure channel” problem needed to establish a “root of trust” / “shared secret” underlying all cryptography used in the protection of information for the three most basic tasks,

1, Communications
2, Storage
3, Processing.

So, on the assumption that quantum computing and the algorithms we have come up with so far will not be the end of the trail, this PQC issue will be gone through again in the not too distant future, maybe in only a quarter to half a century at most.

With regards,

I thought ot was said somewhere that existing algorithms were resistant to quantum decryption for any reasonable time frame.

Resistant but not immune. Grover’s Algorithm does affect symmetric crypto like block ciphers, but not to anywhere near the effect Shor’s Algorithm has on asymmetric algorithms based on “factorization” and “discrete logarithms”.
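As a rough back-of-the-envelope illustration: Grover’s algorithm turns a brute-force search over 2^n keys into roughly 2^(n/2) quantum steps, which is why the common advice is simply to double symmetric key lengths. A minimal sketch of the arithmetic:

    def effective_security_bits(key_bits: int, grover: bool = True) -> int:
        """Very rough cost exponent of brute-forcing a key_bits-bit symmetric key."""
        return key_bits // 2 if grover else key_bits

    for k in (128, 192, 256):
        print(f"AES-{k}: ~2^{effective_security_bits(k)} quantum search steps, "
              f"vs ~2^{k} classical brute-force trials")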

But… mankind is an inquisitive and inventive omnivore and we beaver on at a quite alarming rate. Neither Shor’s nor Grover’s algorithm is the “be all and end all” of QC algorithms, so all we can really say is,

“Keep a weather eye on this space”

Because a lot of current PQC ideas and algorithms are based on lattice systems, and one or two quite alarming shocks came up during the NIST PQC process, which might be why there is no “One Ring” contender in sight.

I hope that helps a little.

Gert-Jan July 7, 2022 6:43 AM

Why do we need a quantum resistant algorithm?

At some point in time, intelligence agencies might have a quantum computer capable of finding the private key of your encrypted communications in a reasonable timeframe.

The question for you is, should the information that’s transferred today still be secret by the time this quantum decryption is viable? Because a lot of encrypted information is currently captured and stored, in the hopes of decrypting it in the future.

The next question is, will switching to these newly proposed algorithms prevent this?

First, you want them to be just as good as the currently used algorithms. There is always the risk of serious flaws that are found after introduction.

Second, will it actually protect against quantum computing. Quantum decryption will undoubtedly evolve over time. Can its progress be foreseen years in advance?

If quantum computing delivers on its promise, expect all your current and historic electronic communications to be compromised.

Winter July 7, 2022 12:25 PM

@Gert-Jan, Jedward, SpaceLifeForm, ALL,

Why do we need a quantum resistant algorithm?

As Gert-Jan already explained, quantum computing is coming, in a decade, or two decades, and changing the cryptographic infrastructure takes time, a lot of time. Also, everything that is communicated today is stored wholesale for later decryption.

We do not know whether or when it will happen. But if it is theoretically possible, someone somewhere will do it. When people can build gravitational wave detectors, quantum computers are not that much more extravagant.

This is all well described.

More reading
Paywalled Nature article with an open list of good background references:
Transitioning organizations to post-quantum cryptography
ht-tps://www.nature.com/articles/s41586-022-04623-2

Open Access:

Identifying Research Challenges in Post Quantum Cryptography Migration and Cryptographic Agility
ht-tps://arxiv.org/abs/1909.07353

On the State of Post-Quantum Cryptography Migration
ht-tps://dl.gi.de/handle/20.500.12116/37746
PDF: ht-tps://dl.gi.de/bitstream/handle/20.500.12116/37746/J1-2.pdf?sequence=1&isAllowed=y

Clive Robinson July 7, 2022 2:34 PM

@ Winter,

“As Gert-Jan already explained, quantum computing is coming, in a decade, or two decades, and changing the cryptographic infrastructure takes time, a lot of time. Also, everything that is communicated today is stored wholesale for later decryption.”

Three statements.

Whilst the last is mostly true for digital consumer communications operated by “the voters” in some parts of the world, other communications are not recorded.

Also, not all parts of the world possess the equipment to store these communications… As I occasionally point out, things only happen “if the laws of physics allow”. But… we should also note “if the excess resources are available”, etc.

This does however leave quite a lot of “Wriggle Room” for the thoughtful or inventive mind. Even what feels like the most oppressive surveillance is not omnipotent or omnipresent.

The second point, that “things take time”, is one of those universal constants with what often feels like near infinite variability. Thus the “How long is a piece of string?” question, which is more frequently an expression of exasperation.

But the first statement, that “quantum computing is coming” at any time let alone within two decades, is shall we say “open”, so it is currently an assumption.

We know there are things that “theory allows” but “practice disdains”. Such as the ability to travel at the speed of light: we know the minimum energy required, but currently not how to attain or control it for realistic masses in human terms.

To be realistic we have to acknowledge that whilst quantum computing is possible at a small scale, the ability to scale it to useful levels may well be constrained by limits we cannot currently cross, or may never be able to.

To see why this might be, consider the issue of “Signal to Noise”: at some point the signal level becomes comparable to the mean level of noise in any specified bandwidth. At that point the signal is of the same magnitude as, and summed with, what is very probably “random” noise. The signal is effectively indeterminate, and recovering it requires a change in the ratio of signal to noise, which usually carries penalties such as decreasing bandwidth in some way, often requiring an exponential rise in difficulty, time or both.
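A minimal sketch of that trade-off, using Shannon’s capacity formula C = B * log2(1 + S/N) for an idealized noisy channel, shows how quickly usable capacity shrinks as the signal sinks toward the noise floor:

    from math import log2

    def capacity_bits_per_s(bandwidth_hz: float, snr_linear: float) -> float:
        """Shannon capacity C = B * log2(1 + S/N) for an idealized noisy channel."""
        return bandwidth_hz * log2(1 + snr_linear)

    bandwidth = 1e6  # 1 MHz, arbitrary example figure
    for snr_db in (20, 10, 0, -10, -20):
        snr = 10 ** (snr_db / 10)
        print(f"SNR {snr_db:+4d} dB -> ~{capacity_bits_per_s(bandwidth, snr):,.0f} bit/s")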

Thus the question arises of whether attacks on cryptographic systems will be even close to practical beyond “toy” systems, not just with the technology we currently possess but with technology that we can build.

So a second question arises: “Are we throwing the baby out with the bath water?” That is, chucking away what we currently have in exchange for something untested and possibly having hidden weaknesses (Rainbow).

Thus using both in hybrid systems, not one or the other, would be prudent behaviour…

It’s certainly the way I would consider moving, in preference to the either/or option.
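A minimal sketch of such a hybrid construction, assuming the Python cryptography package for the classical X25519 half; the post-quantum half here is only a random placeholder standing in for whatever a Kyber-style KEM would eventually return:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Classical half: ordinary X25519 key agreement.
    alice_sk = X25519PrivateKey.generate()
    bob_sk = X25519PrivateKey.generate()
    classical_ss = alice_sk.exchange(bob_sk.public_key())

    # Post-quantum half: placeholder bytes standing in for a Kyber-style
    # encapsulated shared secret (hypothetical, no real PQC library used here).
    pq_ss = os.urandom(32)

    # Hybrid: feed the concatenation of both secrets through one KDF, so the
    # session key stays safe as long as either half remains unbroken.
    session_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid classical+pqc demo",
    ).derive(classical_ss + pq_ss)

    print(session_key.hex())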

SpaceLifeForm July 7, 2022 10:01 PM

@ TimH

There is no pot of gold at the end of the Rainbow.

Scroll back 4 posts in this thread for context.

‘https://groups.google.com/a/list.nist.gov/g/pqc-forum/c/MYmlJkXQDRI/m/wKhZ-fuAAwAJ

SpaceLifeForm July 7, 2022 10:45 PM

@ Clive

I would be very leery about a hybrid approach for some time.

Not only the Patent Trap angle, but we have seen how double encryption has failed in the past (CIA, China).

It seems too early to think about chasing the Quantum Crypto Ghost thru the swamp. There may be sharp thickets that lead to excessive bleeding.

SpaceLifeForm July 7, 2022 11:48 PM

@ Clive, ALL

Swamp Screams

Can you see what this is telling you?

‘https://cloud.google.com/blog/products/identity-security/how-google-is-preparing-for-a-post-quantum-world

network products that were incompatible with post-quantum TLS

lurker July 8, 2022 12:47 AM

@SLF, “Can you see what this is telling you?”

It tells me I’m still in the wrong business. I can’t get excited by all the hand-waving, -wringing, crystal ball stuff about post-quantum when we are plainly still in pre-quantum. Until quantum actually happens the bogey-men are creating a lot of sound and fury, signifying nothing.

The merchants of doom are basing their predictions on some mathematical constructs. Pretty to watch, but the rubber will have to get a lot closer to the road to get me off the couch.

Winter July 8, 2022 1:25 AM

@Clive
I am aware of the Quantum Computing Hype [1], but almost every other technology that changed our life started as a hype, from printing press to lasers to the Internet. And many hypes prove to be duds. The fact that it is a hype does not tell us much.

@Clive

Also not all parts of the world poses the equipment to store these communications…

The point of the panopticon is not that everybody is watched 24/7, but that everybody can be watched and no one knows whether he is watched and when.

@Clive

The second point of all “things take time” is one of those universal constants with what often feels like near infinite variability.

The articles (e.g., the Nature link) give a pretty decent estimate of the time it will take to overhaul the cryptographic infrastructure (including debugging). That time is long, over a decade.

We do not know how long it will take to make quantum computing useful, but progress is rather steady. The quantum communication distance record is currently at 600 km over fiber [1] and over 4,000 km in space [2]. Quantum qubits can already be exchanged between distant nodes [3] allowing a qubit network. Working quantum computations currently still use only dozens of noisy qubits.

With the current speed of progress and a lot of guesswork, predictions can be made of possible progress scenarios [5]:

We also apply a log-linear regression on the metrics to provide a tentative
upper bound on how much progress can be expected over time. Within the
(generally optimistic) assumptions of our model, including the key assumption that exponential progress in qubit count and gate fidelity will continue,
we estimate that that proof-of-concept fault-tolerant computation based on
superconductor technology is unlikely (<5% probability) to be exhibited before 2026, and that quantum devices capable of factoring RSA-2048 are unlikely (<5% probability) to exist before 2039. It is of course possible that these milestones will in fact be reached earlier, but that this would require faster progress than has yet been seen.

@Clive

To see why this might be, consider the issue of “Signal to Noise” at some point the signal level becomes comparable to the mean level of noise in any specified bandwidth.

Any discussion on viable Signal-to-Noise ratios should take a lesson from gravity wave detection. If you can successfully detect gravity waves (as is done), I would not bet a cent against them being able to use noisy qubits.

I would also like to point to the quantum mechanics behind some biological systems that show what has already been achieved in hot, wet, and noisy systems, like plant and bacterial photosynthesis [6].

To our surprise, however, the bacteria were in fact performing a different type of quantum algorithm, called a quantum walk.
A random walk is a transport mechanism in which the walker takes steps in random directions through the network or structure to be traversed. In a classical random walk, the position of the walker gradually diffuses throughout the network. Because the walker is equally likely to move in any direction, after time t the walker has moved from its original position by a distance proportional to √t. In a quantum walk, by contrast, the walker takes a quantum superposition of paths through the network. These paths can exhibit quantum interferences. If the interference is destructive, the walker becomes stuck or ‘localized’ within the network. By contrast, if the interference is constructive, the walker moves from its original position by a distance proportional to t, a considerable improvement over the classical random walk. The coherence demonstrated by the Fleming group showed that the excitons were performing a quantum walk, at least over short distances.

References:
[1] ht-tps://www.technologyreview.com/2022/03/28/1048355/quantum-computing-has-a-hype-problem/

[2] ht-tps://www.toshiba.eu/pages/eu/Cambridge-Research-Laboratory/toshiba-announces-breakthrough-in-long-distance-quantum-communication

[3] ht-tps://phys.org/news/2021-01-world-quantum-network.html

[4] ht-tps://english.elpais.com/science-tech/2022-05-26/teleportation-breakthrough-paves-the-way-for-quantum-internet.html

[5] ht-tps://www.researchgate.net/publication/344234562_Forecasting_timelines_of_quantum_computing

[6] ht-tps://iopscience.iop.org/article/10.1088/1742-6596/302/1/012037

Winter July 8, 2022 1:57 AM

Formatting errors dropped some text. Correction:

We do not know how long it will take to make quantum computing useful, but progress is rather steady. The quantum communication distance record is currently at 600 km over fiber [1] and over 4,000 km in space [2]. Quantum qubits can already be exchanged between distant nodes [3] allowing a qubit network. Working quantum computations currently still use only dozens of noisy qubits [4].

With the current speed of progress and a lot of guesswork, predictions can be made of possible progress scenarios [5]:

We also apply a log-linear regression on the metrics to provide a tentative
upper bound on how much progress can be expected over time. Within the
(generally optimistic) assumptions of our model, including the key assumption that exponential progress in qubit count and gate fidelity will continue,
we estimate that that proof-of-concept fault-tolerant computation based on
superconductor technology is unlikely (<5% probability) to be exhibited before 2026, and that quantum devices capable of factoring RSA-2048 are unlikely (< 5% probability) to exist before 2039. It is of course possible that
these milestones will in fact be reached earlier, but that this would require
faster progress than has yet been seen.

@Clive

To see why this might be, consider the issue of “Signal to Noise” at some point the signal level becomes comparable to the mean level of noise in any specified bandwidth.

Any discussion on viable Signal-to-Noise ratios should take a lesson from gravity wave detection. If you can successfully detect gravity waves (as is done), I would not bet a cent against them being able to use noisy qubits.

SpaceLifeForm July 8, 2022 3:13 AM

@ lurker, Clive, ALL

pre-quantum network products that were incompatible with post-quantum TLS

Think about it. Why would these pre-quantum network products be incompatible in the first place?

Don’t switches and routers just transport bit strings (i.e., packets)?

Explain to me why the network kit should even know what TLS algorithm is being used (pre or post quantum) and why should it even care?

Think about it. To me, it is just confirmation of [REDACTED].

Maybe the scam to come will be marketing PQC compatible routers.

Clive Robinson July 8, 2022 6:59 AM

@ SpaceLifeForm,

Maybe the scam to come will be marketing PQC compatible routers.

And much else besides… Should we ask that eternal question,

“Is PQC the new IoT?”

With regards,

Explain to me why the network kit should even know what TLS algorithm is being used (pre or post quantum) and why should it even care?

That is a “Draw up a chair and get comfy” question.

As I’ve mentioned before there is a line at one end “basic components” at the other “complex systems”. At the basic component end you have “zero security” at the complex systems end “high level security” (but not 100% security).

But that line, useful as it is, has another associated with it for security, which is “cost” in terms of resources used or being used.

Thus “security” is a composite of multiple constituent parts that effectively interact, in different ways at different times.

The logical conclusion is that “security” is not innate but a construct that has to be built into systems; importantly, the systems have to be beyond a certain degree of complexity, and have sufficient resources, to give security.

That is, each bar of a cage does not offer security, nor do all the bars required to build a cage give security. It is only when the bars are held in a complex pattern that the cage is formed as a system, which then gives security.

So, “no”, no base component parts need have knowledge of security; but at some point, when the system they are built into reaches an appropriate level of complexity, security becomes meaningful.

Mauricio July 8, 2022 7:19 AM

Not only the Patent Trap angle, but we have seen how double encryption has failed in the past (CIA, China).

As long as all cryptographic state (keys, IVs) is independent, double-encryption is not going to reduce security. If it did, anyone could reveal secrets by randomly re-encrypting data.

(Of course, we should not assume it will increase security—watch for hucksters claiming “512-bit encryption”—but it seems like a good idea, when using a novel algorithm, to combine it with something we have more confidence in.)
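A small sketch of that layering, using Fernet from the Python cryptography package with two independently generated keys; stripping the outer layer only exposes the inner ciphertext:

    from cryptography.fernet import Fernet

    inner_key = Fernet.generate_key()
    outer_key = Fernet.generate_key()   # independent of inner_key

    inner = Fernet(inner_key)
    outer = Fernet(outer_key)

    message = b"the quick brown fox"
    double_ct = outer.encrypt(inner.encrypt(message))

    # Decryption peels the layers in reverse order.
    assert inner.decrypt(outer.decrypt(double_ct)) == message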

Mauricio July 8, 2022 7:28 AM

The article posted by TimH makes this dangerous claim: “Put simply, the security of an algorithm is derived from the fact that nobody has figured out a way to break it in a reasonable amount of time. If it takes three million years on a supercomputer to break a cryptographic algorithm, it’s probably safe for use today.”

That’s not true at all. Rainbow may well have taken 3 million years to break, the week before the shortcut was found. Now it’ll take a few minutes. No security was derived from the fact that nobody had figured it out before then.

Emoya July 8, 2022 9:24 AM

@SpaceLifeForm, Clive

I have worked professionally with enterprise mobile devices since the mid-90s, back when Palm OS, Symbian, and other early mobile systems were the primary players.

Each new wireless security protocol (WEP, WPA, etc.) was strongly tied to the combination of hardware radio device and OS software. It was not possible to simply upgrade an existing device with a new OS version to implement a newer security algorithm. Taking advantage of the latest security required the replacement or, at a minimum, physical upgrade of all outdated devices. Since these devices require direct access to enterprise networks to achieve their desired function, they also introduce weak points in the network infrastructure and therefore needed to be periodically refreshed to maintain the security status quo.

At a fundamental level, anything which is software is vulnerable to software-based attacks. The only way to minimize vulnerability to such attacks is to isolate some of the critical security functionality. Traditionally, this has been done by implementation in hardware and is one reason why HSMs are prevalent in enterprise networks. Modern mobile devices have found a measure of success via trusted execution environments. Also, nearly all modern cryptographic algorithms are designed to be efficient when implemented in either software or hardware because hardware implementations will always outperform their software counterparts, and performance is crucial for high volume throughput applications, such as switches and routers.

Emoya July 8, 2022 9:25 AM

@SpaceLifeForm, Clive

Cryptography has been, and likely always will be, temporal. While provably secure algorithms exist (e.g. the one-time pad, quantum key distribution), realistic implementations thus far do not, so the world relies upon computationally secure algorithms. All computational security erodes over time as new attacks are found and computational capabilities increase. This erosion has been accelerating and will continue to do so. The conservative assumption must be made that every non-provably-secure algorithm has the potential of being completely broken on any day, and may possibly have been so “yesterday” with the news not yet public. To keep pace, innovation necessarily must continue to produce stronger algorithms and protocols based upon varying underlying principles, and these newer algorithms/protocols must be implemented; PQC is no different.

Cryptographic agility is a relatively new priority in system design due to the growing pace of computational security erosion, and because it adds significant overhead in every aspect of system development. Successful and secure implementation is not trivial, even for password hashing, one of the simplest of cryptographic processes. It not only increases complexities in system design, but also data storage and processing requirements as well as execution time.

Taking cryptography, something that is already inherently extremely complex and, quite frankly, beyond the ability of most people to implement properly, then adding yet another layer to make it “swappable”, only exacerbates the security vulnerability issues we face today. It is not quite “rolling your own crypto”, but it is in essence rolling your own crypto implementation interface, and every interface is a potential chink in the armor. As such, only qualified cryptography/security specialists should design and implement crypto-agile system components, but since every hardware and software development shop needs to do so, this is an unrealistic expectation. Due to the related costs associated with extended design, development, and testing, it is no wonder that businesses have not prioritized cryptographic agility. It is not surprising that nearly all, if not all, currently deployed hardware was not designed, for several reasons, with cryptographic agility in mind.
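A minimal sketch of the kind of algorithm-tagged record that crypto-agile password storage relies on, using only the Python standard library; the stored tag lets a scheme be added or retired later without re-architecting storage:

    import hashlib, hmac, os

    # Registry of supported password-hashing schemes, keyed by a stored tag.
    SCHEMES = {
        "pbkdf2-sha256-600k": lambda pw, salt: hashlib.pbkdf2_hmac("sha256", pw, salt, 600_000),
        "pbkdf2-sha512-210k": lambda pw, salt: hashlib.pbkdf2_hmac("sha512", pw, salt, 210_000),
    }
    DEFAULT = "pbkdf2-sha256-600k"

    def hash_password(password: bytes, scheme: str = DEFAULT) -> str:
        salt = os.urandom(16)
        digest = SCHEMES[scheme](password, salt)
        return f"{scheme}${salt.hex()}${digest.hex()}"

    def verify_password(password: bytes, record: str) -> bool:
        scheme, salt_hex, digest_hex = record.split("$")
        digest = SCHEMES[scheme](password, bytes.fromhex(salt_hex))
        return hmac.compare_digest(digest.hex(), digest_hex)

    rec = hash_password(b"hunter2")
    assert verify_password(b"hunter2", rec)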

Clive Robinson July 8, 2022 11:53 AM

@ Winter,

With regards to “hype”: as I said, quantum computing has existed theoretically since the mid 1980’s, getting on for forty years ago. Depending on who you believe, either very little or a great deal has happened in that time.

The problems are two fold. The first is the ASSUMPTION that a QC with a sufficient number of qubits will ever be practically possible.

The second is the fact that at the moment we have very few QC algorithms, Shor’s and Grover’s being the two that we are getting uptight about. And we have no idea whether the algorithms coming through this competition are any good; more probably not. I’ve warned in the past about fixing an instance of an attack and not fixing the class of attack. Effectively the competition addresses just an instance of attack, and most solutions use the same approach –lattices– just dressed up in different ways. Thus the probability of it being “fragile” rather than “robust” is high, which could be extremely problematic.

Years ago, not long after the AES competition got started, I pointed out that NIST’s time would be better spent coming up with a standard framework into which crypto algorithms and modes could be plugged in or removed easily, because of this “all eggs in one basket” issue. I pointed out that it was needed so that upgrading in situ on a less than 20-25 year life cycle became possible in all goods. Especially those that were to be found “in people”, in the “Utility Supplies” of the homes they lived in, and in the factories and plant automated by “Industrial Control” where nearly everything people lived with was made.

So the NIST focus is still not really where it should be, and things are just getting more embarrassing for them as time moves on.

So I’m not talking about “hype” but “practical engineering solutions” and why they should be standardized.

Which brings me onto your,

The point of the panopticon is not that everybody is watched 24/7, but that everybody can be watched and no one knows whether he is watched and when.

The “panopticon” idea actually fails where it cannot be supported by the environment people are in. The authorities are not omnipotent or omnipresent, and many people, especially those we regard as crooks, know this, thus play to this.

The Panopticon was a thought experiment back in the late 18th century that required the impossible: either too many resources, or resources that were magically able to be in the right place at the right time.

The modern trick to making a Panopticon is to “node” society: if you can force communications through your select “nodes” then you can attempt to watch them. However, if the number of communications is too great, or they go outside of your nodes, then they are outside of your resources, thus not seen/recorded.

The actual reason for the “Methods and Sources” secrecy is fundamentally a lack of resources by the surveilling entity. They attack with “instances” and hope you only defend against old, out-of-date, no-longer-used instances, rather than defend against classes of instances.

Those games with the mobile phones and drug cartels etc are now “known”, thus old. Which means that smarter crooks will now have taken that into their planning. Perhaps the joke of it will be that rather than go “HiTech” they might go “Old School”; there are, after all, quite a few books around telling you how to do things securely in an insecure environment as well as keeping things out of sight. Yes, technology does bring the authorities extra scope and reach, but it does not yet bring them the extra resources they actually need, which exist between an analyst’s ears.

With regards your,

We do not know how long it will take to make quantum computing useful, but progress is rather steady. The quantum communication distance record is currently at 600 km over fiber [1] and over 4,000 km in space [2].

I’ve little or no idea what you are trying to convey but Quantum Computing and Quantum Cryptography have next to nothing in common.

The first is like drawing logic diagrams on paper whilst the second is using a One Time Pad with a twist of physics thrown in.

Similar with,

Any discussion on viable Signal-to-Noise ratios should take a lesson from gravity wave detection. If you can successfully detect gravity waves (as is done), I would not bet a cent against them being able to use noisy qubits.

As I indicated, there are fundamental limits on information transfers to do with the energy per bit of information and its bandwidth. Put simply, you either have to up the energy of the information or reduce the bandwidth, thus reducing the energy of the noise. Reducing the bandwidth also reduces the information you can communicate.

As for summation and noise, arguably “random noise” is not random but the sum of very many weak signals that you see the instant average amplitude of.

You might care to consider that a signal’s energy and instant amplitude are in effect not just unknowns to an observer, they are also in effect unrelated due to phase and frequency error. Thus the instant amplitude of the summation requires you to know a great deal about all the possible components, their frequency, energy, phase and direction at an antenna, to reduce their effects independently.

In effect what you are seeing is the result of a filled knapsack of constantly varying size, and you are trying to work backwards to its contents. Which is the inverse of a harder version of the knapsack problem,

https://en.m.wikipedia.org/wiki/Knapsack_problem

Knapsack problems and algorithms are of interest because it is believed they are not amenable to Quantum Computers,

https://en.m.wikipedia.org/wiki/Knapsack_cryptosystems

If true, it suggests that there are some quite fundamental physical limits on quantum computers that are not being talked about.

Winter July 8, 2022 1:06 PM

@Clive

Depending on who you believe either very little or a great deal has happened in that time.

There were no quantum computations in the 1980s. There are quantum computations using a dozen or so qubits now. From zero to a dozen is a lot or a little, depending on the expectations.

The authorities are not omnipotent or omnipresent and many people especially those we regards as crooks know this, thus play to this.

Several powers are building a comprehensive database of human beings with an online presence. These include at least the USA, China, probably Russia, and most likely a few others. The USA, and probably China and Japan, already archive a large fraction of internet traffic. They are happy to share when it suits them. If you are Chinese, American, Japanese, Tibetan, Uighur, Ukrainian etc., your traffic will likely already be archived.

But the point is, like you should not discuss sensitive matters over the phone, you should not do so online. Not because you know that you are tapped, but because you cannot be sure.

Quantum Computing and Quantum Cryptography have next to nothing in common.

ICT is computation & communication. Quantum Informatics is Quantum Computation and Quantum Communication. Both have to move in lockstep to be useful.

As I indicated there are fundemental limits on information transfers to do with the energy per bit of information and it’s bandwidth.

Look up the SNR for astronomy, e.g., exo-planets and gravitational waves. Physicists regularly push the boundaries. Quantum computing will not be different. Shannon already showed that you can reliably send information over any noise-filled channel. That is basically how the gravitational waves people do it.

I do not buy ‘X is impossible’ unless there are laws of physics or logic broken. As chlorophyll can implement a quantum computation (quantum walk), others will be possible too.

We do not know whether RSA will be cracked. But we neither know it won’t. So I think it would be prudent to prepare for the bad outcome.

Which is the inverse of a harder version of the knapsack problem,

I do not understand what the knapsack problem has to do with this discussion.

vas pup July 8, 2022 2:37 PM

Toyota partners with Israel’s Quantum Machines for quantum computing solutions

https://www.timesofisrael.com/toyota-partners-with-israels-quantum-machines-for-quantum-computing-solutions/

“Founded in 2018 by award-winning quantum electronics experts Dr. Itamar Sivan, Dr. Yonatan Cohen and Dr. Nissim Ofek, Quantum Machines built the Quantum Orchestration Platform (QOP), a hardware and software solution for operating quantum systems to facilitate research and enable future breakthroughs.

!!!!!It also developed the QUA, a standard universal language for quantum computers that the startup says will allow researchers and scientists to write programs for varied quantum computers with one unified code.

“Access to QM’s state-of-the-art quantum control system will enable Toyota Tsusho’s customers to develop in-house quantum computing capabilities. The advantage of QM’s solution is that it covers much of the stack, including both software and hardware. This highly integrated approach will make it far easier for organizations with quantum aspirations to develop fully functioning quantum computers,” added Sivan.

In March, an Israeli team of researchers led by Prof. Roee Ozeri of the Weizmann Institute of Science announced that they have built the country’s first quantum computer, a major feat that has been years in the making.

The device is one of about 30 quantum computers in the world in different stages and one of fewer than 10 that use ion traps, an advanced technology that confines ions (molecules with a net electrical charge) in a small space using magnetic or/and electric fields. Trapped ions can form the basis of quantum bits, or qubits, the basic unit of quantum information.”

SpaceLifeForm July 8, 2022 8:29 PM

@ lurker, Clive, ALL

Learning is Free

You may want to check out this One Time Pad. 8 bytes in hex, it may apply to the text between the brackets above.

https://www.schneier.com/blog/archives/2022/07/nist-announces-first-four-quantum-resistant-cryptographic-algorithms.html/#comment-407341

01202734313d313d

Go here: ‘https://xor.pw/#

Put the text between the brackets into the first form box, make sure you select ASCII. Do not forget that case is important.

Then put the OTP in the second form box, make sure you select Hex.

Then, click Calculate. Make sure the output is selected to be ASCII.

Here is another One Time Pad to try.

1f0c100c6310150d

Interestingly, I did find a minor flaw at the site, but for the purposes of this exercise, it does not matter.

Free Leading Zeroes are included in the tuition.
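The same XOR can also be done locally rather than on a third-party site. A minimal sketch that mirrors what xor.pw does with an ASCII first input and a hex second input (assuming, per the comment linked above, that the text between the brackets is the literal word REDACTED):

    def xor_ascii_with_hex(text: str, pad_hex: str) -> str:
        pad = bytes.fromhex(pad_hex)
        data = text.encode("ascii")
        # XOR byte-for-byte; the pad is reused cyclically if the text is longer.
        out = bytes(b ^ pad[i % len(pad)] for i, b in enumerate(data))
        return out.decode("ascii", errors="replace")

    # First pad from the comment above, applied to the bracketed text.
    print(xor_ascii_with_hex("REDACTED", "01202734313d313d"))
    # Second pad to try.
    print(xor_ascii_with_hex("REDACTED", "1f0c100c6310150d"))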

Clive Robinson July 8, 2022 11:57 PM

@ SpaceLifeForm,

Re : Learning is Free

As,

02372b2322362921
1b2b22282d3d3121
002037342f202020

indicate, all information of that length is equiprobable; that is the strength, and importantly the deniability, of the OTP…

SpaceLifeForm July 9, 2022 4:26 AM

@ Clive

3a2a2924343b372f
312a29312f313121
373d2133203d3621
2224363537232a7b

312a293131312121

?

SpaceLifeForm July 9, 2022 5:38 AM

@ ALL

For those of you wondering what Clive and I are doing, you did not do the homework.

If you did not show up for class, this could or would make no sense.

You can plainly see that various hex codes have appeared more than once. Sure, that will happen. But does any byte have any meaning?

This is a demonstration of the problem of reusing One Time Pads. You can not do that. Which is exactly what we are doing here.

Because, if you can guess the plaintext, you can determine the OTP, and if you keep reusing it, then all future alleged ciphertext is easily decrypted.

You can not reuse a One Time Pad. It’s called One Time for a reason.
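A toy sketch of why reuse is fatal, with made-up messages rather than the pads above: with one pad used for two messages, XORing the ciphertexts cancels the pad, and a guessed crib in one message immediately exposes the other.

    import os

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    pad = os.urandom(16)                    # the "one time" pad...
    p1 = b"attack at dawn!!"
    p2 = b"retreat at dusk!"
    c1, c2 = xor(p1, pad), xor(p2, pad)     # ...used twice

    # The attacker never sees the pad, but c1 XOR c2 == p1 XOR p2.
    mixed = xor(c1, c2)
    assert mixed == xor(p1, p2)

    # Guess (crib) the first plaintext and the second falls out immediately.
    recovered_p2 = xor(mixed, p1)
    print(recovered_p2)                     # b"retreat at dusk!"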

Winter July 9, 2022 6:45 AM

@SLF

You can plainly see that various hex codes have appeared more than once.

Follow Sherlock Holmes [1]. Look for repeating three letter combinations that might be, eg, “the”, but now in the same position in the different texts.

[1] ht-tps://en.m.wikipedia.org/wiki/The_Adventure_of_the_Dancing_Men

Clive Robinson July 9, 2022 8:11 AM

@ Winter, SpaceLifeForm, ALL,

Look for repeating three letter combinations…

You could, but take a look again at,

SL = 1f0c100c6310150d
CR = 002037342f202020

The first CR byte is 00, which means it is the same letter and case as in the (supposedly) unknown SL.

But then you see four 20s, which mean the letters are the same but the case is different.

So five of eight pieces are known in some way, simply due to ASCII structure.

But what does that gain you?

Well, if you assume that those known positions are the same letters in each word, a fast Unix command-line search on a dictionary list will probably give you a very, very short list of matching-pattern words.

But can you learn more from,

37342f

The answer is of course yes.

They are all over 20, which means they are of different cases, most probably lower case in the second word. So a quick conversion by subtracting 20 gives,

17 14 0f

And a conversion to decimal

23 20 15

Gives you a spacing within the alphabet between the corresponding letters in the words.

At which point you have very probably found the two words without being given the hint @SLF gave…

As they used to say in the UK insurance advert,

“Simples”

You could have used similar knowledge to break the other two words, or, having got the third, work quickly back to them.

There are reasons why you should never send “true plaintext” or “probable plaintext” under a One Time Pad, because from a practical perspective you badly break the “equiprobable” rule which is the OTP’s only strength.

Remember it’s not only KeyText that has statistics such as unicity distance, but so does “plaintext” that has not had its “statistics flattened”.

As demonstrated above you do not need to know the KeyText to break OTPs that are not used correctly.

Essentially this is what they did with project VENONA.

SpaceLifeForm July 9, 2022 6:52 PM

@ Winter, Clive, ALL

Equiprobable statistics flattening

If one was to reuse a OTP, you could toss the attacker a Red Herring.

Firstly, you do not want to do short messages, add some garbage noise.

Then Base64 encode it. That moves the bits around. And probably gets identical 3 letter word sequences onto different mod 3 offsets. Obviously, if you use the same 3 letter word repeatedly, then that will not help much.

Then compress it.

Short messages will not compress well, but compression will make the input appear more random.

That moves more bits around.

Then feed that into your OTP.

At this point, the 3 byte common word dictionary attack will be much more difficult.

Yes, it is Security by Obscurity.

Probably does not matter if they can get your Metadata anyway.

But, you can impose a cost even if you reuse a OTP.

Did I mention that you could repeat this process X times?

Does the attacker know what X is?

Does the attacker know what compression algorithm was used during each X step?
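A minimal sketch of that layering, assuming zlib for the compression step; note it is obfuscation on top of a misused pad, not a fix for reuse:

    import base64, os, zlib

    def xor_with_pad(data: bytes, pad: bytes) -> bytes:
        return bytes(b ^ pad[i % len(pad)] for i, b in enumerate(data))

    pad = os.urandom(64)            # a pad being (mis)reused across messages
    message = b"meet at the usual place" + os.urandom(8)   # padded with garbage noise

    # Base64 shifts letter boundaries across byte positions, then compression
    # smears the statistics further before the pad is applied.
    encoded = base64.b64encode(message)
    compressed = zlib.compress(encoded)
    ciphertext = xor_with_pad(compressed, pad)

    # The receiver reverses the steps.
    recovered = base64.b64decode(zlib.decompress(xor_with_pad(ciphertext, pad)))
    assert recovered == message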

Clive Robinson July 10, 2022 4:57 AM

@ SpaceLifeForm, ALL,

Historically the two basic encryption steps were,

1, Substitution
2, Permutation

The idea being with substitution you could change the very basic “known” letters to “unknown” letters. However, as with the Enigma “Plug Board”, if employed in a static way across any body of text its security is minimal, as “counting letter frequency” gives the translation table away very quickly.

Likewise with permutation, the idea being to break up contact frequency, thus changing the sparse bigram and trigram frequencies. Whilst it is harder to come up with a translation table over a body of text, when permutation is used invariantly fairly trivial methods of attack were known.

The idea in the early 20th century was to do simple substitution and permutation as a pair of operations over and over, so that things were not invariant across a body of text and the complexity built up with each application.

In essence this is what you are describing.

However, there is a form of “stream cipher” sometimes called a “book cipher” where the “key stream” is in fact a “text stream” read out from the book from an agreed starting position.

In essence such a book code is not a plaintext mixed with a random keystream like an OTP, but two plaintexts mixed.

This is the same result you get from using an OTP twice, by mixing the two ciphertext streams, the keystream gets nulled out and all you are left with is two mixed plaintexts for which there are known methods of attack.

That said, I know that as recently as back in the early 1980’s, it was felt by some that you could improve the keystream generated by mixing four textstreams together.

Two basic methods were implied…

The first being four entirely separate and unrelated book textstreams, preferably not even in the same language. This had usage issues, needing four agreed start points and a considerably increased risk of error jumping from book to book. But also Operational Security (OpSec) issues: carrying four sufficiently large books around is not “normal behaviour”, even for “traveling salesmen”.

The second was in effect a “lagged Fibonacci Generator” solution, whereby you used a single book textstream and selected four characters at fixed offsets with regards to each other as you incremented the start point.

So using the paragraph above with lag points of 1,2,3,5 we would get the first three keystream characters as the mixes of,

{Thee}
{hesc}
{eseo}

From which it can be seen that there is a “low pass filter” or “integrator” issue. Yes it flattens the statistics, but in reality the change is small.

In fact to see why this can be such a bad idea, there is a fast averaging algorithm, where you only do one subtract, one add to a running count and increment a pointer in a circular buffer.

So with a buffer [ABCDE],

On “New” input,

1, Inc pointer P (to point at A)
2, RunCount = RunCount – P[A]
3, P[A] = New
4, RunCount = RunCount + P[A]
5, Output = RunCount
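Rendered directly in Python, those five steps maintain a running sum over a circular buffer; only one buffer element changes per new input, which is exactly the small-change-per-step issue described next:

    class RunningSum:
        """Circular buffer of the last n inputs plus a running sum over them."""
        def __init__(self, n: int):
            self.buf = [0] * n
            self.pos = 0
            self.total = 0

        def update(self, new: int) -> int:
            self.pos = (self.pos + 1) % len(self.buf)   # 1: advance the pointer
            self.total -= self.buf[self.pos]            # 2: drop the oldest value
            self.buf[self.pos] = new                    # 3: store the new value
            self.total += new                           # 4: add it to the running count
            return self.total                           # 5: output the running count

    rs = RunningSum(5)
    print([rs.update(x) for x in (3, 1, 4, 1, 5, 9, 2, 6)])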

From examination it quickly becomes clear,

1, How to implement a fast low tap count Fibonacci generator.
2, How little things change in each output.
3, Increasing the number of taps does not really change very much (as only one value in the circular buffer changes at any time).

The lessons being that using a Lagged Fibonacci Generator has to be done with care and adding more taps reduces the potential change % at each step to in effect 1/n where n is the number of changing taps.

But what does this have to do with “compression”? Well, rather than look at the absolute output values of the Lagged Fibonacci Generator, look at them as output-to-output differences or “deltas”. It’s not difficult to see that the averaging process reduces the maximum change range for each delta. Thus it acts like a compressor of the sort often found in CODECs and similar.

Which means the alleged entropy in the keystream is reduced, and can be considerably so if the tap points used are not selected with care. The same is true of all compression: it reduces the “key search space” at each step.

So whilst compression is usually “good on plaintext” as it flattens the statistics, it is “bad on keytext” as by flattening the statistics it reduces the key search space.

So… When you reuse a stream cipher key stream it cancels out, leaving two plaintext streams mixed together. If both are effectively compressed plaintext you can open up a whole can of worms you did not know was there because of the lowpass / integrating effect compression has.

SpaceLifeForm July 12, 2022 11:56 PM

No QC hardware required to attack this PQC NIST candidate.

As I said, do not be in a rush to chase the Quantum Crypto Ghost thru the swamp, where you may run into thickets, and bleed badly.

‘https://eprint.iacr.org/2022/904

We apply this attack to fully recover secret keys of SIKE
