Lattice-Based Cryptosystems and Quantum Cryptanalysis

Quantum computers are probably coming, though we don’t know when—and when they arrive, they will, most likely, be able to break our standard public-key cryptography algorithms. In anticipation of this possibility, cryptographers have been working on quantum-resistant public-key algorithms. The National Institute of Standards and Technology (NIST) has been hosting a competition since 2017, and there already are several proposed standards. Most of these are based on lattice problems.

The mathematics of lattice cryptography revolves around combining sets of vectors—that’s the lattice—in a multi-dimensional space. These lattices are filled with multi-dimensional periodicities. The hard problem that’s used in cryptography is to find the shortest periodicity in a large, random-looking lattice. This can be turned into a public-key cryptosystem in a variety of different ways. Research has been ongoing since 1996, and there has been some really great work since then—including many practical public-key algorithms.
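
To make that concrete, here is a toy two-dimensional illustration (a sketch for intuition only; real schemes use lattices in hundreds of dimensions, where this brute-force search is hopeless):

```python
# Toy illustration, not an attack: brute-force the shortest nonzero vector in a
# 2-D lattice spanned by basis vectors b1 and b2. Every lattice point is an
# integer combination x*b1 + y*b2; the "shortest periodicity" is the shortest
# such nonzero vector.
import itertools
import math

b1 = (201, 37)      # arbitrary example basis
b2 = (1648, 297)

shortest = None
for x, y in itertools.product(range(-50, 51), repeat=2):
    if (x, y) == (0, 0):
        continue                      # skip the zero vector
    v = (x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])
    length = math.hypot(*v)
    if shortest is None or length < shortest[1]:
        shortest = (v, length)

print("shortest nonzero vector found:", shortest)
```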

On April 10, Yilei Chen from Tsinghua University in Beijing posted a paper describing a new quantum attack on that shortest-vector lattice problem. It’s a very dense mathematical paper—63 pages long—and my guess is that only a few cryptographers are able to understand all of its details. (I was not one of them.) But the conclusion was pretty devastating, breaking essentially all of the lattice-based fully homomorphic encryption schemes and coming significantly closer to attacks against the recently proposed (and NIST-approved) lattice key-exchange and signature schemes.

However, there was a small but critical mistake in the paper, at the bottom of page 37. It was independently discovered by Hongxun Wu from Berkeley and Thomas Vidick from the Weizmann Institute in Israel eight days later. The attack algorithm in its current form doesn’t work.

This was discussed last week at the Cryptographers’ Panel at the RSA Conference. Adi Shamir, the “S” in RSA and a 2002 recipient of ACM’s A.M. Turing Award, described the result as psychologically significant because it shows that there is still a lot to be discovered about quantum cryptanalysis of lattice-based algorithms. Craig Gentry—inventor of the first fully homomorphic encryption scheme using lattices—was less impressed, basically saying that a nonworking attack doesn’t change anything.

I tend to agree with Shamir. There have been decades of unsuccessful research into breaking lattice-based systems with classical computers; there has been much less research into quantum cryptanalysis. While Chen’s work doesn’t provide a new security bound, it illustrates that there are significant, unexplored research areas in the construction of efficient quantum attacks on lattice-based cryptosystems. These lattices are periodic structures with some hidden periodicities. Finding a different (one-dimensional) hidden periodicity is exactly what enabled Peter Shor to break the RSA algorithm in polynomial time on a quantum computer. There are certainly more results to be discovered. This is the kind of paper that galvanizes research, and I am excited to see what the next couple of years of research will bring.
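
To see the analogy concretely, here is a toy classical sketch of how knowing the period r of f(x) = a^x mod N splits N (the numbers and the brute-force period finding are illustrative only; Shor’s contribution was a quantum circuit that finds r efficiently for large N):

```python
# Toy sketch: period finding breaks factoring. We find the period r of
# f(x) = a^x mod N by brute force, which only works for tiny N; a quantum
# computer finds r efficiently, which is the heart of Shor's algorithm.
import math
import random

N = 3 * 5 * 101          # a small composite standing in for an RSA modulus
while True:
    a = random.randrange(2, N)
    if math.gcd(a, N) != 1:
        continue                     # a lucky gcd already factors N; retry for the illustration
    r, value = 1, a % N              # find the period: smallest r with a^r = 1 (mod N)
    while value != 1:
        value = (value * a) % N
        r += 1
    if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
        p = math.gcd(pow(a, r // 2, N) - 1, N)
        print("nontrivial factor of", N, ":", p)
        break
```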

To be fair, there are lots of difficulties in making any quantum attack work—even in theory.

Breaking lattice-based cryptography with a quantum computer seems to require orders of magnitude more qubits than breaking RSA, because the key size is much larger and processing it requires more quantum storage. Consequently, testing an algorithm like Chen’s is completely infeasible with current technology. However, the error was mathematical in nature and did not require any experimentation. Chen’s algorithm consisted of nine different steps; the first eight prepared a particular quantum state, and the ninth step was supposed to exploit it. The mistake was in step nine; Chen believed that his wave function was periodic when in fact it was not.
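
As a purely classical cartoon of what “periodic” means for a sampled wave function (and nothing more; this is not Chen’s step nine), a genuinely periodic state concentrates its Fourier weight in a few sharp peaks, while an aperiodic one spreads it out:

```python
# Illustrative only: compare the spectra of a periodic and an aperiodic
# complex-valued "state" sampled at 1024 points.
import numpy as np

n = 1024
x = np.arange(n)
states = {
    "periodic": np.exp(2j * np.pi * x / 64),       # exact period of 64 samples
    "aperiodic": np.exp(2j * np.pi * np.sqrt(x)),  # no fixed period
}

for name, psi in states.items():
    spectrum = np.abs(np.fft.fft(psi)) ** 2
    concentration = spectrum.max() / spectrum.sum()   # 1.0 means one sharp peak
    print(f"{name:9s} spectral concentration: {concentration:.3f}")
```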

Should NIST be doing anything differently now in its post–quantum cryptography standardization process? The answer is no. They are doing a great job in selecting new algorithms and should not delay anything because of this new research. And users of cryptography should not delay in implementing the new NIST algorithms.

But imagine how different this essay would be had that mistake not yet been discovered. If anything, this work emphasizes the need for systems to be crypto-agile: to be able to easily swap algorithms in and out as research continues. It also argues for using hybrid cryptography where possible—multiple algorithms, with security resting on the strongest of them—as is already done in TLS.
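
As a rough illustration of what crypto-agility means in code (a minimal sketch with made-up interface names, not any particular library), the key-establishment algorithm lives behind a named registry, so replacing a broken algorithm is a configuration change rather than a rewrite:

```python
# Sketch of an agile key-establishment layer: algorithms are registered by name
# and looked up at run time, so they can be swapped as cryptanalysis advances.
from typing import Callable, Dict, Tuple

KEMS: Dict[str, Tuple[Callable, Callable, Callable]] = {}

def register_kem(name: str, keygen: Callable, encapsulate: Callable, decapsulate: Callable) -> None:
    KEMS[name] = (keygen, encapsulate, decapsulate)

def establish_key(algorithm: str) -> bytes:
    # Migrating to a new algorithm means changing only the `algorithm` string.
    keygen, encapsulate, decapsulate = KEMS[algorithm]
    public_key, secret_key = keygen()
    ciphertext, shared_secret = encapsulate(public_key)
    assert decapsulate(secret_key, ciphertext) == shared_secret
    return shared_secret
```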

And—one last point—hooray for peer review. A researcher proposed a new result, and reviewers quickly found a fatal flaw in the work. Efforts to repair the flaw are ongoing. We complain about peer review a lot, but here it worked exactly the way it was supposed to.

This essay originally appeared in Communications of the ACM.

Posted on May 28, 2024 at 7:09 AM

Comments

From here to the future May 28, 2024 10:18 AM

@ ALL,

“To be fair, there are lots of difficulties in making any quantum attack work—even in theory.”

One is practicalities.

We may end up with a perfectly working algorithm on paper, but not be able to get it to work practically.

As some know, the bottom is kind of dropping out of the Quantum Computer market and money is getting short.

Worse is the price and availability of “cryo-gases” that will work.

In the past the making of nuclear weapons helped, with a suitable gas as a waste product that could be recycled.

Oddly, it is one of the most abundant elements in the universe; the problem is that, due to Earth’s Gravity Well, it rapidly leaves our atmosphere by simply “floating up” into space, where solar radiation and space weather rip it across the vastness of the solar system.

From here to the future May 28, 2024 11:08 AM

@ALL

If we ever do get to the point where Quantum Computers do become sufficiently practical for cryptography, which I’m quite skeptical about happening any time soon (half a century or more as a lower guesstimate), there is another consideration we have to think about.

Which is the “all the eggs in one basket” problem.

As noted, Lattice-based is about the only PQC algorithm family we currently have, and this may not change, especially if NIST finalises and standardises on it.

For some reason I’ve not chased down, there has been and still is talk that hybrid algorithms are “verboten”.

If Lattice becomes susceptible to non-QC attacks, then using it with another, fundamentally different algorithm type will in theory give some resilience.

But I’m of the opinion that current “Key Exchange” (KEX) algorithms are in effect a “dead duck gliding”: we know they are going to hit bottom at some point.

We should start thinking and investigating how to pass Roots of Trust over significant distances in ways that do not involve unproven theories of “One Way Functions With Trapdoors” (OWF-WTD) etc.

The nearest we have currently at a fundamental level is “Quantum Key Distribution” and on Earth it’s really quite range limited.

Which has some interesting engineering let alone security issues.

penelope May 28, 2024 11:13 AM

Breaking lattice-based cryptography with a quantum computer seems to require orders of magnitude more qubits than breaking RSA, because the key size is much larger and processing it requires more quantum storage.

This is not a very satisfying summary, because it’s not obvious how much of this difference is inherent to the algorithms and how much comes down to parameter selection. I recall the key sizes and execution times of “post-quantum RSA” being truly absurd in the suggested configurations (gigabytes and hours); still, I don’t have any specific sense of how the strength of that would compare to the lattice stuff, were their key sizes roughly matched (nor how quantum cryptanalysis might change this math).

Regarding “Should NIST be doing anything differently…?”: concern has repeatedly been expressed in the comments of this blog that NIST isn’t really doing anything to advance the hybrid modes; that, to be FIPS-approved, one may have to use the systems just as documented, even if that rules out hybridization. You suggest using hybrid modes “where possible”, which leaves significant ambiguity. Is it meant to imply that if NIST were to make it effectively impossible in certain circumstances, it would be nothing to worry about?

bcs May 28, 2024 12:58 PM

By “hybrid cryptography”, are we talking about the client and server having a slate of algorithms and agreeing on the single one they consider most secure based on current public knowledge? Or are we talking about a system where forward security is maintained for as long as at least one of the mutually supported algorithms remains secure against each relevant adversary?

We know how to accomplish the second (if you are willing to pay the O(n) cost in compute and bandwidth), but I don’t think TLS currently works that way, so I’m guessing, based on that mention of TLS, that the first is what’s being suggested.

bcs May 28, 2024 1:01 PM

@penelope I wonder what NIST would think of using the NIST-approved systems “just as documented” and then tunneling that inside a non-NIST-approved system that depends on unrelated security assumptions? Or maybe tunneling things the other way round?

Not really anonymous May 28, 2024 1:16 PM

Hybrid systems in this context mean that the key exchange uses nested encryption with a QC-resistant scheme and a scheme believed to be strong when QCs aren’t involved.
The theory is that the risk of the QC-resistant schemes being broken conventionally is high because they are relatively new. If you use nested encryption, you get at least the current level of security and maybe QC resistance.
The NSA is involved with NIST, and its interests do not align well with those of a lot of people. So there is speculation that the NSA is dragging out getting the PQC standards set, weakening the proposed standards, and discouraging hybrid systems so that it can keep access to people’s communications.

penelope May 28, 2024 1:41 PM

bcs, by “hybrid” I mean a combination of a well-understood algorithm such as ECDH, and a quantum-resistant one, such that security is maintained until both are broken. This normally refers just to key exchange; maybe encrypt the randomly generated session key under an ECDH-derived key, then encrypt that ciphertext with CRYSTALS-Kyber using an independent key. Once the session key’s established, a single symmetric cipher such as AES or ChaCha20 (neither being very vulnerable to quantum computers) is used for bulk encryption.
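
For what it’s worth, the hybrids actually being deployed (for example the X25519+Kyber key-exchange experiments in TLS, and OpenSSH’s NTRU Prime+X25519 method) combine at the key-derivation step rather than by nesting public-key encryptions: both shared secrets feed one KDF, so the session key stays safe as long as either input does. A minimal sketch, with random bytes standing in for the two shared secrets:

```python
# Sketch of a hybrid combiner: derive the session key from the classical and the
# post-quantum shared secrets together, so both algorithms must fail to expose it.
import hashlib
import hmac
import os

def hkdf_sha256(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF (RFC 5869) with SHA-256: extract, then expand."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

classical_secret = os.urandom(32)     # stand-in for an X25519/ECDH shared secret
post_quantum_secret = os.urandom(32)  # stand-in for an ML-KEM/Kyber shared secret

session_key = hkdf_sha256(salt=b"hybrid-kex-demo",
                          ikm=classical_secret + post_quantum_secret,
                          info=b"session key")
print(session_key.hex())
```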

My understanding is that “FIPS-compliant cryptographic modules” can only include approved algorithms, operating in approved configurations. For example, Red Hat says about the Linux kernel’s “FIPS mode”: “Furthermore, enforcement of restrictions required in FIPS mode depends on the contents of the /proc/sys/crypto/fips_enabled file. If the file contains 1, RHEL core cryptographic components switch to a mode in which they use only FIPS-approved implementations of cryptographic algorithms.” An OpenSSL page says: “For FIPS usage, it is recommended that the config_diagnostics option is enabled to prevent accidental use of non-FIPS validated algorithms via broken or mistaken configuration.”
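
For what it’s worth, checking that flag is trivial (a minimal sketch; the file simply doesn’t exist on kernels without FIPS support):

```python
# Report whether the kernel-wide FIPS flag mentioned above is set.
from pathlib import Path

flag = Path("/proc/sys/crypto/fips_enabled")
if flag.exists() and flag.read_text().strip() == "1":
    print("Kernel FIPS mode is enabled; expect only FIPS-validated algorithms.")
else:
    print("Kernel FIPS mode is not enabled (or not supported here).")
```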

It might be possible to effect a combination of a FIPS-approved quantum-vulnerable algorithm and a FIPS-approved quantum-resistant algorithm even in “FIPS mode”. However, I don’t know whether that would be acceptable to a FIPS certifier. Pretty much every “early implementer” of the new lattice algorithms is using them in hybrid form (including for TLS), and it’s curious that NIST is perceived as resistant to this idea.

vas pup May 28, 2024 2:55 PM

Reaching absolute zero for quantum computing now much quicker thanks to breakthrough refrigerator design
https://news.yahoo.com/news/tech/reaching-absolute-zero-quantum-computing-110023118.html

“A breakthrough cooling technology could help invigorate quantum computing and slash costly preparation time in key scientific experiments by weeks.

Scientists often need to generate temperatures close to absolute zero for quantum computing and astronomy, among other uses. Known as the “Big Chill,” such temperatures keep the most sensitive electrical instruments free from interference — such as temperature changes. However, the refrigerators used to achieve these temperatures are extremely costly and inefficient.

The researchers published the details of their new machine April 23 in the journal Nature Communications. They claimed using it could save 27 million watts of power per year and reduce global energy consumption by $30 million.

Conventional household fridges work through a process of evaporation and condensation, per Live Science. A refrigerant liquid is pushed through a special low-pressure pipe called an “evaporator coil.”

As it evaporates, it absorbs heat to cool the inside of the fridge and then passes through a compressor that turns it back into a liquid, raising its temperature as it is radiated through the back of the fridge.

To achieve required temperatures, scientists have used pulse tube refrigerators (PTRs) for more than 40 years. PTRs use helium gas in a similar process but with far better absorption of heat and no moving parts.

While effective, it consumes huge amounts of energy, is expensive to run, and takes a long time. However, the NIST researchers also discovered that PTRs are needlessly inefficient and can be greatly improved to reduce cooling times and lower overall cost.

The team found that by adjusting the design of the PTR between the compressor and the refrigerator, helium was used more efficiently. While cooling down, some of it is normally pushed into a relief valve rather than being pushed around the circuit as intended.

Their proposed redesign includes a valve that contracts as the temperature drops to prevent any helium from being wasted in this way. As a result, the NIST team’s modified PTR achieved the Big Chill 1.7 to 3.5 times faster, the scientists said in their paper.

Quantum computers need a similar level of isolation. They use quantum bits, or qubits. Conventional computers store information in bits and encode data with a value of either 1 or 0 and perform calculations in sequence, but qubits occupy a superposition of 1 and 0, thanks to the laws of quantum mechanics, and can be used to process calculations in parallel. Qubits, however, are incredibly sensitive and need to be separated from as much background noise as possible — including the tiny fluctuations of thermal energy.

The researchers said that even more efficient cooling methods could theoretically be achieved in the near future, which could lead to faster innovation in quantum computing space.

The team also said their technology could alternatively be used to achieve extremely cold temperatures in the same time but at a much lower cost, which could benefit the cryogenics industry, cutting costs for non-time-intensive experiments and industrial applications. The scientists are currently working with an industrial partner to release their improved PTR commercially.”

Winter May 28, 2024 3:30 PM

@vas pup

A breakthrough cooling technology could help invigorate quantum computing

I think practical quantum computing will not be cryogenic. It should preferably be room temperature.

There are some developments in this direction:
‘https://www.tomshardware.com/news/world-first-room-temperature-quantum-computer

The room-temperature achievement was unlocked due to Quantum Brilliance’s approach to quantum computing; instead of the more common ion chains, silicon quantum dots, or superconducting transmon qubits, Quantum Brilliance took advantage of specifically implanted nitrogen-vacancy centers in synthetic diamonds (where a carbon atom is replaced by a nitrogen one).

It will have to be seen whether this approach will scale.

vas pup May 28, 2024 4:24 PM

@winter – thank you for your input.
I hope that will work.
Glad to have reasonable folks on this blog.

fib May 28, 2024 8:20 PM

It’s interesting that the current challenges in developing quantum computers, such as the need for ultra-cooling and precise control of qubits, may initially lead to the creation of large centralized structures — the dream of decentralization always receding. This pattern does seem reminiscent of the early days of classical computing, where room-sized machines were the norm before the development of smaller, more accessible devices. History repeats itself.

MrC May 29, 2024 12:45 AM

I’d much rather see the infrastructure investment to make McEliece and SPHINCS+ palatable for TLS-like uses than bet everything on there not eventually being a big break for lattice problems.

Panagiotis Grontas May 29, 2024 1:48 AM

“And—one last point—hooray for peer review”

If I might add: Hooray for open archives and publishing in those.

Winter May 29, 2024 2:03 AM

And—one last point—hooray for peer review.

The news only prints horror stories about peer review. But peer reviewers are volunteers who (mostly anonymously) spend their personal time to check and improve manuscripts for free.

Without peer review, there would be no science.

Contrary to popular lore, most (all?) manuscripts improve markedly from peer review. Also, peer review is the original Comment Moderation that cleans up the publication record at the front door. What happens to information without moderation or peer review can be seen on Xitter and FB.[1]

[1] Both Thumbs Up for our moderator who keeps this place readable!

From here to the future May 29, 2024 2:29 AM

@bcs

“Or are we talking about a system where forward security is maintained for as long as at least one of the mutually supported algorithms remains secure against each relevant adversary?”

Though there are a number of ways you can have multiple algorithms in an encryption system, including splitting them into logical parts and combining randomly, most do not make any real increase in cryptographic strength. And they may even weaken things if errors occur, the odds of which increase with complexity.

I was talking about systems where independent algorithms within their modes are used effectively in a width-equalised chain or similar defined structure, to give as much forward secrecy as possible whilst minimizing complexity by segregation and similar understood techniques.

Unfortunately, as you’ve highlighted, “hybrid” means as many different things to as many marketing people as can dance across hell. It is, after all, their job to differentiate their product from the products of others by as many “unique features” as possible. No matter how dysfunctional those features make the overall system.

Hence the need for “constraint by standards” and “interoperability guides” to limit the number of possibilities of “unique features”

As far as I’m aware this has never really been a NIST consideration. That is they “favour algorithms over frameworks”. And I can see a number of reasons why they might do so.

Amongst others is the “different sides of the pond” issue. Anyone who has had to use both US and European standards for any real length of time has a feel for the different ways the respective standards bodies view things.

In the US there is very much a “One and done” mentality, which in effect drills down to the minimum scope possible, hence an “algorithms view”. In Europe the focus tends much more to the “let’s talk” approach of building interfaces that allow interoperation. Thus, surprisingly to many, the US has a “multiple and disparate” “little picture” view where things are independent and do not interoperate. Europe, on the other hand, tends to the “multiple interoperable” “big picture” view of “systems of parts” rather than just parts.

Part of the reason for this is the US “minimum time to market” view, which in turn begets a view of minimum investment in infrastructure / systems. As an engineering colleague put it a few weeks ago, “Best bolts, worst bridges attitude”.

A less obvious effect of “one and done” thinking is seen within standards. US standards have a tendency to “standardised tests”, whereas Europe tends to “standardised measures”. The US “one and done” outlook gives a single fixed standard that quickly ages as science and technology advance. The European “system of parts” view automatically gives a hierarchy of standards that are updatable as science and technology improve, which gives long-term stability over short-term advantage.

Thus I suspect the US will provide bolts in the form of algorithms that Europe will build bridges with when it comes to actual functioning hybrid crypto systems.

From here to the future May 29, 2024 3:50 AM

@penelope

“Regarding “Should NIST be doing anything differently…?”: concern has repeatedly been expressed in the comments of this blog that NIST isn’t really doing anything to advance the hybrid modes; that, to be FIPS-approved, one may have to use the systems just as documented, even if that rules out hybridization. You suggest using hybrid modes “where possible”, which leaves significant ambiguity. Is it meant to imply that if NIST were to make it effectively impossible in certain circumstances, it would be nothing to worry about?”

Part of the problem is that, from a standards point of view, what we are calling NIST and what we are calling FIPS are two entirely different animals. That is, two species of herbivore, like a sheep and a goat, eating in the same field.

In theory they are equivalent in that they eat grass and do something useful in return for a similar level of care in pastoral environments.

In some cases the “useful” is approximately the same as well, i.e. leather / meat / milk.

Further the husbandry side in pasture land is broadly the same, though goats tend to need “shutting in” at night and more day shelter from sun and rain.

But there are notable differences: whilst sheep give more wool, goats can pull carts and are much easier to train. Further, goats are productive with a greater range of food sources and can be used to “clear land”.

It’s the same with what we are calling NIST and FIPS: they are both useful and productive in quite similar ways, but when you get a little closer you start to see increasing differences in “useful” product.

Unfortunately in this case the differences are way more important than the similarities.

It’s been argued that bringing them closer together would bring efficiencies etc. The real question is what those outside would gain / lose. My feeling, based on having been around a while, is “a lot”.

Thus the answer, which will bring screams from the peanut gallery, is an effective interface between the two that reflects the needs of those outside (but without getting captured by any of the three). As this increases the size of the herd, it is going to get attacked from “little hands thinking”, where it is assumed that smaller must somehow be better than larger… It’s actually the same as the various aspects of the “Toys or Tools” argument we have with “security products”.

echo May 29, 2024 11:03 AM

And—one last point—hooray for peer review. A researcher proposed a new result, and reviewers quickly found a fatal flaw in the work. Efforts to repair the flaw are ongoing. We complain about peer review a lot, but here it worked exactly the way it was supposed to.

Also bear in mind that simply waving paper qualifications around and being technically proficient in your day job isn’t necessarily enough to even be considered for peer review especially in some fields. People flanneling on about maths and gadgets have it easy!

https://www.timeshighereducation.com/news/gender-critical-scholars-claim-discrimination-over-bmj-rejections

Gender-critical scholars claim discrimination over BMJ rejections.

This isn’t even scratching the surface with the amount of bad actors and garbage out there trying to gain the sheen of peer review in a field where if you give one bad actor an inch they will take a mile. You also have to put up with things like, say, philosophers of literature appointing themselves as experts and getting jobs with shady think tanks (glorified lobby groups for dark money), who are then cited by media, which is then used as an excuse by politicians who then meddle with law or professional standards and protocols.

The claims centre on papers submitted to BMJ Open by John Armstrong, a mathematician at King’s College London, and Michael Biggs, a sociologist at the University of Oxford.

I’ve read position papers by domain experts on the census data which address any questions, and they’re fine. There’s no way a mathematician or sociologist coming to the field cold, let alone with an agenda, can even remotely produce a paper on this subject without it being garbage. I’ve read some of the pig’s ear material by people who know nothing about the field or who bring biases to the data. It’s so far wrong it’s not just useless but damaging, especially when amplified in the public policy sphere, which is the end goal of the authors’ dodgy work, i.e. BS looking to be laundered via a peer-reviewed journal.

Dr Armstrong added: “If a journal censors findings because they don’t like the results or they don’t like the author, it has abandoned science. The Cass report [on gender identity services for young people] tells us we urgently need objective evidence on questions of sex and gender, so it is vital that our medical journals reclaim scientific objectivity.”

The Cass Review is not peer reviewed, Cass herself has never worked in the field, its governing panel wouldn’t pass an ethics committee, the methodology was changed halfway through the paper, 98% of existing literature was excluded on arbitrary grounds, and he’s citing it as a gold standard? Anyone, and I mean anyone, citing the Cass Review is asking to be given the Wakefield treatment. There’s like three lines in the entire review which anyone could call correct.

These clowns forget that transgender people can be and are professors and PhD’s themselves across multiple fields (like BMJ internal emails point out transgender people are not thick) and would flatten the authors one atom thick if they got their garbage published and the editor of the BMJ might be looking for another job.

DEI in some fields isn’t just there to protect the cohort or be a bolt on. It’s there to stop clowns like this from making a complete royal fool of themselves and is integral to best practice.

Putting science aside, these clowns know nothing about the historical legal context, including data protection law and workplace discrimination law. From that alone they will be starting off on the wrong footing. Simply understanding that properly helps explain why some of the census data is weird. The fact they didn’t have a clue about even this kind of shows up how these clowns thought they could muscle in tooting their own horns and expected a red carpet to roll out for them because, tah ding, job title. Only it didn’t work out that way, and they got shouty.

Winter May 29, 2024 11:35 AM

@echo

Also bear in mind that simply waving paper qualifications around and being technically proficient in your day job isn’t necessarily enough to even be considered for peer review especially in some fields.

The situation is rather desperate. The number of manuscript submissions outstrips the available peers who can review.

Editors I know cannot find people to review manuscripts, and every scientist I know is swamped by requests to review papers. In the lead-up to conferences, you get half a dozen or more full papers to review. And the turnaround times are shrinking, so you have to respond within a few weeks, or less. All in your free time.

Not every manuscript lands on the desk of a domain expert as there simply are not enough domain experts in the world.

echo May 29, 2024 2:01 PM

The entity is wasting their time attacking me. Every time they attack me that’s less time they have to bother someone else. They also leak information every time they attack. In any case their attacks are counterproductive. If I went down a few people might start asking questions about bad actors they didn’t ask before. It also helps inform my public policy view and that’s when I start writing emails and asking to meet job titles with a view to policy initiatives at which point it becomes somebody else’s problem. I don’t think chummy thought this one through.

@moderator

Handle jumping/Disinformation/mirror propaganda/personal attack on multiple commentators/escalating tone/content potential indicator of motive for flooding blog with attacks/recycling of material deleted on multiple previous occasions.

N.B.

On the surface plausible material posted in other topics with possible sock puppets lending support to give entity social credibility.

After notifying moderator the entity usually attacks again sometimes multiple times and sometimes persists for a period after the topic expires.

https://www.schneier.com/blog/archives/2024/05/lattice-based-cryptosystems-and-quantum-cryptanalysis.html/#comment-437569

bcs May 29, 2024 2:23 PM

@From here to the future

First, my question about hybrid systems is with regard to the sense in which Mr. Schneier used the term in the original post. How you would approach that problem is interesting, but a different question.

Getting an increased cryptographic strength wouldn’t be the goal. The goal would be to guard against algorithmic breaks. Even if the cost to break the system using known-today attacks is the same or is only linearly increased, a system that remains secure even if a total break of any single component is found would be useful and as far as I can tell is already possible.

IIRC there isn’t much if anything in the way of evidence that symmetric cryptography will be broken post-quantum, so it would seem that key distribution and authentication are the primary concerns. Both of those seem trivially hybridizable in ways that could be robustly built on existing primitives and without introducing complex new classes of vulnerability:

Running N non-interacting parallel key agreement/distribution systems and then feeding them into Shamir’s secret sharing (or some other combiner that offers provable information-theoretic security) should be possible as a minimal extension using existing primitives. As long as any one of the parallel systems is secure, a break of the others is of zero value. Even a broken implementation of any of the component systems will almost always render the aggregate system as “loudly” unusable (rather than silently insecure).
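
A minimal sketch of the simplest such combiner (plain XOR, which is N-of-N secret sharing; the inputs below are random stand-ins for the outputs of the parallel key agreements). In practice you would still pass the result through a KDF, but the information-theoretic property is already there: every component secret must leak before the combined key does.

```python
# XOR-combine N independently agreed secrets: an information-theoretically
# secure N-of-N combiner. Fewer than all N secrets reveal nothing about the result.
import os
from functools import reduce

def combine(secrets):
    assert len({len(s) for s in secrets}) == 1, "secrets must be equal length"
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), secrets)

# Stand-ins for, say, ECDH, ML-KEM, and Classic McEliece outputs.
parallel_secrets = [os.urandom(32) for _ in range(3)]
print(combine(parallel_secrets).hex())
```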

Similarly for authentication, simply distributing multiple parallel signatures in existing certs (or distributing multiple singly signed cert chains for the same entity) would seem to be robust in the face of known breaks. Unpublished zero-days would be a potential issue, as an impostor would only need to provide cert chains they are able to forge, but the party being asked to accept the certs has the option of requiring multiple verification. (Of note, a near-identical modification would improve robustness around root CAs: evicting bad CAs is much less of a problem if most users are already publishing chains to more than one root. One of these days I’m going to have to see what happens in practice if you do that using existing TLS clients.) This would however open some potential policy vulnerabilities but, depending on how you define those, they might be provably impossible to avoid.
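
And a correspondingly small sketch of the “require every chain to verify” policy on the authentication side (the verifier callables are placeholders for, say, an ECDSA check and an ML-DSA check; nothing here is a real certificate API):

```python
# Accept a peer only if every parallel signature verifies under its own
# algorithm, so forging any single scheme (or one CA) is not enough.
from typing import Callable, Sequence

def accept(message: bytes,
           signatures: Sequence[bytes],
           verifiers: Sequence[Callable[[bytes, bytes], bool]]) -> bool:
    return (len(signatures) == len(verifiers)
            and all(verify(message, sig) for verify, sig in zip(verifiers, signatures)))
```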

Winter May 29, 2024 3:23 PM

@lurker

Room Temperature QC = Cold Fusion ?

Maybe? But liquid nitrogen would do (−196 °C).

Diamond is a very remarkable material that can be superconductive (under hellish conditions) and doped to be a semiconductor. Reports of nitrogen-“doped” diamond quantum dots have been doing the rounds for a long time (PDF of book):
‘https://arxiv.org/pdf/1504.05990

echo May 29, 2024 7:26 PM

Ah well. They were told. Back to simple moderator notifications it is then.

@moderator

Handle hopping/Personal attack in attempt to goad personal private information/Personal attack on multiple commentators/disinformation/attempted geolocation and or attempting doxxing/libelous accusations of criminal conduct/possible use of AI or similar tool to recycle scraped data from comments/unnecessary repetition of previously deleted content/contempt of court of third party proceedings/bossing moderator about.

https://www.schneier.com/blog/archives/2024/05/lattice-based-cryptosystems-and-quantum-cryptanalysis.html/#comment-437589

Who? May 30, 2024 7:00 AM

@ penelope

Drop the idea of using a FIPS-approved technology. It is for bureaucrats, not for organizations seriously looking into protecting their assets. If you want protection against quantum computing, better to listen to technical staff, not to politicians.

Technical matters require technical answers.

OpenSSH has used a hybrid NTRU Prime + X25519 kex method (formally “sntrup761x25519-sha512@openssh.com”) by default since 2022, as protection against quantum attacks while preserving the good features of a well-proven classical algorithm, just in case NTRU Prime is found to be vulnerable.
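
For anyone who wants to pin that choice explicitly rather than rely on the defaults, something like the following client configuration should work on a reasonably recent OpenSSH (check man ssh_config on your system; the second entry is just a classical fallback):

```
# ~/.ssh/config -- prefer the hybrid post-quantum key exchange where servers support it
Host *
    KexAlgorithms sntrup761x25519-sha512@openssh.com,curve25519-sha256
```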

FIPS-compliant modules deliberately introduce weaknesses just as a matter of compliance with rules written by bureaucrats.

That someone working at an intelligence agency or in the business of cybersecurity uses Microsoft or Apple products is against common sense, seriously. Same about using Linux in these environments.

A lawyer will not stop an attack against a critical infrastructure, maths will do.

Don’t make me choose between certification and security when we are a target for an attack. The former may lose. But, hey, others may choose differently.

Who? May 30, 2024 7:18 AM

Sometimes a simple answer to a problem is better.

Quantum computers may, or may not, be able to break lattice-based cryptosystems in the future. Who knows? But we know for sure they are about as bad as their classical counterparts at guessing. So the simplest answer is using a symmetric, non-exchanged key.

If you can afford it, use a random symmetric key for a connection and share it out-of-band, as WireGuard does as an additional protection layer against quantum computing.
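
Concretely, WireGuard’s optional preshared key is a 256-bit symmetric key generated and distributed out-of-band and mixed into the handshake, so even a future break of Curve25519 would not by itself expose the traffic. A sketch of the relevant configuration (angle brackets are placeholders):

```
# Generate the key out-of-band and copy it to both peers:
#   wg genpsk > tunnel.psk
#
# Then, in each peer's WireGuard configuration, [Peer] section:
[Peer]
PublicKey = <the other peer's public key>
PresharedKey = <contents of tunnel.psk>
AllowedIPs = <the other peer's tunnel address>/32
```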

And, of course, do a wider research while we have time. Do not restrict your efforts to lattice-based algorithms. Work on other quantum-resistant technologies just in case lattice-based cryptosystems fail.

Who? May 30, 2024 7:24 AM

@ Bruce Schneier

Should NIST be doing anything differently now in its post–quantum cryptography standardization process? The answer is no. They are doing a great job in selecting new algorithms and should not delay anything because of this new research. And users of cryptography should not delay in implementing the new NIST algorithms.

Technical committees have been contaminated by three-letter agencies for decades. Just consider what happened to Dual_EC_DRBG. I think there is nothing more to say about the reliability of NIST.

penelope May 30, 2024 3:49 PM

@ Who?,

Drop the idea of using a FIPS-approved technology. It is for bureaucrats, not for organizations seriously looking into protecting their assets.

Of course. It’s never my idea to be FIPS-compliant, but I can’t stand up to the bureaucracy alone, and it does tend to spread. For example, if governments insist on using FIPS-compliant servers, web browsers will have to support the FIPS-approved algorithms and modes to communicate with them. I don’t want the extra attack surface, but I probably will want to access those sites from time to time (NIST, for example, to get cipher specifications—even if I’ll use them in stronger modes—and NCSTA reports). And the companies that employ programmers will demand FIPS support to sell into the lucrative government market (possibly indirectly); I could just say no to that, but not everyone is lucky enough to be in that position.

echo May 31, 2024 6:03 AM

@Who?

That someone working at an intelligence agency or in the business of cybersecurity uses Microsoft or Apple products is against common sense, seriously. Same about using Linux in these environments.

A lawyer will not stop an attack against a critical infrastructure, maths will do.

Don’t make me choose between certification and security when we are a target for an attack. The former may lose. But, hey, others may choose differently.

This is why I keep going on about a security model which includes technical, human rights and equality, economic, and social dimensions. It’s a thing.

Duff constitutional law flows into duff market regulation, which flows into duff human rights law, which flows into commissioned implementation, and that requires a lawyer. On these fronts, NIST protocols and guidance are legal Swiss cheese. The more people focus solely on the technical issues, the more they miss the real problems causing it all.

I know people don’t want to hear it, but until this is accepted we’ll be here in another ten years moaning about the same problems and asking why they aren’t being fixed.
