Microsoft Is Adding New Cryptography Algorithms

Microsoft is updating SymCrypt, its core cryptographic library, with new quantum-secure algorithms. Microsoft’s details are here. From a news article:

The first new algorithm Microsoft added to SymCrypt is called ML-KEM. Previously known as CRYSTALS-Kyber, ML-KEM is one of three post-quantum standards formalized last month by the National Institute of Standards and Technology (NIST). The KEM in the new name is short for key encapsulation. KEMs can be used by two parties to negotiate a shared secret over a public channel. Shared secrets generated by a KEM can then be used with symmetric-key cryptographic operations, which aren’t vulnerable to Shor’s algorithm when the keys are of a sufficient size.
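To make that concrete, here is a minimal sketch of the KEM data flow; `keygen`, `encaps`, and `decaps` are hypothetical stand-ins for whatever API a real ML-KEM library exposes, and the point is the message flow, not the lattice math.

```python
# Sketch of a generic KEM handshake (hypothetical API, not SymCrypt's).
def kem_handshake(keygen, encaps, decaps):
    # Alice generates a key pair and publishes the encapsulation key.
    ek, dk = keygen()
    # Bob uses the public key to produce a ciphertext plus his copy
    # of the shared secret.
    ct, shared_bob = encaps(ek)
    # Alice decapsulates the ciphertext to recover the same secret.
    shared_alice = decaps(dk, ct)
    assert shared_alice == shared_bob
    return shared_alice  # feed into AES-GCM or another symmetric cipher
```

Only the encapsulation key and the ciphertext cross the public channel; the shared secret itself is never transmitted.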

The ML in the ML-KEM name refers to Module Learning with Errors, a problem that can’t be cracked with Shor’s algorithm. As explained here, this problem is based on a “core computational assumption of lattice-based cryptography which offers an interesting trade-off between guaranteed security and concrete efficiency.”
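For intuition, the plain LWE problem (Module-LWE is a structured variant) hands the attacker a random matrix $A$ and the vector $b = As + e \pmod{q}$, where $e$ is a small random error vector, and asks for the secret $s$. Without the error term this is trivial linear algebra; with it, no efficient classical or quantum algorithm is known. The module structure replaces matrix entries with polynomial-ring elements, shrinking keys and speeding up arithmetic while, it is believed, keeping the problem hard.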

ML-KEM, which is formally known as FIPS 203, specifies three parameter sets of varying security strength denoted as ML-KEM-512, ML-KEM-768, and ML-KEM-1024. The stronger the parameter, the more computational resources are required.
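For scale, the encodings FIPS 203 specifies (the shared secret is 32 bytes in all three sets):

ML-KEM-512 (security category 1): encapsulation key 800 bytes, ciphertext 768 bytes
ML-KEM-768 (security category 3): encapsulation key 1,184 bytes, ciphertext 1,088 bytes
ML-KEM-1024 (security category 5): encapsulation key 1,568 bytes, ciphertext 1,568 bytes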

The other algorithm added to SymCrypt is the NIST-recommended XMSS. Short for eXtended Merkle Signature Scheme, it’s based on “stateful hash-based signature schemes.” These algorithms are useful in very specific contexts such as firmware signing, but are not suitable for more general uses.
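A sketch of why “stateful” is the operative word: each one-time key in such a scheme may sign exactly once, so the signer must durably persist an index across signatures. The class below is illustrative only, and `_one_time_sign` is a hypothetical stand-in for the real hash-based one-time signature step.

```python
class StatefulSigner:
    """Illustration of the state-management burden in XMSS-style schemes."""

    def __init__(self, num_one_time_keys: int):
        self.capacity = num_one_time_keys
        self.next_index = 0  # MUST survive crashes, restores, and clones

    def sign(self, message: bytes) -> tuple[int, bytes]:
        if self.next_index >= self.capacity:
            raise RuntimeError("one-time key pool exhausted")
        idx = self.next_index
        # Persist the incremented counter BEFORE releasing the signature:
        # signing twice with the same index leaks enough to allow forgery.
        self.next_index += 1
        return idx, self._one_time_sign(idx, message)

    def _one_time_sign(self, idx: int, message: bytes) -> bytes:
        raise NotImplementedError("stand-in for the real one-time signature")
```

A restored backup or VM snapshot that rolls the counter back is catastrophic, which is why these schemes fit controlled environments like firmware-signing HSMs rather than general-purpose use.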

Posted on September 12, 2024 at 11:42 AM

Comments

Clive Robinson September 13, 2024 3:32 PM

@ ALL,

The PQC “Key Encapsulation Mechanisms”(KEMs) are still a “work in progress” as far as NIST is concerned, with the standard so far specifying only the “Module-Lattice-Based Key-Encapsulation Mechanism”(ML-KEM).

But importantly, it is not exactly efficient for anything other than large or long messages. That is, sending the average EMail is going to feel the strain. As for SMS style text messages…

As Microsoft Principal Product Manager Lead Aabha Thipsay notes,

“PQC algorithms… come with some trade-offs. For example, these typically require larger key sizes, longer computation times, and more bandwidth than classical algorithms.”

Have a think on that, as Aabha goes on,

“Therefore, implementing PQC in real-world applications requires careful optimization and integration with existing systems and standards.”

What is not said is that the PQC algorithms involve considerable complexity and are “not real-life tested” in a hostile environment like the Internet…

Ask yourself “What could go wrong?”

Well, think back: we had this problem with AES…

That is, AES was theoretically secure, but practically… no, not really. The implementations “up on the contest site” were quickly proven to be very insecure in practice, due to side-channel issues that had not been considered during the competition (even now there are AES implementations “in use” that leak worse than Henry’s bucket, the one he asked Lisa to help make good).

The fact is, the security of these PQC algorithms is still “untested theoretical” at best and nowhere near close to “battle proven”.

But what we do know is that the PQC algorithms are at best “inefficient”, which is never a good sign. That is why quite a few are looking at the fairly rapidly improving “Quantum Crypto” and the likes of satellites for “Quantum Key Distribution”(QKD).

This Microsoft implementation is going to be an

“All the eggs in one basket”

solution, which also makes it a “Single Point of Failure” or “attack” or “backdoor”…

It’s also why some have been saying “Hybrid Systems” should be the way to go, but… what has been suggested so far is not what others think the way to go should be.

To say things are a mess right now is a bit of an understatement.

From my perspective, I’m going to assume that PQC is “probably insecure” and look at other ways to get good “privacy” for where it might be needed.

Where “privacy” is not really required, like most “On-Line” activities, I’d be less worried about someone actually making a Quantum Computer than about a “cods-up” in the implementation of PQC in practice.

Some will agree with me, others will not, but from a certain “business perspective” the risks attached to PQC are currently too high. Especially when compared to other methods based around existing symmetric crypto and traditional (i.e. non-asymmetric) KeyMat distribution systems.

nobody special September 14, 2024 6:16 PM

https://en.globes.co.il/en/article-photonic-quantum-computer-co-quantum-source-raises-50m-1001488820

Quantum Source is developing a quantum computer, which is a computer that utilizes natural chemical processes of light particles to produce computing operations at a rate and power millions or billions times greater than supercomputers – which are themselves considered extremely powerful computers.

The company says that Quantum computing is a fundamental paradigm shift in computing, with the potential to dramatically accelerate technological advancements in drug design, material development, cybersecurity, and more.

Large-scale, fault-tolerant quantum computers, with millions of qubits, are critical to unlocking the potential of quantum computing but have yet to reach commercial viability. Photonic quantum computing technology offers the best route to commercialization; however, the biggest obstacle has been the massive inefficiency in creating entangled photonic states, a challenge that Quantum Source addresses.

tricia September 15, 2024 12:37 PM

Some will agree with me, others will not, but from a certain “business perspective” the risks attached to PQC are currently too high.

It can be used in ways that reduce overall risk, without adding much new risk. sntrup761x25519-sha512 in openssh is probably one such case; it should have as much security as x25519 alone, provided there’s no dumb mistake—like sharing key material between the algorithms, or having a buffer overflow in sntrup (has anyone formally proven their absence?).
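The secret-combining step is simple enough to sketch; the two inputs are assumed to come from independent sntrup761 and X25519 exchanges, and OpenSSH’s actual key-schedule hashing is more involved than this:

```python
import hashlib

def hybrid_secret(ss_sntrup761: bytes, ss_x25519: bytes) -> bytes:
    # An attacker must recover BOTH inputs to learn the output, so the
    # hybrid should be no weaker than x25519 alone even if sntrup761 falls.
    return hashlib.sha512(ss_sntrup761 + ss_x25519).digest()
```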

I think software signing is a case where this should be done soon. Key and signature sizes in tens of kilobytes don’t matter, when the software’s so much larger than that. Signing speed isn’t important. It’d be foolish to trust a brand-new algorithm, but trusting the combination of that and a proven algorithm is fine.
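The AND-composition is just as simple in outline; the verifier callables below are stand-ins for, say, an Ed25519 implementation and a new PQ scheme:

```python
from typing import Callable

Verifier = Callable[[bytes, bytes], bool]  # (message, signature) -> valid?

def verify_dual(message: bytes,
                sig_classical: bytes, verify_classical: Verifier,
                sig_pq: bytes, verify_pq: Verifier) -> bool:
    # Accept only if BOTH schemes verify: forging a release then requires
    # breaking the proven algorithm and the new one simultaneously.
    return (verify_classical(message, sig_classical)
            and verify_pq(message, sig_pq))
```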

Of course, if the company’s signing key is checked into Git (or worse, posted on GitHub—it’s happened), or is regularly shared between programmers, or the program itself is insecure, the choice of signing algorithms shouldn’t be the priority.

lurker September 15, 2024 8:03 PM

@tricia
“Key and signature sizes in tens of kilobytes don’t matter, when the software’s so much larger than that.”

Key and signature sizes in tens of kilobytes do matter, when the message is so much smaller than that. It’s not just about process efficiency: the smaller the message, the greater the risk to keymat.

tricia September 16, 2024 10:47 AM

lurker, sure, their applicability to “offline” signing of large files doesn’t generalize. But that’s a good place to start.

In some cases, including live chat protocols and SSH, the cost could be worked around by negotiating symmetric cipher and HMAC keys. That might even work for web TLS—if connection setup speed can be optimized, and key material doesn’t leak via timing—given that some publishers think nothing of making everyone download megabyte-plus pages.

By contrast, in DNSSEC, where even RSA signatures are inconveniently large, these new algorithms will be completely untenable without a major re-design—for example, using some type of tiered signature system with only the top level signed per se, and the lower levels just hashed. And even if people figure something out, it can take upward of a decade to deploy DNS protocol revisions. Probably there are a lot of cases like this.
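To spell out the kind of tiering I mean (pure speculation, not any proposed DNSSEC revision): hash the lower levels, bind the hashes into one digest, and spend the single large PQ signature only at the top.

```python
import hashlib

def tiered_digest(records: list[bytes]) -> bytes:
    # Hash every lower-level record, then bind the hashes into a single
    # top-level digest. Only this digest gets the oversized signature;
    # a verifier needs the record, the other leaf hashes, and one
    # signature, instead of one large signature per record.
    leaf_hashes = [hashlib.sha256(r).digest() for r in records]
    return hashlib.sha256(b"".join(leaf_hashes)).digest()
```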

Clive Robinson September 17, 2024 7:40 AM

@ tricia, lurker,

Re : Signing patches and the like.

Yes, software patches from MS and similar can be abusively large (forcing Win10 downloads was not MS’s finest hour).

But many other patches can be quite small, especially those that are sent out as “source code diffs”.

There was a time, however, when SMS texts were 160 chars (140 bytes) tops, and the online equivalent limited to the payload of a single network packet.

Much “human communication” is in effect snippets. I’ve had a reputation for being quite wordy, but I don’t think I’ve pushed beyond a few hundred words here very often.

And these PQ Keys, as you note,

“Key and signature sizes in tens of kilobytes…”

are in effect coming up to large-pamphlet or small-booklet size (60 chars by 30 lines per page).

That is going to add quite a margin to EMail and similar communications and storage. As for CPU cycles and the resultant carbon footprint…

But also consider “mobile computing”: think of how much extra battery is going to be needed in smart devices just to keep the same usage people have got used to.

I’ve lived through two loads of this in the past: firstly with the start of “Secure Sockets” that made e-commerce practical, and then “HTTPS Everywhere” that dragged much of the rest of the internet into a semi-secure mode.

History shows that even though both were technically a success, neither really gave the increase in privacy needed at the time or later.

All commonly used on-line “negotiation protocols” are subject to “MITM” or “fall back” attacks, and it’s hard to see a properly secure way around this without a “shared secret” and a “secure second channel” to transfer it between parties.

It’s in part why some are investing so many resources in “Quantum Key Distribution”(QKD) as a sort of first step in building a different foundation not built on the shifting sands of mathematics.

And unlike “Quantum Computing”(QC) QKD appears to be at least moving forwards in a practical and usable sense.

Muhammad Naveed Khurshid September 18, 2024 2:54 AM

I was wondering if, as per the law/Patriot Act/other rules, the NSA already knows how to break these algorithms. My only concern is that if the NSA can break it, then the bad guys/state actors/others can break it too. So if that is the case, then what is the point of having these new algorithms?

tricia September 18, 2024 7:46 PM

Muhammad, there are no known public US laws or rules that would require NIST to only approve NSA-breakable algorithms, nor would it be easy to sneak such things through a competition like this—a lot of cryptographers commented on the algorithms, and those found to be flawed were abandoned or fixed.

So, any adverse actions on the NSA’s part are likely to be much more subtle. Side channels and buffer overflows in the implementations, perhaps. The lack of a standardized hybrid mode. Maybe protocol flaws, particularly if people re-design protocols to work around the large key and signature sizes. One conspiracy theory about IPsec is that the NSA tried to over-complicate it, such that people trying to use it would eventually give up and stick with cleartext.

But you’re probably over-estimating the spies. There was a time when their mathematical knowledge and computing power were considered far ahead of the public’s—for example, they knew about differential cryptanalysis about 15 years before its public discovery, and modified DES to protect against it. I think most cryptographers believe those days to be over; they’re ahead, but not nearly that far, and probably don’t have some secret solution to Learning With Errors. Also see Bruce’s post “The NSA is Not Made of Magic”.

Clive Robinson September 19, 2024 7:46 AM

@ tricia, Muhammad,

Re : NSA is not made of magic or money.

You note,

“But you’re probably over-estimating the spies. There was a time when their mathematical knowledge and computing power were considered far ahead of the public’s”

The prosaic reasons were an apparent oddity in themselves,

1, Academia had little or no interest in those areas of mathematics.
2, The only “in field jobs” were as lowly paid civil servants ticking away your life to a reasonable middle class pension.

Then the likes of finance houses and Silicon Valley woke up to the use of certain types of mathematics –and physics– for high-frequency trading and extracting value from collected data.

They were offering starting salaries five or more times civil service salaries, and way more for the “brightest and best”.

Thus a career path outside of the NSA and the like opened up to Math PhDs.

Then, as society changed, academia started offering other career paths, as did Start-Ups.

And the “spook houses” discovered they had competition they could not match for the brightest and best, whom they used to “grind through the system” because those people had nowhere else to go.

Are the spook houses still getting applicants? Yes, but in many cases they are not what would be described as “top drawer”.

The old “We can tell you things that you can not learn anywhere else” etc. really does not work as a recruiting draw the way it used to.

For most who “have real skills”, there are now better alternatives, some way, way better, and a while ago that was seen by some as “existential to the spooks”. Obviously it was not, but there are still voices complaining about “the way it was, but now is not” for them.

Also the USG has not done itself any favours over one or two people who took the lid off various dirty laundry baskets of the USG. And various legislators of certain colours made real fools of themselves, making it all worse.

So crappy pay and conditions in a government agency cesspit of secrecy and pettifogging bureaucracy are for some reason no longer as attractive as they once were, especially as the “Hobson’s Choice” for candidates has in effect gone.

It’s why some US Agencies we know have re-enacted the old “go to prison or join the army” option.

Raven9 September 21, 2024 6:14 AM

The current state of publicly available encryption is bullshit. Encrypted files contain headers that say “Hey tyrannical authorities! I am data encrypted with X encryption algorithm, and I belong to the owner of this device, so just threaten them with 10 yrs in prison and they will spill the beans!”
That is what I mean by bullshit encryption, and it is pretty much what happens to activists and journalists like Sarah Wilkinson when they get arrested by tyrannical government authorities under some bogus application of anti-terrorist law.

So with the current state of fascist governments masquerading as democracies, we need perfectly deniable encryption. Undetectable, deniable encryption. For example, an app that encrypts data and then applies that encrypted data to an image file, blending it with the image in such a way that the result is still an ordinary image with no new identifying headers or metadata. Only by applying the correct password can that image be decrypted.

Some years ago I did this as a proof of concept: I converted the encrypted data to an image. It looked like noise. That image could be made transparent or added to another image like a filter, so it is perfectly deniable, and in fact you could apply two or more sets of encrypted data to the same image using different passwords for each.
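For what it’s worth, the least-significant-bit flavour of that proof of concept fits in a few lines with Pillow. This sketch assumes the payload is already encrypted (so it is indistinguishable from noise) and only hides it; note that naive LSB embedding like this is detectable by statistical tests even though the output looks identical to the cover image.

```python
from PIL import Image  # Pillow

def embed_lsb(cover_path: str, ciphertext: bytes, out_path: str) -> None:
    img = Image.open(cover_path).convert("RGB")
    pixels = list(img.getdata())
    # Prefix a 4-byte length so an extractor knows where the payload ends.
    payload = len(ciphertext).to_bytes(4, "big") + ciphertext
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels) * 3:
        raise ValueError("cover image too small for payload")
    it = iter(bits)
    stego = []
    for r, g, b in pixels:
        # Overwrite each channel's least-significant bit with a payload
        # bit; once the payload runs out, keep the original bit.
        r = (r & ~1) | next(it, r & 1)
        g = (g & ~1) | next(it, g & 1)
        b = (b & ~1) | next(it, b & 1)
        stego.append((r, g, b))
    img.putdata(stego)
    img.save(out_path, "PNG")  # must be lossless, or the bits are destroyed
```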
