Post-Quantum RSA

Interesting research on a version of RSA that is secure against a quantum computer:

Post-quantum RSA

Daniel J. Bernstein, Nadia Heninger, Paul Lou, and Luke Valenta

Abstract: This paper proposes RSA parameters for which (1) key generation, encryption, decryption, signing, and verification are feasible on today's computers while (2) all known attacks are infeasible, even assuming highly scalable quantum computers. As part of the performance analysis, this paper introduces a new algorithm to generate a batch of primes. As part of the attack analysis, this paper introduces a new quantum factorization algorithm that is often much faster than Shor's algorithm and much faster than pre-quantum factorization algorithms. Initial pqRSA implementation results are provided.

Posted on May 31, 2017 at 6:31 AM • 63 Comments

Comments

Who? • May 31, 2017 7:12 AM

I haven't read the paper yet, just did a fast look at it, so I may be wrong here. Will read the full document later.

It seems that using carefully chosen RSA parameters we can have quantum-resistant RSA cryptography on our current hardware. So it should be possible, let us say, to generate a pair of RSA certificates on a computer that implements these parameters and move them to a smart card (obviously smart cards themselves will not generate quantum-computing-resistant certificates yet).

Will a firmware upgrade on the readers add this feature to hardware?

Well, at least libraries like LibreSSL should implement the new prime generation algorithms.

Who? • May 31, 2017 8:33 AM

@ MrC

Indeed, I stopped reading the article when I found that "[their] main result is successful generation of a 1-terabyte exponent-3 RSA key consisting of 4096-bit primes," and "the final multiplication of two 512 GB integers took 176,223 seconds in wall-clock time, using 3.166 TB of RAM and 2.5 TB of swap storage." Nonsense. This is not only impossible to store on smart cards, it is out of the scope of most computing systems. I do not want to copy a one-terabyte key to my ~/.ssh/authorized_keys file, let alone multiple keys.

Sad; I was expecting something like a method to generate an attack-resistant Diffie-Hellman moduli file. This paper is something I would never have expected from these authors; it is certainly not their best research.

Another piece of research targeted at the "fake news media" (I am sorry for the pun), not a serious scientific publication.

MrC • May 31, 2017 8:43 AM

@Who?

Indeed, this seems to be pretty far into "if your key is that big, why not just use a one-time pad?" territory...

Andrew • May 31, 2017 8:57 AM

Not a crypto expert, but until a new algorithm is found (or a new approach, like QKD), huge storage, huge memory or longer processing time may be the only way to protect against quantum processing power.

Other • May 31, 2017 9:06 AM

    Our batch prime-generation algorithm suggests that, to help reduce energy consumption and protect the environment, all users of RSA, including users of traditional pre-quantum RSA, should delegate their key-generation computations to NIST or another trusted third party.

The article does not contain any publishing date. In the PDF file's properties, the creation date is 2017 April 19. I guess that the extra "9" is a typo.

Clive Robinson • May 31, 2017 9:17 AM

@ All,

I started reading the paper, but stopped when I got to page 4, section "Future Work", 2nd paragraph, and read,

    Our Batch prime-generation algorithm suggests that ... users ... Should delegate their key-generation computations to NIST...

I do not know about anybody else, but I don't trust anyone else to make my keys. I use a couple of sources of entropy, a TRNG I designed, built and tested, and a physical generator using a number of dice. These are used to generate large quantities of random bits that are both filtered and de-biased prior to a mixing process. The output is then tested by software I wrote and tested, then used to search for primes with an algorithm I designed, wrote and tested, and the results then go into further tests...

I'm sorry, but NIST is untrustworthy because their standards process is flawed security-wise, and this appears to have been abused on more than one occasion by the NSA's "finessing"...

Would other people take seriously the suggestion that NIST, or any other third party that a Government has influence over, could be trusted?

The simple answer is no, we fought the Key-Escrow battle some years ago, and the reasons then have not changed for the better. In fact they have changed for the worse, a lot worse.

Clive Robinson • May 31, 2017 9:21 AM

@ Other,

We appear to have hit the same stumbling point at about the same time...

Clive Robinson • May 31, 2017 9:46 AM

@ MrC,

Indeed, this seems to be pretty far into "if your key is that big, why not just use a one-time pad?" territory...

Because in some respects the OTP is very inferior to PubKey.

For instance, the OTP, like all current symmetric cryptography, requires a second out-of-band channel to securely transfer Key Material (KeyMat), otherwise it offers no security. Worse, unlike other forms of symmetric crypto, the OTP KeyMat needs to be transferred in advance and at a size sufficient for all envisioned communications in any given KeyMat transfer window. Further, the OTP KeyMat is not reusable, unlike many symmetric crypto systems, which gives rise to real problems for Key Generation (KeyGen). All of which means that there are a lot of Key Management (KeyMan) issues, both physical and logistic, with OTP systems that rule it out for most, but not all, crypto work.
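The KeyMat constraints described above can be seen in a minimal sketch of the pad itself (illustrative Python only, not any particular fielded system):

```python
import secrets

def otp_xor(data: bytes, pad: bytes) -> bytes:
    # XOR against the pad; the same call both encrypts and decrypts.
    assert len(pad) >= len(data), "pad must cover the whole message"
    return bytes(a ^ b for a, b in zip(data, pad))

message = b"attack at dawn"
# This much KeyMat must be moved out of band, in advance, per message sent.
pad = secrets.token_bytes(len(message))

ciphertext = otp_xor(message, pad)
assert otp_xor(ciphertext, pad) == message  # XOR is its own inverse
# The pad is now spent: reusing it across messages destroys the security guarantee.
```

Note that the pad must be as long as all traffic it will ever protect, which is exactly the KeyGen and KeyMan burden described above.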

The advantage of PubKey is that it allows secure in-band KeyMat transfer, thus it does not require the highly problematic secure second channel for secure communications to be established. Further, KeyGen for the symmetric crypto usually used behind it has considerably fewer problems and can be done as and when required, on demand.

The major issue with Internet security is that it really will not work with a "secure second channel", because the only way you can securely transfer KeyMat for symmetric encryption boils down to a face2face transfer in a secure location. You would have to do this for every secure communications channel you wished to establish, and that is in no way practical.

So like it or not, we are stuck with PubKey, because the maths behind it is far more trustworthy than any third party or out-of-band communications channel.

Dirk Praet • May 31, 2017 9:52 AM

@MrC, @ Who

"a 1-terabyte public key"?!

It will probably fit on a thumb drive or in an expansion slot in your head in a couple of years from now.

@ Clive, @ Other

    Our batch prime-generation algorithm suggests that ... all users of RSA, including users of traditional pre-quantum RSA, should delegate their key-generation computations to NIST or another trusted third party.

Err, no. NIST can go covfefe itself.

MrC • May 31, 2017 10:27 AM

@ Clive

How exactly do you plan to do an in-band transfer of a 1TB file? That would take my ISP days to deliver, at which point they'd probably terminate my service. Maybe someone somewhere owns a wire that can handle that in a reasonable timeframe, but for mere mortals like me, transmitting a 1TB file is so massively inconvenient that it's just faster and easier to hand deliver a hard drive instead. At which point, since I'm already forced to do a face-to-face meeting, why not just exchange a symmetric key? The one-time pad bit was slight hyperbole -- my point being that exhausting a TB of pad would take months if not years for most purposes. I'm not seriously advocating that anyone start using 1TB OTPs; rather I'm pointing out that the practical problems with a 1TB public key probably exceed the practical problems with OTP.
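The "days" estimate is easy to check with back-of-envelope arithmetic; the link speeds below are assumptions for illustration:

```python
# Rough transfer times for a 1 TB public key at a few assumed link speeds.
KEY_BITS = 1e12 * 8  # 1 terabyte expressed in bits

for label, bits_per_second in [("10 Mbit/s", 10e6), ("100 Mbit/s", 100e6), ("1 Gbit/s", 1e9)]:
    seconds = KEY_BITS / bits_per_second
    print(f"{label}: {seconds / 3600:.1f} hours ({seconds / 86400:.2f} days)")
```

At 100 Mbit/s the transfer takes roughly 22 hours; at 10 Mbit/s it is over nine days, which matches MrC's point.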

Anon • May 31, 2017 11:01 AM

I sort of fail to see the point of this.
The problem with RSA is that factoring is polynomial time with quantum computing, and this does nothing to change it. Sure, it ups the exponent on the polynomial a little from my cursory understanding, but it's still polynomial.
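To give a sense of what "still polynomial" means, one can compare the usual heuristic cost formulas: classical GNFS factoring is sub-exponential in the modulus size, while Shor's algorithm is commonly estimated at on the order of n^3 gate operations. The constants below are illustrative textbook assumptions, not figures from the paper:

```python
import math

def gnfs_log2_cost(n_bits: int) -> float:
    # Heuristic GNFS complexity L_N[1/3, (64/9)^(1/3)] for N ~ 2^n_bits,
    # returned as a base-2 exponent so huge inputs do not overflow.
    ln_n = n_bits * math.log(2)
    return (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(2)

def shor_log2_cost(n_bits: int) -> float:
    # Common textbook estimate: Shor needs on the order of n^3 gate operations.
    return 3 * math.log2(n_bits)

for n in (1024, 2048, 4096):
    print(f"{n}-bit modulus: GNFS ~2^{gnfs_log2_cost(n):.0f} ops, Shor ~2^{shor_log2_cost(n):.0f} gates")
```

The classical exponent grows like n^(1/3); the quantum one only like log n, which is exactly why merely enlarging the modulus buys so little against Shor.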

As for the suggestion that we leave key generation to NIST, the same jokers responsible for the backdoor in Dual_EC_DRBG, is completely laughable.

And of course, this says nothing of the fact that if their key sizes are such that a single multiply takes more than 2 days on a cluster of computers, even the "simple" operations like modular exponentiation used for encrypting/decrypting will take months.

The only safety this system provides is against anyone ever using it.

Sandro • May 31, 2017 11:02 AM

Nice shot Bruce, now you have to confess:
did you want to check how much we pay attention to your posts?

Perhaps the paper could be interesting as speculation, but if we have understood correctly, a 1-terabyte exponent-3 RSA key consisting of 4096-bit primes, multiplication of two 512 GB integers and so on don't fit well into our (secure signature creation) devices ;-)

Milo M. • May 31, 2017 12:26 PM

"Post-quantum RSA is not what one would call lightweight cryptography: the cost of each new encryption or decryption is on the scale of $1 of computer time, many orders of magnitude more expensive than pre-quantum RSA."

ATM fees are gonna wipe out our bank accounts.

Anura • May 31, 2017 12:36 PM

A 1-terabyte key is capable of fitting on a hard drive or being downloaded over the internet. This has practical applications; it's just not a general-purpose replacement for RSA. For highly secure systems that need to communicate over the internet, where you need to be certain the key isn't compromised in transit, this is potentially very useful. Systems that might currently rely on a very large one-time pad, for example, might benefit from switching to this key exchange.

Anon • May 31, 2017 12:43 PM

Just one more note about threat model.

When quantum computers that can factor RSA keys are eventually created, the only people likely to have one in the near term are nation states (due to the cost and difficulty of building one). Thus, the threat any post-quantum algorithm needs to secure against is nation-state-level attack; anything else can be adequately secured by existing pre-quantum algorithms.

Thus a post quantum algorithm that outsources its key generation to the government is like outsourcing design and security of the hen house to the foxes.

Me • May 31, 2017 1:53 PM

@Anon

I agree, but consider that those able to afford this will be those needing to protect themselves from nation states. I think the implication is that those people will be in our government.

Jesse Thompson • May 31, 2017 3:04 PM

Just to be sure, don't we already have post-quantum algorithms more easily handled by ordinary computers today? SIDH for example.

In light of this I see projects like "What would it take to make RSA post-quantum?" more like academic explorations than any kind of serious search for a usable cryptosystem (such as one that might recommend involving NIST in *anything*?!)

@Me • May 31, 2017 1:53 PM

I'm really not even kidding: having NIST make your keys for you is not even appropriate *for government agencies*. Just imagine trying to get your fieldwork done without getting killed at the CIA, but NIST is sharing all of your key material with your foreign target just because cheeto thought that would be funny.

Yeah, no.

Nick P • May 31, 2017 3:14 PM

@ All

"until a new algorithm is found (or a new approach, like QKD) a huge storage, huge memory or longer processing time may be the only way to protect against quantum processing power." (Andrew)

"It will probably fit on a thumb drive or in an expansion slot in your head in a couple of years from now." (Dirk)

This is why I figure he posted it and why I sent it. Everyone should remember that there are only a few post-quantum methods available. They haven't had the peer review RSA has had. One already has ridiculous key sizes. Another is patented, with a version failing to classical attacks. Any of them might fail in the future.

So, we have work like this going on. The situation with asymmetric crypto is Poe's Law: you get solutions that might be a necessary evil or just a joke. Hard to tell, as bad as things are. Worst case, we go OTP-like as another commenter suggested, where we transfer the key material on encrypted HDs periodically. Since this is cumbersome, the public will opt for trusted third parties like they do now for free services, but maybe with HSMs. NSA does that in EKMS, btw.

ab praeceptis • May 31, 2017 3:58 PM

Clive Robinson

I started reading the paper, but stopped when I got to page 4, section "Future Work", 2nd paragraph, and read,

Our Batch prime-generation algorithm suggests that ... users ... Should delegate their key-generation computations to NIST...

Exactly. Same with me, plus: nist? Seriously? After nist's collusion with nsa in tainting crypto, nist not only isn't trustworthy but is to be considered an enemy.

Moreover, TB keys (and the time needed) are evidently not practically usable.

On the other hand, I understand the authors in that they look from an academic perspective, and as such their work *does* make sense. Moreover, the part about *considerably* faster (than Shor) algorithms is interesting and of importance.

jer • May 31, 2017 4:12 PM

Given the stakes, current computing capabilities, no feasible attack angles, and adding a projection of five to ten or even twenty years, I would say this is how you go about hedging your bets. Most entities don't need that, but the intended users are not most entities.

Thoth • May 31, 2017 5:15 PM

@Clive Robinson, ab praeceptis

Can we still trust these authors, especially DJB, after making such a weird publication and the bad suggestion of "trusting NIST"?

Or maybe DJB is just trolling us all?

Do we need a DJB golden sticker?

Anura • May 31, 2017 5:33 PM

Well, it does say "Make RSA Great Again", and that means making sure your keys are used by Real Americans™ and not terrorists.

call girl • May 31, 2017 5:52 PM

The post-quantum speculation is highly premature. We don't even know what "quantum computation" is supposed to do or if it's any more achievable than the "artificial intelligence" that is always 50 years away.

No one seems to take the trouble to clarify whether we are talking about Geordie Rose's "quantum annealing" or one of the various proposed quantum logic gates, or whether any of these gates are achievable in practice.

We do not have clear details on "quantum error correction" other than that it is absolutely essential to any possible realizable quantum computation of non-trivial size.

There is a more subtle philosophical issue underlying this, which concerns the so-called "collapse of the wave function", which in reality is not a collapse but an irreversible dissipation of computational information into waste heat, as described by Landauer:

kB T ln 2

where kB is Boltzmann's constant, which might be more familiar to chemists as kB=R/NA, the ratio of the gram-molar ideal gas constant R to the gram-molar Avogadro's number NA, and T is the absolute temperature of the waste heat reservoir.

Thus to convert entropy from information-theoretic units such as bits to a more familiar engineering unit such as Joules/Kelvin, we may multiply by

kB ln 2

as long as we have Boltzmann's constant kB in the desired units, and the natural logarithm of the base of the information-theoretic unit.
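Plugging numbers into that conversion gives the familiar Landauer limit; a quick sketch, with 300 K as an assumed room-temperature reservoir:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact in the 2019 SI)

def landauer_limit_joules(temp_kelvin: float) -> float:
    # Minimum energy dissipated when one bit of information is irreversibly erased.
    return K_B * temp_kelvin * math.log(2)

# At room temperature (300 K) this is about 2.87e-21 J per erased bit.
print(landauer_limit_joules(300.0))
```

Tiny per bit, but it is a hard floor that scales with every irreversible operation, which is why it matters for the thermodynamics of any very large computation.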

Thoth • May 31, 2017 6:02 PM

@all

The published pqRSA paper seems to be a troll paper. Notice the keyword tag "Make RSA Great Again" under the abstract. I guess they have nothing better to do than troll around with some weird pqRSA. I guess we can take the paper just for the laugh and nothing too serious.

A dedicated pqRSA golden sticker might look pretty.

Clive Robinson • May 31, 2017 6:18 PM

@ MrC,

How exactly do you plan to do an in-band transfer of a 1TB file?

I don't, but that is not the point.

The size of the file is time-relative; currently a 4Kbit key size is considered adequate and will remain so for some time to come.

The problem security-wise is not the PubKey maths, but the way we currently use PubKey.

If we accept that at some point 4Kbit Pk will be broken, then we have to accept, due to "collect it all", that any plaintext, including symmetric keys, sent under it will become available to the NSA et al.

Even increasing the Pk size will not feasibly resolve this problem.

There are two known solutions to this problem.

The first is to only send information that has a short temporal period. Most financial transactions on the Internet use a Credit Card; however, within six months the records of the sale are a "matter of record" and available to the NSA et al anyway as "business records". The solution in this case is to also give Internet Credit Cards a very short temporal life, preferably only a single transaction; we already know ways to pursue this that Quantum Computing is not likely to affect.

The second known way is to use a shared secret and amplify it in a way such that each party can derive a key for a symmetric algorithm, with neither being sufficiently affected by Quantum Computing. As an overly simple idea, for explanation only, each side could send a random number that they then transform via an asymmetric algorithm keyed by the shared secret. The problem is obviously establishing the shared secret.
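A hedged sketch of that "amplify a shared secret" idea, not the exact construction described above: both parties mix public random nonces with the pre-shared secret through a keyed hash, so each derives the same symmetric key while an eavesdropper who sees only the nonces learns nothing useful.

```python
import hashlib
import hmac
import secrets

def derive_session_key(shared_secret: bytes, nonce_a: bytes, nonce_b: bytes) -> bytes:
    # Mix both parties' fresh public nonces with the long-term shared secret.
    # Without the secret, the nonces alone do not reveal the session key.
    return hmac.new(shared_secret, nonce_a + nonce_b, hashlib.sha256).digest()

secret = secrets.token_bytes(32)   # established out of band, once
nonce_a = secrets.token_bytes(16)  # sent in the clear by party A
nonce_b = secrets.token_bytes(16)  # sent in the clear by party B

key_a = derive_session_key(secret, nonce_a, nonce_b)
key_b = derive_session_key(secret, nonce_a, nonce_b)
assert key_a == key_b  # both parties arrive at the same symmetric session key
```

Fresh nonces give a new session key per conversation from the one pre-shared secret, which is the "amplification"; its quantum resistance rests on the hash, not on factoring or discrete logs.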

There may, however, be other currently unknown ways to perform the same functions as current Pk that are Quantum Computer proof for the next century to millennium, but without the excessive key size.

But there may be another way: Quantum Key Distribution, if the distance, switching and one or two other problems can be overcome, may render the current need for Pk or other Quantum Computing sensitive methods irrelevant. For instance, by enabling a shared secret to be securely transferred easily without the two parties having to meet geographically.

Making both a possibility is, after all, what research is about.

But I suspect making those two possible may actually be easier, or achieved before Quantum Computing can be scaled up sufficiently to be of use against Pk of more moderate key size than 1 TB.

call girl • May 31, 2017 6:25 PM

@Thoth

The published pqRSA paper seems to be a troll paper.

I have that concern as well about the work of people so apparently full of themselves. These papers are coming out of a back office somewhere with the force of gunfire.

There's a frat house. Drugs, sex, and alcohol all lie. Guns generally do not.

However, I must add that mathematical truth does not proceed out of the barrel of a gun.

The math needs to be proven so that ordinary folks can be convinced of its truth.

Something is not right about this, and I am concerned about the freedom of others to criticize this work — if it is anything but a troll paper. We seem to have lost a critical mass of bona fide academic research somewhere about the time college tuition began to exceed prudent levels of debt.

We need to get real about this. If you can't work your way through college debt-free with a summer job like your parents or grand-parents could, your education is way too expensive and you aren't getting what you paid for. Profs should definitely not be funding their own "research" with students' tuition monies, especially when the students are not participating in such jealously guarded research.

Anura • May 31, 2017 6:35 PM

Trolling aside, a centralized Diffie-Hellman parameter generator might not be a bad idea. Use something like SHAKE256 as a DRBG to generate Diffie-Hellman parameters from a counter and test them using a verifiable process. The servers would then post the counter values along with the parameters online, signing each set as they find valid ones.

You can have dozens or more servers cranking out parameters, and then other systems can regularly get new ones for use in an ephemeral Diffie-Hellman key exchange. This allows you to mitigate the issue of reusing Diffie-Hellman parameters, while also ensuring the safety of the parameters and minimizing overhead. Then we can have a secure ephemeral key exchange without the concerns of using ECC.
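A toy sketch of the verifiable process described above might look like the following. The 64-bit size, the counter encoding, and the safe-prime criterion are illustrative assumptions; real DH parameters would be 2048 bits or more, and a deployment would have to pin down the exact derivation format.

```python
import hashlib

SMALL_PRIMES = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)

def is_probable_prime(n: int) -> bool:
    # Miller-Rabin with the first 12 prime bases; this choice of bases is
    # deterministic for n < ~3.3e24, far beyond this demo's toy sizes.
    if n < 2:
        return False
    for p in SMALL_PRIMES:
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for a in SMALL_PRIMES:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def candidate_from_counter(counter: int, bits: int) -> int:
    # Derive a candidate deterministically from the counter via SHAKE256,
    # so anyone can re-run the derivation to verify a published parameter.
    raw = hashlib.shake_256(counter.to_bytes(8, "big")).digest(bits // 8)
    return int.from_bytes(raw, "big") | (1 << (bits - 1)) | 1  # top bit set, odd

def find_safe_prime(bits: int):
    # Walk counters until the derived candidate p is a safe prime (p = 2q + 1).
    counter = 0
    while True:
        p = candidate_from_counter(counter, bits)
        if is_probable_prime(p) and is_probable_prime((p - 1) // 2):
            return counter, p
        counter += 1

counter, p = find_safe_prime(64)  # toy size for speed
assert candidate_from_counter(counter, 64) == p  # anyone can re-derive and re-check
```

The point is that the server publishes only `(counter, p)`, and any client can cheaply re-derive the candidate and re-run the primality tests, so no trust in the generating server is required.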

Werner Almesberger • May 31, 2017 6:52 PM

The NIST reference alone makes it pretty clear that this is not serious.

As far as I understand the state of the art, quantum computers of suitable size should be able to defeat any RSA. Thus anything that resists would be a breakthrough, even if it required a ridiculously large key. So that's the one thing I actually wouldn't consider a red flag :-)

More: Make RSA Great Again. Reference [1].

Why ? No idea. "Because we can" sometimes is plenty.

- Werner

Thoth • May 31, 2017 7:06 PM

@all

Despite the paper being almost certainly a troll paper, they are not proposing anything too far out of the way, except for the requirement of very large computing capability due to the 1 TB of exponent required.

As noted in the paper, a 4096-bit modulus can still be used, whereas what they are asking for is 1 TB of exponent.

It is known that currently most RSA operations use a fixed 3-byte exponent of 0x010001. If a fixed 1 TB exponent is used and the private modulus is kept secret, it is still not as bad.

We can simply assume a fixed exponent, then only need to generate the 4096-bit private and public moduli and send the public modulus, which might turn out to be rather efficient, as we don't have to consider the exponent if using a fixed one.

To avoid storing 1 TB worth of exponent, we can simply use 0x0100......01 (we have been using 0x010001 as the exponent for ages): fill a large enough region of RAM with 0x010000..01, do the BigInteger math, and get the results, hopefully without needing to spend too much time and money on RAM.

All in all, the troll algorithm isn't too impossible to implement :) .
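For reference, the fixed exponent mentioned above is how conventional pre-quantum RSA is typically used; a toy-scale sketch (not the paper's pqRSA, and with demo-sized primes only):

```python
# Toy conventional RSA with the common fixed public exponent 0x010001.
e = 0x010001              # 65537
p, q = 1009, 1013         # demo primes only; real keys use 1024-bit+ primes
n = p * q
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)       # private exponent via modular inverse (Python 3.8+)

m = 424242                # message, must be < n
c = pow(m, e, n)          # encrypt with the fixed public exponent
assert pow(c, d, n) == m  # decrypting with d recovers the message
```

With the exponent fixed, a published key really is just the modulus, which is the economy the comment above is pointing at.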

Clive Robinson • May 31, 2017 7:31 PM

@ ab praeceptis, Thoth,

nist? Seriously? After nist's collusion with nsa in tainting crypto, nist not only isn't trustworthy but is to be considered an enemy.

Which naturally gives rise to,

Can we still trust these authors, especially DJB, after making such a weird publication and bad suggestion of "trusting NIST" ?

I think we can all agree various NIST standards committees got owned by the NSA, and we can also probably agree that it was the way the standards committees were formed that allowed the capture to happen.

To be honest, all the standards organisations I've had any dealings with get owned/captured by non-benign interests, be they commercial or governmental agencies.

As I've explained before it's the "Health and safety" version of the political "Think of the children". If you are sociopathic enough to frame your hidden interests behind "Health and Safety" you will outflank ordinary people who have either consciences or a fear of future consequences. It is the reason why we are still wasting trillions on the various "Wars on xxx".

So whilst I can understand intellectually why NIST or any other standards body could get "owned", and especially as I warned about the NSA et al "owning standards" long before that, I still feel a sense of betrayal even though it was altogether inevitable.

The thing about trust is there are two ways of looking at it, the social way and the technical way. Humans as social animals tend to trust implicitly and then get betrayed. On the technical side you assume that nothing is trustworthy until you have either verified it as such or have mitigated it if it is not.

The reality of modern systems is that there is no practical way to verify they are trustworthy. It was coming to this realisation last century that made me start down the road to the "mitigation approach" that gave rise to Castles-v-Prisons.

Thus my viewpoint on NIST is that the safe way of behaving is to take the technical approach. That is, to assume by default that they are not trustworthy, and therefore verify, or better still mitigate.

Which brings us back to the second question, about the authors of the paper. I will have to read through the paper a couple of times before I make my mind up, but it is odd behaviour.

Why say NIST when saying "a trusted third party" would have been more than sufficient?

Maybe one or more of the authors are trying to curry favour with NIST or some other US Gov agency.

Whatever the reason, I would find it hard to believe that any of the authors are completely unaware of community feelings about NIST.

Clive Robinson • May 31, 2017 7:58 PM

@ Dirk Praet,

Err, no. NIST can go covfefe itself.

Is it an anagram, is it an acronym? No, it's a Trumpism...

On the anagram front, could it be V-coffee, or is the Doh-gnarled trying to rework the "We for coffee" joke?

Clive Robinson • May 31, 2017 8:21 PM

@ Call Girl,

Profs should definitely not be funding their own "research" with students' tuition monies, especially when the students are not participating in such jealously guarded research.

That's not what is happening to students' tuition fees. In fact, the spending on Uni staff is such that some tutors qualify for food stamps and other assistance.

Apparently a number of US Unis have in effect become investment houses or hedge funds... Hence certain people at the head of the Uni administration are actually taking the tuition fees and gambling with them, to pay themselves very, very excessive incomes...

ab praeceptis • May 31, 2017 8:48 PM

Thoth, Clive Robinson, call girl (et al)

I see two main aspects re. nist and djb.

- there are different groups such as academia, nist/nsa/..., large corps and gov'ments, and us, the security people (keep in mind that our perspective can be quite different from that of cryptologists).

djb and his colleagues, being in academia, are in one way or another (as professionals) living and working within a construct called academia, which is certainly not "free" and which is in major part financed by state and large-corp money. Moreover, *the* major driving, and paying, forces in the field are again states/agencies and large corps.

Which leads to a simple and tough situation: at the least you don't attack them (in a professional capacity), and if you want your work to continue being financed and accepted, you are pretty much bound to look halfway neutral.

Furthermore, we all know that something without a nist/fips/eal/etc stamp won't be accepted by state agencies and large corps.

*We* security people are mainly interested in the net result ("security"), plus being mistrusting is part of our healthy professional attitude. A cryptologist, however, is usually mainly a mathematician and academic. Both his definition of and measure for security are often quite different from ours (e.g. attacks being of no less complexity than brute force).

It is hence much less strange for a cryptologist to use nist as an example of a "neutral" and widely accepted third party (also with access to considerable computing resources) than it seems to us.

In other words: When *we* judge (as I among others did) we naturally do that from our perspective and the result is pretty much bound to be a clear "nist? No!".

If we, however, want to judge someone like djb then we must apply the norms and usances of *his* field!

Another aspect, more important for us anyway, when asking "can we still trust djb (or Bruce Schneier, or ...)?" is that we do not even need to trust them much: a) we ourselves can examine the results of their work, the algorithms, and b) quite commonly their colleagues rigorously examine each other's work.

So, as far as I'm concerned: No, no golden sticker for Chacha, 25519, etc. I *do* trust those algorithms with good reasons (and btw. I also trust djb and based on solid grounds).

I suggest not boiling that paper down to "nist" but studying it. Many years of experience tell me that a djb/Lange paper shouldn't be ignored.

--------

I found call girl's thoughts and remarks interesting. I myself also don't expect quantum computers (not 4-qubit experiments but actually usable systems) anytime soon. In fact, I would not even bet more than a pizza on quantum computing anyway, for diverse reasons, some of which call girl also mentioned.

However: as security people we must assume the worst. About half a century of "don't worry, that won't be a problem" premises has brought us into the ugly situation we're in now. So, again: let us assume the worst-case scenario, maybe quantum computers, maybe aliens landing, maybe who knows what.

Which leads me back to djb's paper. Forget that (from our perspective) unhealthy "nist" remark and let's look at the beef.

PK still pretty much means algorithms based either on factorization or on its cousin, the discrete log (for ECC). Now, make a little, and frightening, jump in your minds: if djb and colleagues have found some non-pq Shor competitor, then we must assume that nsa/gchq and accomplices have that, too.

djb just took the big step to the conclusion, namely: we have to assume that rsa 4K isn't good enough anymore. But how far do we want to go? Is 8K good enough? 128K? That pretty much boils down to: how much time will ever-bigger rsa buy us?

From what I see, djb asks the right question in driving it to the point of assuming that sooner or later we will be at the point we currently label as "pq" (post-quantum). We must understand that quantum computing is, in a way, just the only known realistic-looking incarnation of a problem that, as djb demonstrates, may well find other incarnations, some of which, btw, will evolve out of nsa's ever-rising computing-power obsession.

I also found call girl's thoughts interesting, as they touch a major *real* boundary for nsa: energy and the related factors (e.g. cooling). We should understand that this means no less than that some major leap in quite different fields (such as cooling live silicon, for instance) might create the situation, the threat, that we currently paint and imagine as "quantum computing".

In other words: it is not quantum computing that is threatening us at the horizon. QC, again, is but one incarnation of the real problem, the monster behind it: massively increased computing power, no matter how (and in the wrong hands -> nsa and accomplices).

pq-secure crypto must be an urgent target, and I agree with djb that it's no longer good enough to step from 512 to 1K to 2K to 4K rsa. We must prepare for the monster because it will come. Whether that happens in the QC incarnation or in another one is not the relevant factor. But QC is what we always had in mind as the image of the monster, and it serves the purpose well.

And please, let us also learn to be humble as scientists and engineers! Not only because djb's paper is humbling, but also because arrogance and ignorance have led us to where we are. Let us be humble and let us be mistrusting. Both, in my mind's eye, are sine qua nons for good security people.

Andrew • May 31, 2017 10:50 PM

To be honest, a 1 TB key is not really a terrible idea if you have to set up secure communication, as in diplomacy, for example. It may take a few hours, maybe days, to transfer it, but then you have a secure quantum-resistant channel.
On the other hand, there are quantum processors out there and the progress is really fast. I think today's supercomputers can do the job too; RSA is far from being safe.

Some reading:
http://www.nature.com/news/ibm-s-quantum-cloud-computer-goes-commercial-1.21585

keiner • June 1, 2017 2:46 AM

"This work was supported by the Commission of
the European Communities through the Horizon 2020 program under project number 645622 (PQCRYPTO) and project number 645421 (ECRYPT-CSA); by the Netherlands Organisation for Scientific Research (NWO) under grant 639.073.005;"

Sorry, but I can hardly believe it...

Cassandra • June 1, 2017 2:54 AM

I wonder if QC-resistant encryption might be developed from undecidable problems? Current methods involve doing things that are mathematically 'hard', like factoring the product of large primes, or extracting discrete logarithms: hard because there is no known algorithm for doing what is needed quickly. Undecidable problems are formally not susceptible to algorithmic solutions at all.

The MRDP theorem (Matiyasevich, Robinson, Davis, Putnam) proves that a general algorithm for solving all Diophantine equations cannot exist, and Zhi Wei Sun proved that the solvability of a Diophantine equation is undecidable if it has a minimum of eleven integer variables.

It is trivial to determine if a set of 11 integers is a solution to a particular Diophantine equation - but given a particular Diophantine equation with 11 variables, there is no general algorithm for solving it, and you can't tell if it can be solved or not.

So, you create a 'public key' from an ordered set of Diophantine equations, some of which have solutions, and some of which don't (or, at least, you don't know if they have or not). You have solutions to some of them, which you keep private - and you have enough equations with solutions in the set so that you can use Shamir's Secret Sharing algorithm with a threshold to construct a decryption key.

An attacker, who does not have any solutions has two problems: (1) No general approach for solving the equations and (2) no method of determining which can be solved and which cannot.

I'm not proposing this as an encryption method. I am neither a mathematician nor a cryptographer, either of whom could be reading this and laughing their socks off. It depends on mathematical undecidability (in the Gödel/Turing sense) applying to quantum physics and quantum computing. I just would not be surprised to find undecidability playing a role in quantum-computing resistant cryptography.

Note that undecidable problems are not especially rare (the halting problem is one):

https://en.wikipedia.org/wiki/List_of_undecidable_problems

The nice thing about Diophantine equations is that you are dealing with integers, and solutions are trivial to verify, which are nice properties to have.
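As a tiny Python sketch of that asymmetry (the equation and candidate point below are invented purely for illustration): verifying a claimed solution is a single evaluation, regardless of how hard it was to find.

```python
def satisfies(eq, point):
    """A point solves a Diophantine equation exactly when the
    polynomial evaluates to zero at integer arguments."""
    return eq(*point) == 0

# Toy equation: x^2 + y^2 - 25 = 0. Checking the candidate (3, 4) is
# instant, even though no general algorithm can decide solvability
# for arbitrary Diophantine equations (MRDP theorem).
pythagorean = lambda x, y: x * x + y * y - 25
print(satisfies(pythagorean, (3, 4)))   # True
print(satisfies(pythagorean, (3, 5)))   # False
```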

An expert in Quantum Computing might be along to tell me I'm wrong. If so, I'd be grateful for the opportunity to learn. Some background may be found here: http://philsci-archive.pitt.edu/3180/1/Quantum_Hype.pdf

Cassie

Clive RobinsonJune 1, 2017 3:00 AM

@ ab praeceptis, and the usual suspects,

Another aspect - more important for us anyway - when asking "can we still trust djb (or Bruce Schneier, or ...)?" is that we do not even need to trust them much: a) we ourselves can examine the results of their work, the algorithms, and b) quite commonly their colleagues rigorously examine each other's work.

As I said above,

    Thus my view point on NIST is that the safe way of behaving is to take the technical approach. That is to assume as default they are not trustworthy, therefore verify or better still mitigate.

At the end of the day, "mitigate before trust" is the best solution, with "verify before trust" second best.

I see no reason why this viewpoint should not also apply to other entities; in fact I would recommend it as "the best MO" to follow currently. To see why, look at it this way: mitigation, whilst an increased expense, is even in the short term going to be less expensive than verification.

I realised this way back last century whilst designing "Intrinsically Safe" (IS) "Industrial Control Systems" (ICS). My thinking on it quite some years ago led me to say of NIST that, instead of having algorithm competitions, they should be looking at frameworks where components such as algorithms could be efficiently replaced. Thus any embedded or other system could be effectively patched when algorithms or their implementation become suspect.
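As a toy illustration of that replaceable-component idea (the registry and names here are hypothetical, not any real standard or framework): records carry the identifier of the algorithm that protected them, so retiring a suspect algorithm becomes a registry change plus a re-protection pass rather than a redesign.

```python
import hashlib

# Hypothetical registry: algorithm identifiers map to implementations.
# A firmware or library update can retire an entry or add a successor
# without touching the code that calls digest().
HASHES = {
    "sha256": lambda data: hashlib.sha256(data).hexdigest(),
    "sha3-256": lambda data: hashlib.sha3_256(data).hexdigest(),
}

def digest(name, data):
    """Return (algorithm-id, digest). Storing the id alongside the
    record lets old data be found and re-protected when its
    algorithm is retired from the registry."""
    if name not in HASHES:
        raise ValueError("unknown or retired algorithm: " + name)
    return name, HASHES[name](data)

tag, d = digest("sha3-256", b"meter reading 42")
```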

My reasoning was then, as now, that a small increased cost during manufacture to meet such a standard would easily save costs and landfill space in the likes of smart meters, and potentially lives in people with implanted medical electronics. The problem I saw was that, unless regulated in some way, manufacturers would always go for "short term profit" beyond the point of recklessness (race-to-the-bottom free-market behaviours). Something we have now seen with multiple IoT devices and at least one manufacturer of implanted medical devices.

However, I'm also aware that other people think differently and focus on other aspects, but come up with broadly similar reasoning and answers:

https://www.imperialviolet.org/2016/05/16/agility.html

TreblaJune 1, 2017 3:42 AM

@Andrew, if you are able to transfer a 1 terabyte key you could just as well have transferred the message itself through the same channel.

Dirk PraetJune 1, 2017 3:48 AM

@ keiner

Sorry, but, can hardly believe...

The 645421 (ECRYPT-CSA) CEC project number checks out and even mentions some other well-known contributors like Shamir, Daemen, Rijmen and Preneel. But it's specifically about challenges in authenticated encryption and only briefly mentions the breaking of RSA-2048 by quantum computers. Both EU ECRYPT-CSA and PQCRYPTO projects indeed seem to fall under the Horizon 2020 umbrella, probably set up by Bernstein (et al.) in his capacity as professor of mathematics and computer science at the Eindhoven University of Technology in the Netherlands.

The man has quite an impressive track record - also in fighting the USG, cf. Bernstein v. United States - so I am a bit puzzled here. The main problem with NIST is that by statute they are still required to consult with the NSA, which makes pretty much everything they come up with kinda suspect. In their defense, they did reopen the public comment period for the SP 800-90 publications and rescinded Dual EC DRBG from that standard. Perhaps this was enough for DJB, but like most others here, I remain kinda skeptical.

Does anyone know about European, Russian or international NIST counterparts?

rJune 1, 2017 4:46 AM

With lattice and ring operations, this isn't the only PK PQC option available.

Not sure how this RSA variant is advantageous outside of its altered resource scales vs competing schemes.

Ring and lattice are slow, I believe, but not nearly as bounded.

rJune 1, 2017 4:48 AM

The words "parameters" and "nist" in the same document should be classified under "depth charge".

rJune 1, 2017 4:53 AM

The argument against OTP in this case v pqRSA could be reasonable for active states and those capable of creating dedicated devices.

Radio anyone?

Clive RobinsonJune 1, 2017 5:11 AM

@ Dirk Praet,

Does anyone know of any European, Russian or international NIST counterparts?

In what way?

Most countries have their own standards bodies, even in Europe, though many defer to other standards bodies such as CEN and CENELEC in Europe and ISO and IEC internationally, along with UN-affiliated bodies such as the ITU. When it comes to "putting on the market", goods usually carry the standards marks for multiple markets; thus mobile phones would carry the European CE mark and the US FCC mark.

When it comes to trusting NIST, many of their standards can be verified by common sense and a slightly jaundiced point of view. However, this does not apply to any of the ICT standards pertaining to security or its subfields, where, as you note, the NSA takes an interest.

Such standards are at best difficult for other experts in the field of endeavour to verify, as was seen with the Dual EC DRBG. And when the opposition employs a vast number of such experts, the difficulties get worse.

However, if NIST were tasked with providing 4 Kbit primes, you don't have to trust them implicitly. Anybody with a spare PC or two could, as with protein folding or SETI@home, randomly take an individual prime and check it, either for free or as a "proof of work", which some people are getting keen on for other security protocols.

The question then becomes how you can scale up such a probabilistic method to give you the desired security margins.
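To make the "check it on a spare PC" step concrete, here is a minimal Python sketch of such a spot-check using the standard Miller-Rabin probabilistic primality test (an illustration of the general idea, not any actual NIST or distributed-computing protocol); each passing round leaves a composite at most a 1/4 chance of surviving, so confidence scales with the work donated.

```python
import random

def miller_rabin(n, rounds=40):
    """Probabilistic primality check: a composite n survives one
    round with probability below 1/4, so 40 passing rounds leave an
    error chance under 2^-80."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 as d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness: n is definitely composite
    return True  # probably prime

# Spot-checking one candidate from a published batch:
print(miller_rabin(2**127 - 1))  # True: a known Mersenne prime
```

Anyone can run such a check independently; the open question Clive raises is how many independently checked primes give the desired overall margin.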

MrCJune 1, 2017 7:49 AM

@ Cassandra

How would encryption work? I.e., how would I go from a plaintext plus an ordered set of (unsolvable-to-me) equations to a ciphertext?

TSJune 1, 2017 9:01 AM

Uh, part of quantum computing was the whole "quantum entanglement" thing, which would allow 2 computers on opposite sides of the world to read 2 bytes and know the value of the other quantum byte without any transfer time. Perfect security.

RSA isn't applicable to quantum computing, because it doesn't need to be - in theory.
Static data still needs to be obfuscated, obviously, but 1-terabyte keys won't be needed. You have a unique entanglement with your ISP or google's server; it will know who you are without needing to send any keys.

call girlJune 1, 2017 9:25 AM

@Clive Robinson

The reality of modern systems is that there is no practical way to verify they are trustworthy. It was comming to this realisation

This is 33° frat-boy-speak. The spelling must improve. No true Scotsman confuses "is" with "ought to be." Those who make such a confusion are of the sort who can't, won't, don't.

last century that made me start down the road to the "mitigation approach" that gave rise to Castles-v-Prisons.

Mitigation is good. Layers are good. The Castle approach and the Prison approach are not mutually exclusive, but both necessary. Bad things from the outside must stay out. Bad things that happen inside must be contained in order to mitigate the damages. The logical premises of these most basic aspects of security must be sound, verifiable, trustworthy, inviolable, and fully borne out in source and executable code and in the hardware on which it runs.

Clive RobinsonJune 1, 2017 10:04 AM

@ Call Girl,

The logical premises of these most basic aspects of security must be sound, verifiable, trustworthy, inviolable, and fully borne out in source and executable code and in the hardware on which it runs.

The problem is even "Uncle Sam" has come to realise that "verifiable" is not actually realistically possible, even if you make everything from the chips upwards yourself, because verifying that things have not been tampered with requires you, in effect, to destroy them...

Worse, things can be done on-chip that very few understand or can spot... In effect you cannot verify what you do not know how to find and exclude.

A few years ago the DoD started projects to in effect verify the supply chain. As @Nick P pointed out they now appear to have disappeared. What we don't know is if it's behind the veil or on the scrap heap.

CassandraJune 1, 2017 12:13 PM

@MrC

I think a possible method is described here, although it is way, way, over my head.

https://pacific-mathforindustry.springeropen.com/articles/10.1186/s40736-015-0014-4

I am not endorsing it, or making any claims to its correctness, or security, simply advising of its existence.

My thoughts were simply that the difficulty to solve, but ease to verify a known solution of Diophantine equations made them a possible candidate for a trapdoor function. On the other hand, people with far better mathematical minds than I have been looking for good such functions for some time, and surely would not have missed Diophantines, so I'm just being ignorant again.

Cassie

CassandraJune 1, 2017 12:59 PM

@MrC

The paper I referenced has references to public-key systems built using Diophantines:

A New Public-Key Cipher System Based Upon the Diophantine Equations
http://doi.ieeecomputersociety.org/10.1109/12.368013

[Note: the system was broken by subsequent analysis:
Cryptanalysis of a Public Key Cryptosystem Based on Diophantine Equations via Weighted LLL Reduction
https://link.springer.com/chapter/10.1007/978-3-319-44524-3_18 ]

The key exchange cryptosystem used with higher order Diophantine equations
https://arxiv.org/abs/1103.3742?context=cs.IT

On a key exchange protocol based on Diophantine equations
http://www.hte.hu/documents/169298/393368/2013_3_3_Hirata-Kohno.pdf

The last one is a good read.

Cassie.

ab praeceptisJune 1, 2017 5:17 PM

Clive Robinson, call girl

First a general remark. When I address someone here the intention usually is *not* to argue against that person but merely to indicate to whom I respond or on whose post I relate.

While I largely agree with you (and am actually pleased to see someone else here thinking straight along a "hardcore" line), it seems to me that your way of addressing Clive Robinson is inadequate. Clive has quite a reputation here, and well deserved, I'd say.

Besides, he is right. The two of you, it seems to me, just come from slightly different perspectives. Clive's perspective is pragmatic, and I'd agree with his statement that existing modern systems can't really be verified (to be safe and secure). They are simply too complex and too ignorantly designed.
You, on the other hand, make a (good and correct) statement about how systems should be. Btw, I'm quite certain that Clive would agree with you - unfortunately, however, that's not how things are.

Just look at the chip circus. Profit-driven customers deliver some kind of design spec which has rarely been drawn up with safety and security (as we understand it) in mind. The profit-driven fab then transforms those designs into something siliconizable and produces the chips; when they do testing, they test for a) some basic internal parameters and b) customer spec conformance, and that's it.

Looking later at the overall safety and security of a system consisting of that hardware and some software (usually multiple layers, some of which come from who knows where), the result is usually lousy - and btw very difficult to assess in the first place.

ab praeceptisJune 1, 2017 6:06 PM

Clive Robinson

I get your point, but I've lost my belief in mitigation (although one might argue that, looking at the current state, pretty much all we do comes down to mitigation). Moreover, mitigation by definition is not the best solution; after all, mitigation implies that something went wrong in the first place.

But what really triggered this response was the link to the imperial violet article.

What he describes there is sad on multiple levels (one of which is his own view). Pretty much everything there is rotten and mistaken, and it actually describes the sad state of things and why we got here in the first place.

Example: "protocols *should* be extensible" - uhum. Because the holy see of google or some mistaken professors said so? Looking at that statement from a practical perspective, we quickly see the real driving force behind it: something like "We never quite get it right, so let's at least design crap in a way that allows for new crap versions".

And besides, it's logically questionable. What *is* a new version? Actually - and more healthily - we can describe a new version as a new "product" that happens to reuse some work and some interfaces from another "product" (that happens to be called "the old version").
But there's more ...

What is that thing called a protocol? And how is it different from the interface? To make it short, everything gets mixed up - which again explains why we have the whole problem in the first place.
When he brought up TLS as an example I wondered whether I should sarcastically smirk or whether I should be saddened and cry.

Let's look somewhat deeper: there is a plethora (OK, a small one, but still) of "AES extension" algorithms. A considerable part of the AEAD candidates, for instance, either extend AES, or (more or less slightly) change an AES phase, or at least reuse major parts of AES. Some even spell out their reasoning clearly: it's proven to work and very extensively examined, and it brings along a solid security reduction; so, hardly surprisingly, quite some of the candidates quote AES's security reduction ...

How does that work? And how is it that we have AES-128 and AES-256 and, if needed, can create a 512-bit "extension" too? Obviously the answer is that the cryptologists (had to and) did work professionally and methodically - very much unlike the TLS weirdos and their (not so) funny friends at google and elsewhere.

To make the whole game funnier, quite some TLS users didn't care a rat's a** about the "protocol" anyway and, as he correctly describes, for instance just treated "version or protocol not supported" as an *error* rather than falling back to an available one.

*That* is the situation and I very strongly doubt that the google, mozilla, apache, and other people who drove us deep into that crap zoo will be able (or even *honestly* willing) to drive us back out. Frankly, I don't see much more than wild arm waving, some good will theater, and a major dose of ignorance and incompetence, of course.

In case you want an example for "wild arm waving, some good will theater, and a major dose of ignorance and incompetence":

They now formally verify the protocol safety. Wow. Even with a reasonable tool (ProVerif). There's just a tiny but very ugly problem: unless one understands the problem field very well, a ProVerif "All is OK!" is all but worthless. Why? ProVerif (and other spi-calculus/Horn-clause based tools) doesn't verify much of what one states; it merely cares about the protocol and runs a Dolev-Yao attacker against it (by default a mildly friendly one, btw).
Now, well noted, ProVerif *is* a fine tool and I'm pleased to see TLS 1.3 being tested with it. But there is this ugly problem ...

When you introduce a hash function ("fun hash/1.") ProVerif does not know, nor care, what that function does! To offer a brutal example: if hash/1 happened to be a function that simply adds 3 to its parameter, ProVerif would be quite happy. Similarly, it simply assumes that any function applied to a private ("secret") parameter returns a private/secret result.

As I said, it's a protocol verifier. A very useful one, a proven-to-be-solid one, and one that offers the comfort of running worst-case attackers against your protocol (if you explicitly ask it to); it's just that that kind of tool doesn't care about your functions and assumes them to be good and proper.

That means I could *prove* secure a protocol using MD5 hashes and a PK algorithm that simply multiplies two 32-bit integers mod 2^32 (if only the protocol itself is secure)!

In other words: by themselves those tools prove basically nothing of worth. Only as part of a good tool chain are they worthwhile and valuable. In particular, the building blocks (like key and hash functions) *must* be properly specified, implemented and verified (both algorithm and implementation) to offer a solid TLS.

We always scold openssl (and rightfully so) but the infestation goes way deeper.

replyJune 2, 2017 1:24 AM

"Worst case is we go OTP-like as another commenter suggested where we transfer the key material on encrypted HD's periodically"

If that happens, private search, shared space, and first- and third-party consent issues will need a very, very public educational refresher, including the sneak-and-peek and covert-entry angles.

MarkHJune 6, 2017 1:42 AM

@Cassandra:

Zhi Wei Sun proved that the solvability of a Diophantine equation is undecidable if it has a minimum of eleven integer variables

I suspect this may not accurately represent Sun's work.

If "does this equation have any solution?" is undecidable, then the equation has no solution. If there were any solution, discovery of a solution would answer the decision problem in the affirmative: it would constitute proof that yes, there exists a solution.

Therefore, the assertion that "the solvability of a Diophantine equation is undecidable if it has a minimum of eleven integer variables" implies that no Diophantine equation with a minimum of eleven integer variables has a solution ... and this cannot be the case.

I didn't find Sun's paper, but perhaps it addresses a less specific decision problem.

CassandraJune 6, 2017 4:27 AM

@MarkH

The paper's URL is:

http://maths.nju.edu.cn/~zwsun/htp.pdf

"ON HILBERT'S TENTH PROBLEM AND RELATED TOPICS"

The paper references later work in April of this year (2017):

"Further Results on Hilbert's Tenth Problem"

https://arxiv.org/abs/1704.03504

"...there is no algorithm to test whether an arbitrary polynomial Diophantine equation
P(z1, . . . , z11) = 0 (with integer coefficients) in 11 unknowns has integral solutions..."

The Wikipedia description of undecidability [ https://en.wikipedia.org/wiki/Undecidable_problem ] says:

"...an undecidable problem is a decision problem for which it is known to be impossible to construct a single algorithm that always leads to a correct yes-or-no answer."

Note carefully: the proof of a lack of an algorithm for determining if there is a solution or not does not mean the Diophantine equation has no solutions, it just means there is no single algorithm for finding them. So if you are presented with a polynomial Diophantine equation in 11 unknowns, you have no algorithmic means of determining if there is at least one integral solution. You could try random guessing, but that is a halting problem* - you never know if your random guessing will complete or not.

*the halting problem is another famous undecidable problem.
[ https://en.wikipedia.org/wiki/Halting_problem ]
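A toy Python sketch of why guessing runs into the halting problem (the search loop here is purely illustrative): enumeration certifies a solution if it stumbles on one, but when none exists the loop never terminates, and any cut-off you impose proves nothing either way.

```python
from itertools import count, product

def search(eq, num_vars, max_steps=None):
    """Enumerate integer tuples in growing boxes, looking for a zero
    of eq. If the equation has no solution, this loop runs forever;
    stopping early (max_steps) is inconclusive."""
    steps = 0
    for bound in count(1):
        box = range(-bound, bound + 1)
        for point in product(box, repeat=num_vars):
            if eq(*point) == 0:
                return point  # found and trivially verified
            steps += 1
            if max_steps is not None and steps >= max_steps:
                return None  # gave up, proves nothing

print(search(lambda x, y: x * x + y * y - 25, 2))                  # (-4, -3)
print(search(lambda x, y: x * x + y * y + 1, 2, max_steps=10000))  # None
```

The second call returns None only because of the arbitrary cut-off; x^2 + y^2 + 1 = 0 happens to have no integer solutions, but the search itself can never tell you that.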

JoaoJune 7, 2017 10:07 PM

I was under the impression that NTRU public-key crypto was already available under two licenses (an open-source license and a non-open-source license) here: https://github.com/NTRUOpenSourceProject/ntru-crypto and that it should provide a good level of security in public-key technology even in a post-quantum era.

I've also seen that McBits may be a nice alternative, as could NewHope, or even supersingular isogeny schemes.

This new RSA key... will definitely not stick :p

Apparently someone is already doing the necessary work for testing with a fork of openssl at https://openquantumsafe.org/ - at least for key exchange; the public/private key digital certificate part seems to be missing.


Photo of Bruce Schneier by Per Ervland.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.