But it is much easier and more performant to use ECC; you need far fewer random bits.

Whilst harder to understand than RSA, ECC does have significant advantages currently.

Hence RSA should be “deprecated” and heading for the scrap heap (not that ECC might have that much more of a life).

On another note, PQC has had a knock or two recently. I guess we will just have to wait and see if the NIST contest was actually worth it…

So the real nature of the beast inside this problem is the “lack of entropy” for generating independent true random numbers.

The naive algorithm can get very unlucky. Q could be a twin prime of P, so factoring the semiprime just requires extracting the square root of the Semiprime+1.
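As a quick illustration of that worst case, here is a hedged sketch using a well-known twin-prime pair rather than anything generated:

```python
from math import isqrt

# If Q = P + 2 (twin primes) then N = P*(P+2) = (P+1)^2 - 1,
# so sqrt(N + 1) - 1 recovers P with a single integer square root.
P, Q = 1000000007, 1000000009   # a well-known twin-prime pair
N = P * Q

root = isqrt(N + 1)             # N + 1 is the perfect square (P+1)^2
assert root * root == N + 1
print(root - 1, root + 1)       # recovers P and Q
```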

With regard to the Hamming distance of two alleged primes of equal size, say N bits, the Hamming distance will almost always be close to (N-2)/2, which does not tell you anything. Unless it does not hold, in which case you either do not have two primes or, as you noted, the random is not random.

Primes almost always have about half of the bits between the endpoints set to one, the other half being zero.

I think that is related to mod 3 stuff.
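A quick empirical check of the two observations above (a toy sketch with 32-bit primes and trial division, nothing production-grade):

```python
import random
from math import isqrt

# Toy check: random same-size primes have a Hamming distance that
# clusters near (N-2)/2 (the shared top and bottom '1' bits add
# nothing), and the bits between the endpoints are about half ones.

def is_prime(n):
    if n < 2 or n % 2 == 0:
        return n == 2
    return all(n % d for d in range(3, isqrt(n) + 1, 2))

def random_prime(bits):
    # Random middle bits, with MSB and LSB forced to 1.
    while True:
        c = (random.getrandbits(bits - 2) << 1) | (1 << (bits - 1)) | 1
        if is_prime(c):
            return c

N = 32
dists, weights = [], []
for _ in range(200):
    p, q = random_prime(N), random_prime(N)
    dists.append(bin(p ^ q).count("1"))
    weights.append(bin(p).count("1"))
print(sum(dists) / len(dists))      # clusters near (N-2)/2 = 15
print(sum(weights) / len(weights))  # near 2 + (N-2)/2 = 17: endpoints set
```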

As to making the P and Q not close (but also not too far apart), it is probably more secure to not use equal bit sized primes.

For example, you want a 512 bit sized semiprime. Instead of using two different 256 bit sized primes, you use a 260 bit sized prime, and a 252 bit sized prime. Checking the Hamming Distance by sliding the 252 bit string over the 260 bit string may be interesting.
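The “sliding” comparison could look something like this (a toy sketch comparing an 8-bit string against a 12-bit one instead of 252 against 260; the function name is made up for illustration):

```python
# Sketch of the "sliding Hamming distance" idea: compare a shorter
# bit string against every aligned window of a longer one.

def sliding_hamming(longer: int, long_bits: int, shorter: int, short_bits: int):
    """Hamming distance of `shorter` against each window of `longer`."""
    mask = (1 << short_bits) - 1
    out = []
    for shift in range(long_bits - short_bits + 1):
        window = (longer >> shift) & mask
        out.append((shift, bin(window ^ shorter).count("1")))
    return out

# Toy sizes (8 bits over 12) instead of 252 over 260, same principle:
print(sliding_hamming(0b101101110010, 12, 0b10110111, 8))
```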

You still have to extract the same amount of random bits. You still have to find the closest prime, which also assumes that your primality-check algorithm does not fail. You will still need 508 random bits.

But it is much easier and more performant to use ECC. You need far fewer random bits, and you do not have to check for primes.

You actually can roll dice to create your private key.

Nobody in this thread suggested using ‘ring oscillators’, nor are they the topic being discussed.

Wrong and wrong.

You said,

You don’t need a potentially infinite loop for Q.

Which is wrong.

As even you know that with,

unless your random bit source has serious problems to start with and your adversary knows that.

So you pulled in the TRNG for scrutiny in your choice of statement.

So, if you understood the issue from a more practical perspective, as I indicated above with,

Especially in what are effectively “Embedded Systems”, where entropy sources are very nearly non-existent.

You would see why I also said that,

Thus the desire to “Cheat The System” by not generating “True” “Independent” Randomness is very high, and very much higher for Q than P…

In short, “Embedded Systems” are nearly all “solid state” these days, and on a quiet network segment, such as in a test/setup lab, getting enough entropy for over a thousand “Truly Independent” “Truly Random” bits[1] could take a very long time.

It’s one of the reasons we have On-Chip TRNGs these days, and if you checked you would find that Ring Oscillators with D-Type latch mixing are very prevalent.

This lack of entropy on Embedded Systems is, by the way, very far from being a “recent issue”, as the finder of this occurrence kind of implies with his “not seen before” date. I’ve seen all sorts of issues where people just do not understand what is involved with TRNGs.

[1] Oh you might want to ask why I said three bits, and what they should actually be as well…

It would be N-2 and make the MSB and LSB ‘1’ and then the increment would be by an even number.

You could also generate the start value to be a multiple of the smallest n primes, not just 2, then add 1 and use an increment sequence that avoids multiples of those small primes. That is just an optimisation. The point here is that the MS bit set to 1 ensures that the prime will have a defined size.
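That optimisation can be sketched with the classic mod-30 wheel for the primes 2, 3 and 5 (the function name and sizes here are illustrative, not from any standard):

```python
# Sketch of the wheel idea: snap the start value to a multiple of
# 2*3*5 = 30 plus 1, then step with increments that skip every
# multiple of 2, 3 and 5 (the classic mod-30 wheel).

WHEEL = [6, 4, 2, 4, 2, 4, 6, 2]   # gaps between residues coprime to 30

def wheel_candidates(start, count):
    """Return `count` candidates > start, none divisible by 2, 3 or 5."""
    n = start - start % 30 + 1      # snap to 30k + 1, as suggested
    i = 0
    out = []
    while len(out) < count:
        if n > start:
            out.append(n)
        n += WHEEL[i % len(WHEEL)]
        i += 1
    return out

cands = wheel_candidates(100, 8)
print(cands)   # [101, 103, 107, 109, 113, 119, 121, 127]
```

Only these candidates would then go through the (expensive) primality check.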

Which are small-fry issues compared to Failure, Bias, Interdependence

Indeed, but you are diverting the discussion. The topic was how to generate P and Q so that the difference of squares (Fermat) method would be useless in practice to find them.

I’ve described this several times before,

Indeed you have, so there is no need to repeat this all over again and again.

Nobody in this thread suggested using ‘ring oscillators’, nor are they the topic being discussed.

You don’t need a potentially infinite loop for Q.

Actually you do… If all goes well you need to,

1, Truly Randomly generate,

2, Truly Independent bits,

3, Sufficient to allow all valid P’s and Q’s to be generated.

We know that three bits of N (the PQ multiple) have a fixed value (1xx..xx01). Therefore, as a really rough estimate, if nothing goes wrong then you would need PubKeyLen-3 Truly Random and Truly Independent bits.

Not,

To generate an N-bit prime: get N-1 random bits, add a ‘1’ as MS bit, increment until prime.

It would be N-2 and make the MSB and LSB ‘1’ and then the increment would be by an even number.
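In code, that corrected recipe might look like this (a sketch; the Miller-Rabin routine here is a toy stand-in for a production primality check):

```python
import secrets

# Sketch of the recipe: take N-2 random bits, force the top and
# bottom bits to 1 (so the candidate has exactly N bits and is odd),
# then step by 2 (an even increment) until a probable prime is found.

def is_probable_prime(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = secrets.randbelow(n - 3) + 2          # witness in [2, n-2]
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_prime(bits):
    # N-2 random middle bits, MSB and LSB forced to 1:
    cand = (1 << (bits - 1)) | (secrets.randbits(bits - 2) << 1) | 1
    while not is_probable_prime(cand):
        cand += 2                                 # stays odd
    return cand

p = random_prime(64)
print(p.bit_length(), p % 2)
```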

Which are small-fry issues compared to,

1, Failure

2, Bias

3, Interdependence

On the True Random Bit Generator (TRBG).

Most chip-based TRBGs are based on very lousy “CMOS Inverter Ring Oscillators” that are “Digitally Mixed” via D-Type Latches. These are so bad that the chip designers hide the deficiencies behind “crypto algorithms”, the implication being “hash algorithms”. But as an observer of only the output, you would not be able to tell if it was actually a “Block Cipher” with a key known to some people but not the majority (which is quite likely, as the gate count, and thus silicon real-estate, would be smaller, and the likes of AES hardware is built into CPUs these days).

Importantly, crypto algorithms do not in any way create Shannon Entropy, only deterministic complexity. Sadly that is a point way too few appear to understand the implications of, even these days. It’s why for many years I’ve called the practice of “hiding” a TRBG behind a crypto algorithm either a “scam” or deluded “Magic Pixie-dust thinking”.

One of the issues with “Ring Oscillators” is that the entropy they produce at the output is caused by the same process that can cause “soft latch-up” metastability.

I’ve described this several times before, and those interested will find one example on,

https://www.schneier.com/blog/archives/2013/10/insecurities_in.html

If you really want to dig into it, have a look at Uni of Cal Prof. Behzad Razavi’s books “Design of Analog CMOS Integrated Circuits” and “Design of CMOS Phase-Locked Loops”, the second of which goes into the design of CMOS inverter Ring Oscillators in some depth. Alternatively, if you want to start at the beginning, the early 1970’s Motorola CMOS data book / App Notes are a gentler intro.

As for,

As for Hamming distance, this is really a non-problem unless your random bit source has serious problems to start with and your adversary knows that. In fact, requiring a minimal value here could be worse than just leaving it to chance.

Obviously the Hamming Distance should not be “zero”, and low numbers and “same bits set” patterns are suspect.

The purpose is not actually to use the Hamming Distances individually, but as a fast way to take a reading of the health of the TRNG. How much analysis you do is up to you, and it rather depends on what your security criteria are.

After all, even professionally designed TRNGs that ordinarily have better than 32 bits of entropy can, just by being illuminated with a quite low power 10 GHz CW EM source, have their entropy taken down to a little less than 7 bits.

A search space of ~4 billion is achievable on modern computers, whilst a search space of ~100 is almost in range of “by hand”. Thus detecting dynamic changes is something that TRNG designers/users do actually need to be aware of.

Once for P

One or more times for Q

You don’t need a potentially infinite loop for Q.

To generate an N-bit prime: get N-1 random bits, add a ‘1’ as MS bit, increment until prime.

Do this for P and Q, with different sizes N, adding up to the required key size.

The numerical difference between P and Q will be of the same order of magnitude as the largest one even for a small difference between the sizes.
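A toy numeric check of that size claim (20-bit and 16-bit primes standing in for 260 and 252 bits; trial division is fine at this scale):

```python
from math import isqrt

# Even a small size difference (20 vs 16 bits here) makes P - Q the
# same order of magnitude as P itself.

def next_prime(n):
    def is_prime(m):
        if m < 2 or m % 2 == 0:
            return m == 2
        return all(m % d for d in range(3, isqrt(m) + 1, 2))
    n += 1
    while not is_prime(n):
        n += 1
    return n

p = next_prime(1 << 19)          # smallest prime above 2^19 (20 bits)
q = next_prime(1 << 15)          # smallest prime above 2^15 (16 bits)
print(p, q, (p - q) / p)         # ratio is close to 1
```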

As for Hamming distance, this is really a non-problem unless your random bit source has serious problems to start with and your adversary knows that. In fact, requiring a minimal value here could be worse than just leaving it to chance.

This has publicly “untied the bag”, but it is still up to you to “look in the bag” to see if it’s a piglet or a cat[1]…

As indicated there is a simple algorithm that will drop you into this mess…

1, Get a True Random Number “R1”.

2, Seed a counter “X” with R1.

3, Inc X until a Prime is found.

4, Use Prime as “P”.

5, Inc X until a Prime is found.

6, Use Prime as “Q”.
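The naive steps 1–6 above can be sketched as follows (a toy with 24-bit numbers and trial division; note that Q comes out as the very next prime after P, which Fermat’s difference-of-squares method then exploits):

```python
import random
from math import isqrt

# The naive algorithm in toy form: one random seed feeds a counter,
# and both P and Q are pulled from it, so Q is simply the next prime
# after P.  Fermat's method then finds them almost immediately.

def is_prime(n):                      # trial division: fine for a toy
    if n < 2 or n % 2 == 0:
        return n == 2
    return all(n % d for d in range(3, isqrt(n) + 1, 2))

random.seed(42)                       # stand-in for the "true random" R1
x = random.getrandbits(24) | 1        # steps 1-2: seed counter X (odd)
while not is_prime(x):                # step 3
    x += 2
p = x                                 # step 4: P
x += 2
while not is_prime(x):                # step 5: same counter, no reseed
    x += 2
q = x                                 # step 6: Q is the very next prime

n = p * q
# Fermat: search for a near sqrt(n) with a^2 - n a perfect square.
a = isqrt(n)
while True:
    a += 1
    b2 = a * a - n
    b = isqrt(b2)
    if b * b == b2:
        break
print(p, q, a - b, a + b)             # a-b and a+b are the factors
```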

What you should actually do after step 4 is,

Generate a second “independent” “true” random number “R2”. Check if R2 is “sufficiently different”[2] from the first random number R1. If not, generate another independent true random number for R2, and “loop until” R2 is sufficiently different to R1. Only when it is sufficiently different do you reseed the counter X with R2.

Then carry on at step 5.

Why do people not do this? Well, there are a couple of reasons.

Firstly, programmers instinctively do not like potential endless loops… That is, that “loop until sufficiently different”.

Secondly, they especially don’t like them if the generator process that the loop depends on is itself non-deterministic, slow or unreliable. Which the process of generating “Independent True random numbers” can all too easily be seen as.

That is, TRNGs can take a very, very long time to come up with 1/4 of the RSA key length in bits. Especially in what are effectively “Embedded Systems”, where entropy sources are very nearly non-existent.

Worse, you have to do the generation process at least twice,

1, Once for P

2, One or more times for Q

But what makes it really bad is that whilst the entropy for P can be effectively gathered in the background, that for Q can not… Therefore the program “halts” for an unknown period of time that could be “for ever” in human terms.

Thus the desire to “Cheat The System” by not generating “True” “Independent” Randomness is very high, and very much higher for Q than P…

So the real nature of the beast inside this problem is the “lack of entropy” for generating independent true random numbers.

How you fix this, by the way, is not by using “on-chip TRNGs”. Because as far as security is concerned they “just move the problem, not solve it”…

[1] Comes from a very early confidence trick. Basically, at a “livestock market” it was very usual to find “piglets to bring on” being sold (fatten it up as it grows, slaughter it in late autumn, and it will give your family meat and, more importantly, fat over winter). The problem with piglets is that they are very wriggly and so escape very easily, therefore you would be given the piglet you bought in a tied sack… The thing is, unless you look in the bag, you don’t know if you have a valuable piglet or a worthless cat, but if you do look there is a very good chance what is in the sack will escape… So often people did not look, and so it was possible to pass off a cat in a bag as a piglet in a bag, hence the old caution.

[2] The questions of what “sufficiently different” means and how you test for it arise. In theory just one bit of difference “can be sufficient”, but you actually want the difference to be spread across all the bits somehow. And that’s where the fun starts with True Random Number Generators, because the reality is it’s very hard to remove bias and all dependence. One way is to measure the “Hamming Distance”,

https://en.m.wikipedia.org/wiki/Hamming_distance

Which, thankfully, for truly independent binary strings of equal length has a relatively simple test: XOR them together, then count the number of set bits in the result. The problem is if they are not “truly independent”… that is, if some partially or fully deterministic process is used, by accident or design, in the strings’ generation.
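That XOR-and-count test, as a sketch (function name is illustrative):

```python
import secrets

# Independence check from [2]: XOR two equal-length bit strings and
# count the set bits.  For truly independent uniform strings the
# count is Binomial(n, 1/2), so it should sit near n/2.

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

n = 1024
r1 = secrets.randbits(n)
r2 = secrets.randbits(n)
d = hamming_distance(r1, r2)
print(d)                         # expect around n/2 = 512

# A correlated pair is caught immediately: deriving R2 from R1 by
# flipping one bit gives distance 1, nowhere near n/2.
print(hamming_distance(r1, r1 ^ 1))
```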

See https://arstechnica.com/information-technology/2022/03/researcher-uses-600-year-old-algorithm-to-crack-crypto-keys-found-in-the-wild/

h00ps://www.cisa.gov/uscert/ncas/current-activity/2021/06/30/printnightmare-critical-windows-print-spooler-vulnerability

At this rate, the usual “it’s the cat” or “it was the dog” is going to be substituted by “printer” or anything print-related.
