So just keep using RSA.

I am wondering whether there is any way to improve how the primes are created, so that no duplicate public keys are possible.

]]>Although I am a bit late to this discussion, I do not understand why all this enormous intellectual strain goes into figuring out new PRNGs, when even a lousy hardware generator would give each bit its own piece of entropy. It is true that HRNG bits will be somewhat correlated and somewhat biased (without postprocessing, which is not a good idea), but that is still infinitely better than a handful of perfectly correlated bits produced deterministically by any PRNG.

It is sad that people are spending so much time and research effort on a futile topic, especially for crypto purposes, where speed and perfect unbiasedness are of no importance.

]]>[1] If “the random number used for your ‘random’ start point is weak…”

I take “weak” to mean here, that the number is to any significant extent predictable to an adversary. In such case, the resulting key is insecure, no matter how else the rest of the procedure is done. The initial “guess” must be unpredictable, or it’s game over.

[2] “…it becomes increasingly likely that you will end up picking the same prime number as someone else”

Yes, but all the same, very, very unlikely indeed. Typically, 1024-bit keys are based on randomly chosen 512-bit primes. The “narrow range” of 512-bit integers contains (if I did my calculation correctly) about 2 × 10^151 primes.

If (say) one trillion trillion (that is, 10^24) such keys were generated, the probability that there would be even a single duplicated factor among ALL of them, is less than 10^-100.
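The arithmetic is easy to spot-check. Here is a back-of-the-envelope sketch of my own (not the commenter's code), using the prime number theorem to estimate the pool of 512-bit primes and a standard birthday bound for the collision probability:

```python
import math

# Prime number theorem: pi(x) ~ x / ln(x), so the number of 512-bit
# primes is roughly pi(2**512) - pi(2**511).
def approx_pi(x: float) -> float:
    return x / math.log(x)

n_primes = approx_pi(2.0**512) - approx_pi(2.0**511)
print(f"512-bit primes: about {n_primes:.1e}")   # on the order of 10**151

# Birthday bound: drawing k values uniformly from a pool of size N,
# P(any collision) <= k * (k - 1) / (2 * N).
k = 2 * 10.0**24   # two primes per key, 10**24 keys
collision_bound = k * k / (2 * n_primes)
print(f"collision bound: about {collision_bound:.1e}")
```

Both figures come out as the comment says: roughly 2 × 10^151 primes, and a collision bound comfortably below 10^-100.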

[3] “Everyone is using the exact same primality test, after all.”

No. There are three well-known tests for pseudoprimality (that can tell whether a number is probably prime), and two practical tests for primality (that can prove a prime to be so), which might be used in real-world cryptosystems. Most of these tests have parameters that must be chosen by the designers, and so may exist in numerous variants. Also, two of the pseudoprime tests require “random” numbers, and so might never run the same way twice.

When RSA keys are based on pseudoprimality testing, there is a risk that a modulus factor will be composite, which of course results in an insecure key. The system designer can choose the test parameters to set the probability of this disaster to a level that is judged to be acceptable. However, aside from this risk, the test chosen doesn’t affect the probability of duplicated factors.
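For readers who have not seen one, here is a sketch of Miller-Rabin, the most widely used of the randomized pseudoprime tests alluded to above. The round count is exactly the kind of designer-chosen parameter the comment mentions; each round a composite survives with probability at most 1/4. This is illustrative, not production code:

```python
import random

def is_probable_prime(n: int, rounds: int = 40) -> bool:
    """Miller-Rabin pseudoprimality test (sketch)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 as d * 2**s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # 'a' is a witness: n is definitely composite
    return True            # probably prime; error prob <= 4**(-rounds)
```

With 40 rounds the residual error (a composite slipping through) is at most 4^-40, which is the “judged to be acceptable” knob.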

]]>*Everyone is using the exact same primality test, after all*

Err no they are not…

In the not-so-distant past, aside from the “brute force” method, there was no definitive test for primality of very large numbers, just a probabilistic test, which had the advantage of being very quick.

Then three researchers in India (Agrawal, Kayal and Saxena) came up with a definitive test, which, whilst not quite as quick as the probabilistic method, was “none too shabby”.

There are also other ways of finding primes more quickly than sequential search from a random point.

One of these is based on the simple observation that the two numbers on either side of the product of the first few primes (including 2) have a better-than-average chance of being “twin primes” (2×3×5 = 30, giving 29/31). Also, contrary to what is often said, primes do have a fairly regular pattern, because they effectively “reflect” around these points and around each sub-multiple down towards the next lowest point.
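The primorial ± 1 claim is easy to spot-check. A quick sketch of my own (trial division is fine at these sizes); note the pattern is a better-than-average chance, not a guarantee:

```python
def is_prime(n: int) -> bool:
    # simple trial division, adequate for these small numbers
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

# primorials 2, 2*3, 2*3*5, ... -- the "reflection points"
primorial = 1
for p in (2, 3, 5, 7, 11, 13):
    primorial *= p
    print(primorial, is_prime(primorial - 1), is_prime(primorial + 1))
# 30 gives 29/31 and 2310 gives 2309/2311, but 210 does not
# (209 = 11 * 19), so it is a tendency, not a rule

# the mod-4 observation further down: 29 ends ...01, 31 ends ...11,
# so their product is 3 mod 4
print(29 % 4, 31 % 4, (29 * 31) % 4)   # 1 3 3
```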

To see this, think of each prime as the fundamental of a harmonic generator: each prime strikes out all of its multiples from further consideration, which is the basis of the sieve method. Now write out all the numbers from 0 to 60 and underline those that are prime. Then examine the pattern on either side of 30: notice any similarities around that point? How about 60–90, 90–120, etc.?
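The exercise above can be done in a few lines (a sketch of my own, not the commenter's code). A sieve of Eratosthenes finds the primes, and the offsets of primes from each multiple of 30 show the “reflection”: every prime above 5 sits at a residue coprime to 30:

```python
def sieve(limit: int) -> list[int]:
    """Sieve of Eratosthenes: each prime strikes out its multiples."""
    flags = [True] * (limit + 1)
    flags[0] = flags[1] = False
    for p in range(2, int(limit**0.5) + 1):
        if flags[p]:
            for m in range(p * p, limit + 1, p):
                flags[m] = False
    return [n for n, f in enumerate(flags) if f]

primes = set(sieve(120))

# offsets of nearby primes from each multiple of 30
for base in (30, 60, 90):
    offsets = sorted({abs(p - base) for p in primes
                      if 0 < abs(p - base) <= 15})
    print(base, offsets)   # each prints offsets drawn from {1, 7, 11, 13}
```

The surviving offsets are the residues coprime to 2, 3 and 5, which is why the pattern repeats around every multiple of 30.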

Thus, if you were lazy, you could encode the offsets from a suitably high reflection point, along with the first couple of hundred primes. Ask how long a prime the user wants, find the nearest reflection point, and use the offsets to give yourself a much greater chance of finding two primes very quickly.

Oh, and these reflection-point twin primes have one prime whose two least significant bits are both 1, while the other ends in 01, which means that the result of multiplying them is 3 mod 4. However, as valuable as this may appear to be, I would not use them 😉

]]>To “generate” the prime numbers necessary for encryption, you start at a “random” point within a specific range, say, 200-digit numbers.

You then test the candidates for primality.
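That procedure can be sketched as follows. This is my own illustrative code, not any particular library's: a compact Miller-Rabin stands in for “the primality test”, and `secrets` supplies the unpredictable starting point that the replies above insist on:

```python
import random
import secrets

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    # compact Miller-Rabin, assumed as the candidate test
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_prime(bits: int) -> int:
    """Pick a random odd start of the requested size, then walk
    upward through odd candidates until one passes the test."""
    n = secrets.randbits(bits) | (1 << (bits - 1)) | 1  # full size, odd
    while not is_probable_prime(n):
        n += 2
    return n
```

If the start point is unpredictable and full-size, two parties colliding on the same prime is the astronomically unlikely event computed in an earlier reply; if the start point is weak, no choice of test repairs that.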

If the range where you search for big prime numbers is too narrow (it is highly likely that everyone uses the same range), or the random number used for your “random” start point is weak, it becomes increasingly likely that you will end up picking the same prime number as someone else. Everyone is using the exact same primality test, after all. ]]>

Amen to that.

]]>The world has changed! “Proprietary” used to refer to technology that was secret, and/or for which some kind of license was required.

Apparently, the term has been redefined: “proprietary” is now any technology for which I need to pay more than $0.00, and/or requires that I move my bottom from my chair.

Bruce’s “Practical Cryptography” can be bought used from Amazon for less than $16.

I’m old enough to remember making trips to the library, in order to learn cool and useful stuff.

Sincerely,

A. Fossil

]]>Let’s say I want to implement Fortuna – with some effort, I can get access to the book, one way or another.

Let’s say I want to implement a CSPRNG – probably I will skip Fortuna, because it’s not publicly available.

But now the interesting scenario – somebody else has implementation called “Fortuna”, and I need to evaluate whether it works and has not been weakened somehow – intentionally or not. Seems I’m out of luck then…

]]>