I was about to offer a small correction -- that the idea is not strictly a myth -- but I looked at US patent 4405829 A filed in 1977, and it does indeed specify that the decryption exponent be computed as the inverse of the encryption exponent mod lcm(p-1, q-1), which is identical to lambda(p*q) [also known as the Carmichael function]. So I'm in your debt -- I was not aware that RSA was originally specified that way!

That being said, computing the decryption exponent by the usual weaker criterion of inverse mod phi(p*q) is sufficient to guarantee the property of completeness; and further, to my knowledge, the potential non-uniqueness of the decryption exponent d does not give rise to any practical attack.
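To make the difference concrete, here is a minimal sketch with toy primes (61 and 53, far too small for real use; Python 3.8+ for the modular-inverse form of pow) showing both conventions side by side. Both exponents decrypt every message; they are simply different numbers.

```python
from math import gcd

p, q, e = 61, 53, 17                    # toy parameters, far too small for real use
n = p * q
phi = (p - 1) * (q - 1)                 # Euler's totient of n
lam = phi // gcd(p - 1, q - 1)          # lcm(p-1, q-1), i.e. Carmichael's lambda(n)

d_lam = pow(e, -1, lam)                 # the patent / original-paper convention
d_phi = pow(e, -1, phi)                 # the usual textbook convention

for m in range(n):                      # completeness: every residue round-trips under both
    c = pow(m, e, n)
    assert pow(c, d_lam, n) == m
    assert pow(c, d_phi, n) == m

print(f"lambda(n) = {lam}, phi(n) = {phi}")            # 780 and 3120
print(f"d mod lambda = {d_lam}, d mod phi = {d_phi}")  # 413 and 2753
```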

A purely academic question would be the historical inquiry into how phi came to be substituted for lambda in so many textbooks, websites and other presentations.

In practice, security applications of RSA choose one exponent to be small (that is, representable in a number of bits far less than the length of the modulus n=p*q). With this constraint, the embarrassing collision e=d is not possible.
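To spell out that last step: e = d would force e*e = 1 (mod lambda(n)), i.e. lambda(n) would have to divide e^2 - 1, and a small e makes e^2 - 1 far too small for that. A throwaway check with the toy numbers from above (a real modulus only makes lambda(n) astronomically larger):

```python
from math import gcd, isqrt

p, q = 61, 53                                  # the same toy primes as above
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lambda(n) = lcm(60, 52) = 780

for e in range(2, isqrt(lam) + 1):             # "small" here means e*e - 1 < lambda(n)
    if gcd(e, lam) == 1:
        assert pow(e, -1, lam) != e            # so no small exponent is its own inverse
print("no exponent up to", isqrt(lam), "is its own inverse mod", lam)
```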

Yeah, I noticed that, too. I daresay the idea that the RSA private exponent must be an inverse of the public exponent modulo phi(p*q) is the most enduring myth in all cryptography. Of course the truth is that the private exponent must be an inverse of the public exponent modulo lambda(p*q).

Perhaps better key selection would be MAX: 55, PUB: 7, PRIV: 3.

This again shows how easy it is to weaken the system by mistake in implementation (unless some three letter agency was selecting the particular coefficients ;-).
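For anyone who wants to check that suggestion: 7 * 3 = 21, which is 1 mod lambda(55) = lcm(4, 10) = 20 (though not 1 mod phi(55) = 40), so every residue round-trips:

```python
n, e, d = 55, 7, 3
assert all(pow(pow(m, e, n), d, n) == m for m in range(n))
print("e =", e, "and d =", d, "round-trip every message mod", n)
```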


Vertigo! Vertigo!

I've also never gotten over the fact that they chose 521-bit prime curves instead of 512-bit prime curves (where every other prime curve is a multiple of 32 or, in one case, 16). The only reason I can think of for that is that weak 512-bit curves are significantly less likely than for other prime curves.

I'd rather expect it the other way round (but that's based on nothing but intuition; I think round numbers are more likely to lead to vulnerabilities). Remember that weakening crypto isn't the NSA's only interest; they want codes in use that

a) they can break if needed

b) nobody else can break

So they may well have fixed an obvious (to them) vulnerability they expected to become public knowledge soon, like they did with DES. That doesn't mean they didn't introduce another, less obvious, vulnerability...

That said, I am considering using ECDSA (or the EC-based digital signature system Nick spoke of on Friday) for my blogsig project. This is due to the need to fit both a signature and metadata into an 80-character signature (a single standard-length line). Given I have stated that non-repudiation and absolute certainty are not part of the brief, I think it is a reasonable enough choice. A blogsig is designed only to certify that there is a *high* (not absolute or legally provable) probability the signed post was composed by the keyholder and has not been modified (except for reformatting, a concession we must make with blogs).
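For what it's worth, a back-of-the-envelope sketch of that size budget, under my own assumptions (a raw r||s ECDSA signature and plain Base64; nothing here reflects the actual blogsig encoding or metadata format):

```python
from math import ceil

for curve_bits in (160, 192, 224, 256):
    sig_bytes = 2 * ceil(curve_bits / 8)   # ECDSA signature = r and s, each the size of the curve order
    b64_chars = 4 * ceil(sig_bytes / 3)    # Base64 expansion of the raw signature
    print(f"{curve_bits}-bit curve: {sig_bytes}-byte signature, {b64_chars} Base64 chars, "
          f"{80 - b64_chars} of 80 left for metadata")
```

On these assumptions, a 256-bit curve already overflows the line on its own, while the smaller curves leave a modest amount of room for the metadata.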

I was citing a passage from the Ars Technica article. The reason it's typically thought to be the same is probably that the best known way to break RSA in practice is to factor the modulus.

It also says "RC4 Yes NOT DESIRABLE" and "Forward Secrecy No NOT DESIRABLE"

That said, my bank did recently roll out a new online banking upgrade with better password requirements (not only do they actually treat upper and lower case letters differently, but they actually require upper case, lower case AND numbers in the password).

Is there a list of financial institutions using ECC and/or PFS? My bank and CC provider are not.

Like whoever Perseid was replying to, I thought I knew that breaking RSA and factoring integers were equivalent problems, but I guess that is not accurate and in fact it's at best an open question. Presumably there are some results newer than '98, but I didn't google one up that quick.

That is just plain wrong? We still don't have a proof in a reasonable setting that breaking RSA is as difficult as factoring, or have I missed something? And the Diffie-Hellman hypothesis has its name for a reason, instead of just being the discrete logarithm problem. Even stuff like the probabilistic signature scheme came years after the original RSA.

BTW: The DarkMail Kickstarter already has $54k pledged of $197k after only a couple days. :)

Because they have resources.

Have you seen that Phil Zimmermann and Lavabit are working together on a new DarkMail protocol:

@Douglas Knight:

*[Bruce] worries even about NSA breaking ECC even when it had no role in design, like curve25519.*

Does this mean that NSA has made sufficient advances to use math to break the ciphers?

Does it also mean that there is thus **less reason to trust the math?**

Yes, I know that. I'm thinking there may be cases where you would still want to use ECC.

But you mention ECDHE-RSA-AES128-SHA256. That **is** ECC; that's why its first two letters are EC.
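For anyone puzzled by the alphabet soup, the OpenSSL-style suite name decomposes like this (plain string annotation, nothing cryptographic happening):

```python
suite = "ECDHE-RSA-AES128-SHA256"
kx, auth, cipher, mac = suite.split("-")
print(f"key exchange:   {kx}  (Elliptic Curve Diffie-Hellman, Ephemeral)")
print(f"authentication: {auth}")
print(f"bulk cipher:    {cipher}")
print(f"HMAC hash:      {mac}")
```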

ECC may have theoretical problems involving NSA/NIST curves. There are some "safer" curves out there. Until then let's crank it up to 11. I'm still leaning toward RSA until "safer" curves are widely adopted.

One point that has been in the news recently is the Dual Elliptic Curve Deterministic Random Bit Generator (Dual_EC_DRBG). This is a random number generator standardized by the National Institute of Standards and Technology (NIST) and promoted by the NSA. Dual_EC_DRBG generates random-looking numbers using the mathematics of elliptic curves. The algorithm itself involves taking points on a curve and repeatedly performing an elliptic curve "dot" operation. After publication, it was reported that it could have been designed with a backdoor, meaning that the sequence of numbers returned could be fully predicted by someone with the right secret number.
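To make the suspected trapdoor concrete, here is a minimal sketch on a deliberately tiny, made-up curve (y^2 = x^3 + x + 1 over GF(23), back-door value d = 5; nothing like the real NIST P-256 constants, and the real generator's output truncation is omitted). The algebra is the whole story: if whoever published the two points P and Q also knows a d with P = d*Q, then each output reveals the generator's next internal state, and with it every future output.

```python
p, a, b = 23, 1, 1                     # toy curve y^2 = x^3 + x + 1 over GF(23)

def add(P1, P2):
    """Elliptic-curve point addition; None stands for the point at infinity."""
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, pt):
    """Scalar multiplication k*pt by double-and-add."""
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

def lift_x(x):
    """Return some curve point with the given x-coordinate."""
    rhs = (x ** 3 + a * x + b) % p
    return next((x, y) for y in range(p) if y * y % p == rhs)

# Public constants: the designer secretly picked d and published P = d*Q.
Q = (0, 1)
d = 5                                  # the back-door value, known only to the designer
P = mul(d, Q)

def dual_ec_step(state):
    """One round of the generator: s <- x(s*P), then output r = x(s*Q)."""
    state = mul(state, P)[0]
    return state, mul(state, Q)[0]

# The victim seeds the generator and emits three "random" numbers.
state, outputs = 3, []
for _ in range(3):
    state, r = dual_ec_step(state)
    outputs.append(r)
print("victim outputs:   ", outputs)

# The attacker sees only the first output, but knows d.
R = lift_x(outputs[0])                 # R = +-(s*Q) for the victim's hidden state s
s_guess = mul(d, R)[0]                 # d*R = +-(s*P), whose x-coordinate is the next state
predicted = []
for _ in range(2):
    predicted.append(mul(s_guess, Q)[0])
    s_guess = mul(s_guess, P)[0]
print("attacker predicts:", predicted)  # matches the victim's second and third outputs
```

In the real construction the outputs are truncated, so recovering the state takes a brute-force search over the dropped bits rather than a single lift, but the principle is the same.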

The NSA has a legion of mathematicians and better-than-Google computing power. While the "backdoor" might not result in a "fully predicted" sequence, it could narrow the possibilities significantly, such that what should be a steel vault is only a cardboard box.

Elliptic curve cryptography is supposed to reduce processor effort and give us higher security. While in the past a change from the NSA resulted in the strengthening of an algorithm (and it took seven years for people to figure out that's what the change was for), with the Snowden leaks the NSA's motives have become debatable.

"Trust us, we're your government," just doesn't reassure many people.

The biggest red flag is that we don't know where the constants used to generate the curves came from. Given the NSA's influence, they could have easily influenced them. The constants are hashed, so I find it unlikely that they were computed, but it is possible that the NSA knows of an attack on some curves that occur with a brute-forceable probability.

I've also never gotten over the fact that they chose 521-bit prime curves instead of 512-bit prime curves (where every other prime curve is a multiple of 32 or, in one case, 16). The only reason I can think of for that is that weak 512-bit curves are significantly less likely than for other prime curves.

Did we have any insight from the Snowden papers as to whether the NSA has identified any vulnerabilities in this?

I'm guessing people with Snowden's (former) access at NSA wouldn't hear about that until there was a tool to decrypt ECC automatically.

We do not -- at least not yet -- but I strongly believe that the NSA has a significant advantage in breaking ECC. This doesn't mean it's bad, but I think we need to 1) make sure we know where our curves come from, and 2) build in a hefty security margin.

...

If you're worried about ensuring the highest level of security while maintaining performance, ECC makes sense to adopt."

Funny, but I feel like, "least understood" + "highest level of security" = contradiction.

I hear there's some interesting work now on lattice-based cryptography; does anyone know any good explanations of that?
