The Doghouse: Crypteto
The most important issue of any encryption product is the ‘bit key strength’. To date the strongest known algorithm has a 448-bit key. Crypteto now offers a 49,152-bit key. This means that for every extra 1 bit increase that Crypteto has over its competition makes it 100% stronger. The security and privacy this offers...
Yes, every key bit doubles an algorithm’s strength against brute-force attacks. But it’s hard to find any real meaning in a work factor of 2^49152.
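To see why the work factor is meaningless past a point, here is a back-of-the-envelope sketch (not from the post; the machine testing 10^18 keys per second is a hypothetical, far beyond anything real):

```python
# Expected time to brute-force a key, assuming a hypothetical machine
# that tests 10**18 keys per second. Every added key bit doubles the result.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_brute_force(key_bits, keys_per_second=1e18):
    """Expected years to search half of a 2**key_bits key space."""
    return 2 ** (key_bits - 1) / keys_per_second / SECONDS_PER_YEAR

for bits in (56, 128, 256, 448):
    print(f"{bits}-bit key: ~{years_to_brute_force(bits):.3g} years")
```

Already at 128 bits the answer dwarfs the age of the universe; the extra 49,024 bits buy nothing.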
Coupled with this truly remarkable breakthrough Crypteto does not compromise on encryption speed. In the past, incremental key strength improvements have effected the speed that data is encrypted. The usual situation was that for every 1 bit increase in key strength there was a consequent reduction in encryption speed by 50%.
That’s not even remotely true. It’s not at all obvious how key length is related to encryption speed. Blowfish has the same speed, regardless of key length. AES-192 is about 20% slower than AES-128, and AES-256 is about 40% slower. Threefish, the block cipher inside Skein, encrypts data at 7.6 clock cycles/byte with a 256-bit key, 6.1 clock cycles/byte with a 512-bit key, and 6.5 clock cycles/byte with a 1024-bit key. I’m not claiming that Threefish is secure and ready for commercial use — at any keylength — but there simply isn’t a chance that encryption speed will drop by half for every key bit added.
This is a fundamental asymmetry of cryptography, and it’s important to get right. The cost to encrypt is linear as a function of key length, while cost to break is geometric. It’s one of the reasons why, of all the links in a security chain, cryptography is the strongest.
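That asymmetry can be made concrete with a toy model (the linear per-bit encryption cost, and the constant of 10 cycles per key bit, are illustrative assumptions, not measurements of any real cipher):

```python
# Toy model of the asymmetry: work to encrypt grows at most linearly in
# key bits; work to brute-force grows geometrically.
def encrypt_cost(key_bits, cycles_per_bit=10):
    # Linear: double the key length, double the cost (illustrative constant).
    return key_bits * cycles_per_bit

def break_cost(key_bits):
    # Geometric: one more bit doubles the cost; double the key, square the cost.
    return 2 ** key_bits

for bits in (128, 256):
    print(bits, encrypt_cost(bits), break_cost(bits))
```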
Normally I wouldn’t bother with this kind of thing, but they explicitly asked me to comment:
But Hawthorne Davies has overcome this issue. By offering an algorithm with an unequalled key strength of 49,152 bits, we are able to encrypt and decrypt data at speeds in excess of 8 megabytes per second. This means that the aforementioned Gigabyte of data would take 2 minutes 13 seconds. If Bruce Schneier, the United State’s foremost cryptologist, were to increase his Blowfish 448 bit encryption algorithm to Blowfish 49152, he would be hard pressed to encrypt one Gigabyte in 4 hours.
We look forward to receiving advice and encouragement from the good Dr. Schneier.
I’m not a doctor of anything, but sure. Read my 1999 essay on snake-oil cryptography:
Warning Sign #5: Ridiculous key lengths.
Jaws Technology boasts: “Thanks to the JAWS L5 algorithm’s statistically unbreakable 4096 bit key, the safety of your most valued data files is ensured.” Meganet takes the ridiculous a step further: “1 million bit symmetric keys — The market offer’s [sic] 40-160 bit only!!”
Longer key lengths are better, but only up to a point. AES will have 128-bit, 192-bit, and 256-bit key lengths. This is far longer than needed for the foreseeable future. In fact, we cannot even imagine a world where 256-bit brute force searches are possible. It requires some fundamental breakthroughs in physics and our understanding of the universe. For public-key cryptography, 2048-bit keys have the same sort of property; longer is meaningless.
Think of this as a sub-example of Warning Sign #4: if the company doesn’t understand keys, do you really want them to design your security product?
Or read what I wrote about symmetric key lengths in 1996, in Applied Cryptography (pp. 157–8):
One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)
Given that k = 1.38×10^-16 erg/°Kelvin, and that the ambient temperature of the universe is 3.2°Kelvin, an ideal computer running at 3.2°K would consume 4.4×10^-16 ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.
Now, the annual energy output of our sun is about 1.21×10^41 ergs. This is enough to power about 2.7×10^56 single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2^192. Of course, it wouldn’t have the energy left over to perform any useful calculations with this counter.
But that’s just one star, and a measly one at that. A typical supernova releases something like 10^51 ergs. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.
These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.
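The arithmetic in the excerpt above is easy to check directly (all constants are copied from the text):

```python
import math

# Check of the thermodynamic estimate quoted above.
k = 1.38e-16            # Boltzmann constant, erg per kelvin
T = 3.2                 # cosmic background temperature, kelvin
energy_per_bit = k * T  # minimum energy per bit flip: ~4.4e-16 erg

sun_year = 1.21e41      # annual energy output of the sun, ergs
flips = sun_year / energy_per_bit
print(f"{flips:.2g} bit flips per year")             # ~2.7e56
print(f"{math.log2(flips):.0f}-bit counter")         # ~187 bits
print(f"{math.log2(32 * flips):.0f} bits in 32 yr")  # ~192 bits

supernova = 1e51        # ergs
# log2 comes out near 220, so the text's 219-bit counter fits comfortably.
print(f"{math.log2(supernova / energy_per_bit):.0f} bits")
```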
Ten years later, there is still no reason to use anything more than a 256-bit symmetric key. I gave the same advice in 2003 in Practical Cryptography (pp. 65-6). Even a mythical quantum computer won’t be able to brute-force that large a keyspace. (Public keys are different, of course — see Table 2.2 of this NIST document for recommendations).
Of course, in the real world there are smarter ways to attack than brute-force keysearch. And the whole point of cipher cryptanalysis is to find shortcuts to brute-force search (like this attack on AES), but a 49,152-bit key is just plain stupid.
EDITED TO ADD (9/30): Now this is funny:
Some months ago I sent individual emails to each of seventeen experts in cryptology, all with the title of Doctor or Professor. My email was a first announcement to the academic world of the TOUAREG Encryption Algorithm, which, somewhat unusually, has a session key strength of over 49,000 bits and yet runs at 3 Megabytes per second. Bearing in mind that the strongest version of BLOWFISH has a session key of 448 bits and that every additional bit doubles the task of key-crashing, I imagined that my announcement would create more than a mild flutter of interest.
Much to his surprise, no one responded.
Here’s some more advice: my 1998 essay, “Memo to the Amateur Cipher Designer.” Anyone can design a cipher that he himself cannot break. It’s not even hard. So when you tell a cryptographer that you’ve designed a cipher that you can’t break, his first question will be “who the hell are you?” In other words, why should the fact that you can’t break a cipher be considered evidence of the cipher’s security?
If you want to design algorithms, start by breaking the ones out there. Practice by breaking algorithms that have already been broken (without peeking at the answers). Break something no one else has broken. Break another. Get your breaks published. When you have established yourself as someone who can break algorithms, then you can start designing new algorithms. Before then, no one will take you seriously.
EDITED TO ADD (9/30): I just did the math. An encryption speed of 8 megabytes per second on a 3.33 GHz CPU translates to about 400 clock cycles per byte. This is much, much slower than any of the AES finalists ten years ago, or any of the SHA-3 second round candidates today. It’s kind of embarrassingly slow, really.
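The conversion is just clock frequency divided by throughput:

```python
# Clock cycles per byte = clock frequency (Hz) / throughput (bytes/second).
def cycles_per_byte(clock_hz, bytes_per_sec):
    return clock_hz / bytes_per_sec

# 3.33 GHz CPU at 8 MB/s, per the figures above: roughly 400 cycles/byte.
print(round(cycles_per_byte(3.33e9, 8e6)))
```

Compare that to the single-digit cycles/byte of the Threefish figures quoted earlier.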