Entries Tagged "Cryptography Engineering"


Is Cryptography Engineering or Science?

Responding to a tweet by Thomas Ptacek saying, “If you’re not learning crypto by coding attacks, you might not actually be learning crypto,” Colin Percival published a well-thought-out rebuttal, saying in part:

If we were still in the 1990s, I would agree with Thomas. 1990s cryptography was full of holes, and the best you could hope for was to know how your tools were broken so you could try to work around their deficiencies. This was a time when DES and RC4 were widely used, despite having well-known flaws. This was a time when people avoided using CTR mode to convert block ciphers into stream ciphers, due to concern that a weak block cipher could break if fed input blocks which shared many (zero) bytes in common. This was a time when people cared about the “error propagation” properties of block ciphers; that is, how much of the output would be mangled if a small number of bits in the ciphertext are flipped. This was a time when people routinely advised compressing data before encrypting it, because that “compacted” the entropy in the message, and thus made it “more difficult for an attacker to identify when he found the right key”. It should come as no surprise that SSL, designed during this era, has had a long list of design flaws.

Cryptography in the 2010s is different. Now we start with basic components which are believed to be highly secure (e.g., block ciphers which are believed to be indistinguishable from random permutations) and which have been mathematically proven to be secure against certain types of attacks (e.g., AES is known to be immune to differential cryptanalysis). From those components, we then build higher-order systems using mechanisms which have been proven to not introduce vulnerabilities. For example, if you generate an ordered sequence of packets by encrypting data using an indistinguishable-from-random-permutation block cipher (e.g., AES) in CTR mode using a packet sequence number as the CTR nonce, and then append a weakly-unforgeable MAC (e.g., HMAC-SHA256) of the encrypted data and the packet sequence number, the packets both preserve privacy and do not permit any undetected tampering (including replays and reordering of packets). Life will become even better once Keccak (aka. SHA-3) becomes more widely reviewed and trusted, as its “sponge” construction can be used to construct—with provable security—a very wide range of important cryptographic components.
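Colin's example construction is concrete enough to sketch in code. Here is a minimal, illustrative Python version, assuming the pyca/cryptography package for AES-CTR and the standard library for HMAC-SHA256; the key handling, packet framing, and function names are my own simplifications, not a definitive implementation.

```python
# A minimal sketch (not production code) of the construction Colin describes:
# AES-CTR keyed per session, the packet sequence number used as the CTR nonce,
# and HMAC-SHA256 over the sequence number plus ciphertext (encrypt-then-MAC).
# Library choice, framing, and names are illustrative assumptions.
import hmac, hashlib, struct
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def protect_packet(enc_key: bytes, mac_key: bytes, seq: int, plaintext: bytes) -> bytes:
    # Build a 16-byte initial counter block from the 64-bit sequence number;
    # CTR mode increments the low-order bits for successive cipher blocks.
    counter_block = struct.pack(">QQ", seq, 0)
    encryptor = Cipher(algorithms.AES(enc_key), modes.CTR(counter_block)).encryptor()
    ciphertext = encryptor.update(plaintext) + encryptor.finalize()
    # The MAC covers the sequence number and the ciphertext, so tampering,
    # replay, and reordering are all detectable by the receiver.
    seq_bytes = struct.pack(">Q", seq)
    tag = hmac.new(mac_key, seq_bytes + ciphertext, hashlib.sha256).digest()
    return seq_bytes + ciphertext + tag

def open_packet(enc_key: bytes, mac_key: bytes, expected_seq: int, packet: bytes) -> bytes:
    seq = struct.unpack(">Q", packet[:8])[0]
    ciphertext, tag = packet[8:-32], packet[-32:]
    expected_tag = hmac.new(mac_key, packet[:-32], hashlib.sha256).digest()
    if seq != expected_seq or not hmac.compare_digest(tag, expected_tag):
        raise ValueError("authentication failed (tampering, replay, or reordering)")
    decryptor = Cipher(algorithms.AES(enc_key), modes.CTR(struct.pack(">QQ", seq, 0))).decryptor()
    return decryptor.update(ciphertext) + decryptor.finalize()
```

Note the encrypt-then-MAC order and the inclusion of the sequence number under the MAC; that is what rules out undetected replays and reordering, not the encryption itself.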

He recommends a more modern approach to cryptography: “studying the theory and designing systems which you can prove are secure.”

I think both of these statements are true—and not contradictory at all. The apparent disagreement stems from differing definitions of cryptography.

Many years ago, on the Cryptographer’s Panel at an RSA conference, then-chief scientist for RSA Bert Kaliski talked about the rise of something he called the “crypto engineer.” His point was that the practice of cryptography was changing. There was the traditional mathematical cryptography—designing and analyzing algorithms and protocols, and building up cryptographic theory—but there was also a more practice-oriented cryptography: taking existing cryptographic building blocks and creating secure systems out of them. It’s this latter group he called crypto engineers. It’s the group of people I wrote Applied Cryptography, and, most recently, co-wrote Cryptography Engineering, for. Colin knows this, directing his advice to “developers”—Kaliski’s crypto engineers.

Traditional cryptography is a science—applied mathematics—and applied cryptography is engineering. I prefer the term “security engineering,” because it necessarily encompasses a lot more than cryptography—see Ross Anderson’s great book of that name. And mistakes in engineering are where a lot of real-world cryptographic systems break.

Provable security has its limitations. Cryptographer Lars Knudsen once said: “If it’s provably secure, it probably isn’t.” Yes, we have provably secure cryptography, but those proofs take very specific forms against very specific attacks. They reduce the number of security assumptions we have to make about a system, but we still have to make a lot of security assumptions.

And cryptography has its limitations in general, despite the apparent strengths. Cryptography’s great strength is that it gives the defender a natural advantage: adding a single bit to a cryptographic key increases the work to encrypt by only a small amount, but doubles the work required to break the encryption. This is how we design algorithms that—in theory—can’t be broken until the universe collapses back on itself.
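The arithmetic behind that asymmetry is easy to demonstrate. In the toy cost model below, the linear “encryption cost” is my own rough assumption, while the brute-force count is simply exhaustive key search:

```python
# A back-of-the-envelope illustration of the defender's advantage: encryption
# cost grows roughly linearly with key length, while exhaustive key search
# doubles with every added key bit. Units are arbitrary.
for bits in (128, 129, 130):
    encrypt_cost = bits          # grows by about one unit per added key bit
    brute_force = 2 ** bits      # doubles with every added key bit
    print(f"{bits}-bit key: encrypt ~{encrypt_cost} units, "
          f"brute force ~{brute_force:.2e} trials")
```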

Despite this, cryptographic systems are broken all the time: well before the heat death of the universe. They’re broken because of software mistakes in coding the algorithms. They’re broken because the computer’s memory management system left a stray copy of the key lying around, and the operating system automatically copied it to disk. They’re broken because of buffer overflows and other security flaws. They’re broken by side-channel attacks. They’re broken because of bad user interfaces, or insecure user practices.

Lots of people have said: “In theory, theory and practice are the same. But in practice, they are not.” It’s true about cryptography. If you want to be a cryptographer, study mathematics. Study the mathematics of cryptography, and especially cryptanalysis. There’s a lot of art to the science, and you won’t be able to design good algorithms and protocols until you gain experience in breaking existing ones. If you want to be a security engineer, study implementations and coding. Take the tools cryptographers create, and learn how to use them well.

The world needs security engineers even more than it needs cryptographers. We’re great at mathematically secure cryptography, and terrible at using those tools to engineer secure systems.

After writing this, I found a conversation between the two where they both basically agreed with me.

Posted on July 5, 2013 at 7:04 AM

New Book: Cryptography Engineering

I have a new book, sort of. Cryptography Engineering is really the second edition of Practical Cryptography. Niels Ferguson and I wrote Practical Cryptography in 2003. Tadayoshi Kohno did most of the update work—and added exercises to make it more suitable as a textbook—and is the third author on Cryptography Engineering. (I didn’t like it that Wiley changed the title; I think it’s too close to Ross Anderson’s excellent Security Engineering.)

Cryptography Engineering is a techie book; it’s for practitioners who are implementing cryptography or for people who want to learn more about the nitty-gritty of how cryptography works and what the implementation pitfalls are. If you’ve already bought Practical Cryptography, there’s no need to upgrade unless you’re actually using it.

EDITED TO ADD (3/23): Signed copies are available. See the bottom of this page for details.

EDITED TO ADD (3/29): In comments, someone asked what’s new in this book.

We revised the introductory materials in Chapter 1 to help readers better understand the broader context for computer security, with some explicit exercises to help readers develop a security mindset. We updated the discussion of AES in Chapter 3; rather than speculating on algebraic attacks, we now talk about the recent successful (theoretical, not practical) attacks against AES. Chapter 4 used to recommend using nonce-based encryption schemes. We now find these schemes problematic, and instead recommend randomized encryption schemes, like CBC mode. We updated the discussion of hash functions in Chapter 5; we discuss new results against MD5 and SHA1, and allude to the new SHA3 candidates (but say it’s too early to start using the SHA3 candidates). In Chapter 6, we no longer talk about UMAC, and instead talk about CMAC and GMAC. We revised Chapters 8 and 15 to talk about some recent implementation issues to be aware of. For example, we now talk about the cold boot attacks and challenges for generating randomness in VMs. In Chapter 19, we discuss online certificate verification.
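To make the Chapter 4 change concrete, here is a minimal sketch of what randomized encryption with CBC mode looks like: a fresh random IV is generated for every message and sent along with the ciphertext. The pyca/cryptography library and the function name are my own illustrative choices, not the book's code, and a real system would also authenticate the ciphertext (as in the encrypt-then-MAC sketch earlier on this page).

```python
# A minimal sketch of randomized CBC encryption: new random IV per message,
# PKCS7 padding, IV prepended to the ciphertext. Illustrative only.
import os
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def cbc_encrypt(key: bytes, plaintext: bytes) -> bytes:
    iv = os.urandom(16)                       # fresh random IV every call
    padder = padding.PKCS7(128).padder()
    padded = padder.update(plaintext) + padder.finalize()
    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return iv + encryptor.update(padded) + encryptor.finalize()
```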

Posted on March 23, 2010 at 2:42 PM
