Is Cryptography Engineering or Science?

Responding to a tweet by Thomas Ptacek saying, "If you're not learning crypto by coding attacks, you might not actually be learning crypto," Colin Percival published a well-thought-out rebuttal, saying in part:

If we were still in the 1990s, I would agree with Thomas. 1990s cryptography was full of holes, and the best you could hope for was to know how your tools were broken so you could try to work around their deficiencies. This was a time when DES and RC4 were widely used, despite having well-known flaws. This was a time when people avoided using CTR mode to convert block ciphers into stream ciphers, due to concern that a weak block cipher could break if fed input blocks which shared many (zero) bytes in common. This was a time when people cared about the "error propagation" properties of block ciphers -- that is, how much of the output would be mangled if a small number of bits in the ciphertext are flipped. This was a time when people routinely advised compressing data before encrypting it, because that "compacted" the entropy in the message, and thus made it "more difficult for an attacker to identify when he found the right key". It should come as no surprise that SSL, designed during this era, has had a long list of design flaws.

Cryptography in the 2010s is different. Now we start with basic components which are believed to be highly secure -- e.g., block ciphers which are believed to be indistinguishable from random permutations -- and which have been mathematically proven to be secure against certain types of attacks -- e.g., AES is known to be immune to differential cryptanalysis. From those components, we then build higher-order systems using mechanisms which have been proven to not introduce vulnerabilities. For example, if you generate an ordered sequence of packets by encrypting data using an indistinguishable-from-random-permutation block cipher (e.g., AES) in CTR mode using a packet sequence number as the CTR nonce, and then append a weakly-unforgeable MAC (e.g., HMAC-SHA256) of the encrypted data and the packet sequence number, the packets both preserve privacy and do not permit any undetected tampering (including replays and reordering of packets). Life will become even better once Keccak (a.k.a. SHA-3) becomes more widely reviewed and trusted, as its "sponge" construction can be used to construct -- with provable security -- a very wide range of important cryptographic components.
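The CTR-plus-MAC construction Colin describes can be sketched in a few lines. This is an illustrative sketch only, not a vetted implementation: since Python's standard library has no AES, HMAC-SHA256 stands in for the block cipher as the PRF driving CTR mode. The structure -- a per-packet counter nonce, and a MAC covering both the ciphertext and the sequence number -- is the point.

```python
import hmac
import hashlib
import struct

BLOCK = 32  # keystream chunk size: one HMAC-SHA256 output

def keystream_block(key, seq, counter):
    # Stand-in PRF for the block cipher; a real design would use AES here.
    return hmac.new(key, struct.pack(">QQ", seq, counter), hashlib.sha256).digest()

def ctr_xor(key, seq, data):
    # CTR mode: XOR data against successive PRF outputs.
    # Encryption and decryption are the same operation.
    out = bytearray()
    for i in range(0, len(data), BLOCK):
        ks = keystream_block(key, seq, i // BLOCK)
        out.extend(b ^ k for b, k in zip(data[i:i + BLOCK], ks))
    return bytes(out)

def seal(enc_key, mac_key, seq, plaintext):
    ct = ctr_xor(enc_key, seq, plaintext)
    # The MAC covers the sequence number as well as the ciphertext, so
    # replayed or reordered packets fail verification.
    tag = hmac.new(mac_key, struct.pack(">Q", seq) + ct, hashlib.sha256).digest()
    return ct, tag

def open_packet(enc_key, mac_key, seq, ct, tag):
    expected = hmac.new(mac_key, struct.pack(">Q", seq) + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("MAC check failed: tampering, replay, or reordering")
    return ctr_xor(enc_key, seq, ct)
```

Verifying the MAC before decrypting, and binding the sequence number into both the nonce and the MAC, is what rules out undetected reordering and replay.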

He recommends a more modern approach to cryptography: "studying the theory and designing systems which you can prove are secure."

I think both statements are true -- and not contradictory at all. The apparent disagreement stems from differing definitions of cryptography.

Many years ago, on the Cryptographers' Panel at an RSA conference, then-chief scientist for RSA Burt Kaliski talked about the rise of something he called the "crypto engineer." His point was that the practice of cryptography was changing. There was the traditional mathematical cryptography -- designing and analyzing algorithms and protocols, and building up cryptographic theory -- but there was also a more practice-oriented cryptography: taking existing cryptographic building blocks and creating secure systems out of them. It's this latter group he called crypto engineers. It's for this group that I wrote Applied Cryptography and, most recently, co-wrote Cryptography Engineering. Colin knows this, directing his advice to "developers" -- Kaliski's crypto engineers.

Traditional cryptography is a science -- applied mathematics -- and applied cryptography is engineering. I prefer the term "security engineering," because it necessarily encompasses a lot more than cryptography -- see Ross Anderson's great book of that name. And mistakes in engineering are where a lot of real-world cryptographic systems break.

Provable security has its limitations. Cryptographer Lars Knudsen once said: "If it's provably secure, it probably isn't." Yes, we have provably secure cryptography, but those proofs take very specific forms against very specific attacks. They reduce the number of security assumptions we have to make about a system, but we still have to make a lot of security assumptions.

And cryptography has its limitations in general, despite the apparent strengths. Cryptography's great strength is that it gives the defender a natural advantage: adding a single bit to a cryptographic key increases the work to encrypt by only a small amount, but doubles the work required to break the encryption. This is how we design algorithms that -- in theory -- can't be broken until the universe collapses back on itself.
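That asymmetry is easy to quantify (a back-of-the-envelope sketch; the 10^18 keys-per-second rate is an arbitrary assumption for illustration):

```python
# The defender's cost grows roughly linearly with key length, but each
# added key bit doubles the attacker's brute-force work.
def brute_force_trials(key_bits):
    return 2 ** key_bits

# One extra bit exactly doubles the attacker's work.
ratio = brute_force_trials(129) // brute_force_trials(128)  # == 2

# Even at a fanciful 10^18 keys per second, exhausting a 128-bit
# keyspace takes on the order of 10^13 years.
seconds = brute_force_trials(128) / 1e18
years = seconds / (3600 * 24 * 365)
```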

Despite this, cryptographic systems are broken all the time: well before the heat death of the universe. They're broken because of software mistakes in coding the algorithms. They're broken because the computer’s memory management system left a stray copy of the key lying around, and the operating system automatically copied it to disk. They're broken because of buffer overflows and other security flaws. They're broken by side-channel attacks. They're broken because of bad user interfaces, or insecure user practices.

Lots of people have said: "In theory, theory and practice are the same. But in practice, they are not." It’s true about cryptography. If you want to be a cryptographer, study mathematics. Study the mathematics of cryptography, and especially cryptanalysis. There's a lot of art to the science, and you won't be able to design good algorithms and protocols until you gain experience in breaking existing ones. If you want to be a security engineer, study implementations and coding. Take the tools cryptographers create, and learn how to use them well.

The world needs security engineers even more than it needs cryptographers. We're great at mathematically secure cryptography, and terrible at using those tools to engineer secure systems.

After writing this, I found a conversation between the two where they both basically agreed with me.

Posted on July 5, 2013 at 7:04 AM • 25 Comments

Comments

Stefan Lucks • July 5, 2013 7:49 AM

While you give two "definitions" for cryptography (crypto science and crypto engineering, as you put it), neither is even remotely related to "If you're not learning crypto by coding attacks, you might not actually be learning crypto."

Secondly, I don't believe in the dichotomy between crypto science and crypto engineering. (I would prefer other names, but so what.) Science and engineering are the two poles in the world of crypto, but any good cryptographer must understand both and almost all real cryptographic work is somewhere in between. Just as most of earth is neither at the north pole nor at the south pole, but in between.

(I believe that the main reason why there is so much lousy crypto out there is that too many wanna-be crypto designers understand either the science part and not the engineering part, or vice versa. Which is what makes me so uneasy about your essay -- your message seems to be "choose either the science part or the engineering part, either is OK", while IMHO the message should be "if you like math and want to do crypto, you must also understand the engineering, and if you like engineering and want to do crypto, you must also do math -- otherwise, don't ever do crypto".)

As an example, there are a lot of crypto implementations where nonces can be used more than once, under certain circumstances. If you don't understand the theory -- why it is so important never to allow nonce reuse (for certain crypto systems) -- you are likely to make that mistake.

As a more concrete example, consider the BEAST attack against TLS from 2011. It is a practical demonstration, with some rather diligent exploits and some tough coding work. So it would seem to be found on the engineering side. However, the attack is based on a theoretical observation by Phil Rogaway from about 10 years earlier. Apparently, the authors of the TLS record protocol (or SSL at that time) were crypto engineers without a proper understanding of the theory they should have understood -- otherwise, they would never have thought of using the CBC "chained IV" mode. (The IV for the first message is random; all other messages use the final block of the previous message as their IV.)

The BEAST authors' main innovation was to make TLS actually encrypt some chosen plaintexts. This appears to be great crypto engineering (or attack engineering). But without understanding the theory (the crypto science), they would not have been able to attack anything.
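The predictable-IV problem Stefan describes can be demonstrated in miniature. The sketch below is hypothetical code: a throwaway Feistel permutation stands in for a real block cipher, and the "chained IV" rule (next IV = last ciphertext block) is modeled directly. An attacker who knows the chained IV and can submit chosen plaintext can confirm a guess at a previously encrypted block -- the observation at the heart of BEAST.

```python
import hmac
import hashlib

def _round(key, half, rnd):
    return hmac.new(key, bytes([rnd]) + half, hashlib.sha256).digest()[:8]

def encrypt_block(key, block):
    # Throwaway 16-byte Feistel permutation -- a stand-in for a real cipher.
    l, r = block[:8], block[8:]
    for rnd in range(4):
        l, r = r, bytes(a ^ b for a, b in zip(l, _round(key, r, rnd)))
    return l + r

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(key, iv, blocks):
    out, prev = [], iv
    for p in blocks:
        prev = encrypt_block(key, xor(p, prev))
        out.append(prev)
    return out

key = b"victim key material, illustration"
iv = bytes(16)
secret = b"password12345678"                # one 16-byte block
(c1,) = cbc_encrypt(key, iv, [secret])
next_iv = c1                                # "chained IV": visible to the attacker

# The attacker tests a guess at the secret block by submitting
# guess XOR old_iv XOR next_iv as chosen plaintext. The cipher's input is
# then guess XOR old_iv -- identical to the victim's input iff the guess is right.
guess = b"password12345678"
probe = xor(xor(guess, iv), next_iv)
(c2,) = cbc_encrypt(key, next_iv, [probe])
assert c2 == c1                             # ciphertexts match: guess confirmed
```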

For an example on the design side, ... well there is Skein. Or any other well-designed hash function -- including Keccak. You cannot design a reasonable hash function from scratch without both a really good understanding of the underlying theory (crypto science) and of the practical issues, like how efficiently it can be put into software or hardware.

Bruce Schneier • July 5, 2013 7:52 AM

@Stefan

Interesting, and convincing. You're right that a lot of actual cryptography blends theory and practice.

Hanno • July 5, 2013 8:14 AM

"This was a time when DES and RC4 were widely used, despite having well-known flaws"

Now we're decades later and RC4 is still widely used :-)

QnJ1Y2U • July 5, 2013 8:14 AM

As you can imagine, non-experts are even more confused about cryptography terminology. This article about Snowden and the NSA is eye-rolling for a variety of reasons, but the statements about "encryption" are over the top:

http://www.washingtonpost.com/opinions/...

And the twitter conversation that followed wasn't an improvement. Some snippets:

https://twitter.com/ggreenwald/status/351728841116499972
https://twitter.com/marcthiessen/status/351737348544933890
https://twitter.com/20committee/status/351785839778480129
https://twitter.com/jaysonblair7/status/351805362116116482


princeton • July 5, 2013 8:42 AM

"Don't reuse a nonce" is just a sound byte. An initialization vector is typically just appended to the cipher text but what is its source? Many times the source is a PRNG but you don't mess with the output of a random number source, at all. There is no such thing as a random number, only sources of random numbers (von Neumann). If the source is truly random, then it is possible for the same number to be output twice in a row. As soon as you say "oh, here is that same number again, throw it away" then it is no longer a random number source. I can generate pages of random characters and somewhere in there find a seemingly intelligent string. ALL it can mean is, don't store a nonce. Don't store a nonce. Don't store a nonce. But that's not what people say.

Indeed • July 5, 2013 8:48 AM

Crypto.cat is a good example of why proper coding is essential, as it's been broken so many times now I can't count. http://tobtu.com/decryptocat.php

1Password can be cracked at only 3 MH/s, LastPass in a reasonable time using a 7970 GPU at 750 MH/s.

princeton • July 5, 2013 8:52 AM

The context of crypto discussion is often foggy. Often they're talking about encryption and not cryptography. Are you protecting information at rest or in communication? One of my old textbooks, outstanding material, is Koblitz's "A Course in Number Theory and Cryptography" but it covers only math pertaining to shared key systems and hardly touches block cipher design. Other material treats all of cryptography as if it consisted of nothing more than various arrangements of P-boxes and S-boxes.

Princeton • July 5, 2013 9:09 AM

Imagine a dark room into which a bunch of people had been herded. Behind them the door is shut. Inside the temperature will slowly rise until they are all consumed in the heat. Every time they begin to scream for someone to do something... "Open the door!" another yells back "No! Don't go out there!" and another "Just wait here." That is the current state of information security.

Princeton • July 5, 2013 9:19 AM

Chapter 2.
An hour had passed and the temperature in the room was now searing. Several collapsed. A few folded paper they had into fans they could sell to others. Finally someone reached for the door. The doorknob burnt his hand. The others screamed "You were told not to open the door! That's what you get!" Then they beat him and threw his body into the corner.

Dr. I. Needtob Athe • July 5, 2013 9:36 AM

"Traditional cryptography is a science..."

I wonder. I'm having difficulty finding a definition of science that would include cryptography. Most of the definitions I've seen refer to observations of nature, or study of the Universe, but cryptography seems to be independent of the real world. How does it fit in? It's certainly a branch of mathematics, and mathematics is said to be the language of science, but is it technically a science itself?

Jeff Warnica • July 5, 2013 10:05 AM

There also is, even beyond crypto engineering, another layer; for lack of a better word, let's call it operations.

As a systems admin, every time I need to configure something with crypto, I get the feeling that everyone else has entirely washed their hands of producing something which is actually usable.

Joel Spolsky talks about leaky abstractions as an inevitable side effect of simplification. It would be great if configuring Apache or using openssl(1) only leaked some forgettable details. I'm not sure if it's because it's academics who work on that and don't care about real-world usability, or because those who work on crypto have lawyers hovering over them... Every layer adds sharp edges to bang my head into and broken glass to walk over.

For example, I don't think the apache mod_ssl guys know anything about crypto; they just exposed exactly every option openssl has and managed to throw together a template configuration which basically "works". How is a mere mortal like me to know how far outside that it is safe to go? What combinations and interactions of settings are reasonable? I don't have time to decode the full depth of the openssl docs, which of course would also require decoding the full depth of the academic papers backing the actual low-level algorithms.

Is anyone doing configuration checks to prevent me from a 21st century equivalent of double-ROT-13?

I know guys who spend hours or days tweaking their settings; some of them I wouldn't trust to go get groceries, and some I respect very much. Scared of double-ROT-13ing myself, I generally go with the defaults and worry whether they are actually reasonable today, and not just something that someone made work in 1999 and has since forgotten about.

name.withheld.for.obvious.reasons • July 5, 2013 10:45 AM

@bruce

This makes me uncomfortable, as the modality of the application space has changed:

The NSA scooping up files (specifically encrypted ones) allows a number of cryptographic attacks not possible until recently.

This is my concern with salts stored in files such as Microsoft Word documents. With the revelations that these documents are being cached by you-know-who, I imagine one of the first targets is the salts of particular files. Decrypting data used to mean looking through one haystack in order to look through another haystack. A source producing encrypted data generally does so in short, indeterminate periods, so functional cryptanalysis is relatively weak at breaking encrypted-at-rest files protected by strong, private, controlled keys. But if you have all the files from all the sources, things start to change.

Now bits of entropy will be lost...

wumpus • July 5, 2013 11:17 AM

"This was a time when people avoided using CTR mode to convert block ciphers into stream ciphers, due to concern that a weak block cipher could break if fed input blocks which shared many (zero) bytes in common."

If you have "lots of zeros" while using CTR you have a bigger problem than the possibility of breaking those zeros. You can easily forget that you absolutely can not reuse your key under this scheme. Reuse any key and you have all the issues of a reused one time pad.

Oddly enough, I've mainly seen CTR+CBC used lately (simply starting the counter with an encrypted seed would also solve the key reuse problem. CTR+CBC will likely include a salt anyway so no transmission efficiency issues). This breaks one of the big reasons for using CTR (you can compute arbitrary amounts of blocks in parallel), but gets rid of the issues with "many zero bytes". Actually, I doubt anybody cares that the bytes are zero, just that a counter implies an average of two bits changing between sequences. First, this is somewhat weak (it might even match some chosen plaintexts), and secondly there is the danger that not enough will change between rounds. If the second (and few more) rounds haven't completely changed, you might as well be using a reduced round encryption scheme. It might still be secure (the thing had better be designed for this case), but you are essentially using a reduced round cryptosystem compared to CTR+CBC and a single XOR will give you the full round system.

Mort • July 5, 2013 1:36 PM

@princeton
"Don't reuse a nonce" isn't "just a sound byte", it's practically the definition of a nonce.
If you generate nonces randomly it is "possible" for the same number to be output twice in a row, but for a large sized nonces the chance is completely negligible. For smaller nonces random generation would be disastrous, but a simple counter can work fine.

I guess the bottom line is, as always, most developers should use a library that will handle this automatically.
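Mort's point about nonce size can be made concrete with the usual birthday approximation, p ≈ 1 - exp(-n^2 / 2^(b+1)) for n messages and b-bit nonces (a quick sketch):

```python
import math

def collision_probability(nonce_bits, n_messages):
    # Birthday bound: p ≈ 1 - exp(-n^2 / 2^(b+1)).
    return -math.expm1(-(n_messages ** 2) / 2 ** (nonce_bits + 1))

# A billion messages under 96-bit random nonces: collision odds are negligible.
p_large = collision_probability(96, 10 ** 9)

# 100,000 messages under 32-bit random nonces: a collision is more likely than not.
p_small = collision_probability(32, 10 ** 5)
```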

Nick P • July 5, 2013 2:40 PM

@ Bruce Schneier

I ran into this nice little project reading the discussion you linked to. It seems like quite a step up from popular cryptographic libraries in usability, security, and performance. You rarely get all that at once.

NaCl crypto (Bernstein et al)
http://nacl.cr.yp.to/features.html

Nick P • July 5, 2013 2:46 PM

I forgot to mention that it seems to solve some of the difficulties other commenters, here and at ycombinator, mentioned. The major actions are just one function each. Error handling, side-channel prevention, primitive selection, etc. are all built in. Just call the function and you're done.

Rick Smith • July 5, 2013 4:47 PM

The actual border lands might not be surveyed yet, but I agree there's a distinct field of cryptographic engineering, just as computer architecture can be independent of circuit design.

In both cases we try to establish design rules so that engineers can build things with predictable properties. In both cases we can push the envelope of those rules and yield disaster.

A lot of people are misled by the theoretical mathematical strength of crypto mechanisms, and they let the perfect become the enemy of the good.

MingoV • July 5, 2013 5:50 PM

"Traditional cryptography is a science -- applied mathematics..."

Applied mathematics is not a science. I taught statistics, and I never considered it to be a science. It's a set of mathematical tools useful to scientists and others.

Science needs the scientific method (my version): observe a not-yet-explained phenomenon, hypothesize something non-trivial related to the phenomenon, make observations under controlled conditions (eg: experiments), analyze the results, and assess whether the hypothesis was confirmed.

How does one do that with cryptography? If cryptography is a science, then so is developing the best way to create unique Sudoku puzzles.

Matthew Green • July 6, 2013 3:10 AM

Just as a note to Stefan Lucks above:

The theoretical understanding that led to BEAST dates to about 2001, but TLS 1.0 dates to 1999. The TLS designers (or rather the OpenSSL maintainers) were well aware of the problem with blockwise chosen plaintext attacks /after/ it was pointed out, but they were stuck with a bad spec. (They did hack mitigations into OpenSSL starting in about 2002.) In a perfect world they would have guessed the problems before someone explicitly told them about it, but whatever.

What earns the TLS designers a life sentence in cryptography jail is the spec for TLS 1.1 (2006) where they proposed a raft of dubious fixes, some of which they admit were not carefully analyzed.

As to Colin's points: I've personally reviewed several products that could have used Colin's CTR+HMAC description as a starting point. Some still didn't get it right.

Icicle • July 9, 2013 4:14 PM

Is Cryptography Engineering or Science?
According to Karl Popper, if cryptography is not falsifiable it would not be a science.
Is it so?
All cryptographic systems will/could be broken eventually given enough time and resources, therefore it is falsifiable. (Even OTP is susceptible to rubber-hose cryptanalysis.)

The easy answer would be that cryptography is:
part science, part engineering, and part art.

For example: I own "The Art of Computer Programming", "The Science of Programming", "The Practice of Programming" and "Software Engineering" and I like them all for different reasons.

Wordplay aside, the "art" in cryptography is perhaps synonymous with the phrase coined by Bruce: "security mindset".

So my conclusion is a bit boring: I agree with Bruce.
To learn traditional cryptography one needs to learn how to break cryptographic systems.
To learn security engineering one has to learn how to break computer systems.

Which raises an interesting question: "If I research how to break computer systems by browsing the internet, and my internet traffic is hoovered up by the NSA and stored in their databases - when will NSA decide if I am a white hat or a black hat and purge my data/metadata from their systems? Never ever?"

Let's make it easier for them: I visit this blog and read Bruce's books because I want to learn how to build computer systems that are useful and as secure as possible for my clients.
I'd rather be the best security engineer I can, than a wanna-be crypto designer.
(I wonder how many "bits of identifying information" I have left on this blog so far?)

The dichotomy of crypto engineering vs crypto science feels a bit artificial.

A better topic for discussion is:


What is more fun and profitable: Social Engineering or Social Science?

Clive Robinson • July 9, 2013 5:55 PM

@ Dr. I. Needtob Athe,

    I'm having difficulty finding a definition of science that would include cryptography. Most of the definitions I've seen refer to observations of nature...

No, cryptography is not a science in the traditional sense, for a couple of reasons: first, as others have noted, it is a subset of mathematics; and second, it does not relate to tangible objects, energy, or forces.

Originally, science as we now call it was referred to as "Natural Philosophy," to distinguish it from religion, law, and the philosophy behind them, which included various forms of logic.

In essence, then, you could look on science as beginning with the hard science of quantum physics, working through physics as applied to physical objects, energy, and forces, through chemistry to biology, to the point where you enter the soft sciences of anthropology through philosophy and morals etc., which eventually take you into religion again. Einstein kind of covered the entirety of pre-quantum science with his assertion that God does not play dice with the world.

But you will note that mathematics does not make it into the list, even when you get to the understanding of truth through logic in philosophy. The reason is that it is a way to describe, for the purpose of modeling, our physical world. It is in effect like a drawing of a physical object: it uses information to convey meaning about a physical object. The reason maths can be used to model the world is twofold: firstly, we know of ways to quantify and compare physical attributes by agreed scales; secondly, within our physical world all events are probabilistic in nature to a certain extent, due to various constraints on physical objects which we regard as the laws of nature.

Mathematics gives us the ability to describe in an abstract way the physical world around us, but it does not in any way shape the physical world. We can thus play around with the abstract descriptions in any way we like, allowed by the axioms and rules of mathematics, but in the main such playing has no meaning in the physical world. To give meaning we make observations of the physical world; from these we make predictions, which we then usually make into a mathematical model, which we then test in various ways to check for agreement or disagreement with the physical world. Often we find that our models are incomplete or don't fully agree with the physical world (think Newton's laws of motion). When we do find exceptions, in general we use them to formulate new models; it's one of the reasons for the saying "In physics you are taught a succession of lies that progressively approximate the truth more accurately."

Importantly, we may carry on using a model even though it's known to be wrong, simply because it "works" for the particular application we have in mind. For instance, Newton's laws of motion are more than sufficient for calculating orbital paths around the Earth and Solar system; however, they are insufficient for GPS systems, where we need to use Einstein's laws to obtain the desired accuracy and repeatability.

So no, mathematics is not a science but a tool for doing science; it is agnostic to its use in the physical world. Cryptography is not of the physical world: it describes how to manipulate information, which is independent of matter/energy and the physical laws of nature that constrain them.

It's important to remember this, because our crypto models are, from the security aspect, just one tiny part of what makes a physical implementation secure. For instance, have you ever seen a crypto algorithm that specifically takes into account the effects of the laws of nature on matter and energy?

Figureitout • July 9, 2013 8:41 PM

"What is more fun and profitable: Social Engineering or Social Science?" -- Icicle
--Sorry to sh*t in your cornflakes, but none of the above. SE is risky if you actually do it and if you get caught you really don't have any legitimate excuse; not to mention flat out deceiving someone's trust to their eyes/ears/face, pretty despicable. SS is soft fluff that doesn't really use a primary tool to make it a respectable science, math. So please, for humanity's sake, don't encourage it and stick w/ the crypto and programming b/c you create more value; even though theoretically crypto shouldn't even be necessary and is a huge waste.

G • July 10, 2013 10:39 PM

Just for the record, the Universe's expansion is accelerating -- there will be no "big crunch." ;)

Icicle • July 12, 2013 9:00 PM

@ Figureitout
I agree with you, because it was a joke.
And I appreciate your comment because it proves the second point I tried to make.

In my comment I tried to make the following points:

  1. Another perspective on Science vs. Engineering (on topic)

  2. Point to the difficulty for machines and humans to determine intent (off topic but topical)

  3. Offer a silly joke using wordplay and over-generalization. (*)

(*) Kieran Healy is a counter-example of a researcher in social sciences that I assume is paid sufficiently and must feel a great sense of accomplishment from the reactions to his great article.


In my previous comments on this blog I've used hyperbole and irony to make a point, and most readers either got it or ignored my comments. But the reaction of Figureitout illustrates the difficulty of expressing oneself in a clear and unambiguous manner. And if people have trouble determining whether an expression indicates benevolent or malevolent intent, then how can any official trust the reports created by the NSA machinery + staff? (Unless they get monetary incentives to do so, of course...)


By the way, thanks Figureitout for the reference to Nicky Hager's book Secret Power. I have read about his articles, but didn't know his book was online. ECHELON training is called "indoctrination". What is the training called that PRISM personnel get? Brainwashing??? ;)


While I expected to see someone comment on the demarcation problem in scientific methodology, or other topics in the philosophy of science like Clive Robinson's exposition on the role of mathematical models in science, you raised an interesting problem that needs to be examined further.


So I offer this question for consideration:

Has NSA created a system that could be used as a judge in a Turing-test?

By this I mean:
1) If a human has difficulties determining whether they are talking to a machine or another human, how could any human create a program that automates this process? And succeed in a sufficiently high degree?

and:

2) Before we understand the brain, cognition, humour and language - completely and fully - and proved this scientifically (our hypotheses and models), we cannot put faith in a machine's attempt to deduce intent from intercepted messages.
And we should NOT put faith in officials who do so!


My stance is this:


  • Machines should be useful tools that help humanity.

  • Humans should not be useful tools to machines. (In any shape or form.)


Burt Kaliski • August 5, 2013 9:48 PM

Bruce, you've prompted some good discussion about whether and how the field of cryptography includes both science and engineering.

When I made the comment about the rise of the crypto-engineer sometime around 1995, it was at the time when message authentication codes such as HMAC were being built out of hash functions, and symmetric encryption schemes such as 3DES-CBC were being constructed from block ciphers. The design of these new techniques, I contended, exhibited aspects of engineering: applying science to build larger structures from smaller components.

I suppose one could also say that engineering is involved in the design of hash functions and block ciphers, but in those cases, the components are rotations and permutations and mathematical operations: not quite yet cryptography. Building cryptographic schemes and protocols out of cryptographic primitives is more what I think of as crypto engineering.
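HMAC is a nice concrete instance of that kind of engineering: the whole construction is a short recipe over a bare, unkeyed hash function. A sketch of the RFC 2104 definition, H((K ⊕ opad) || H((K ⊕ ipad) || m)):

```python
import hashlib

def hmac_sha256(key, msg, block_size=64):
    # RFC 2104: HMAC(K, m) = H((K xor opad) || H((K xor ipad) || m))
    if len(key) > block_size:
        key = hashlib.sha256(key).digest()   # over-long keys are hashed first
    key = key.ljust(block_size, b"\x00")     # then zero-padded to a full block
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5C for b in key)
    inner = hashlib.sha256(ipad + msg).digest()
    return hashlib.sha256(opad + inner).digest()
```

Python's stdlib `hmac` module computes the same thing; the sketch just shows how little machinery sits between a hash function and a MAC.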

And by the mid-1990s, it really was becoming an engineering practice.

Not just an ad hoc assembly of parts with anecdotal evidence that it should work, but a rigorous, specified, "provable" design relating the security of the overall scheme against particular attacks to the conjectured properties of underlying primitives.

The solid scientific foundation of the 1980s -- defining what security meant and demonstrating how to prove it -- gave way to practical applications where secure cryptographic schemes could be designed "just right" for a given environment. That's the engineering part.

So they're both needed. And it's clear that they're both continuing as new primitives are both discovered -- think lattice-based cryptography -- and applied -- think fully homomorphic encryption.

