Crypto-Gram

June 15, 1998

by Bruce Schneier
President
Counterpane Systems

schneier@schneier.com
http://www.counterpane.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on cryptography and computer security.

Copyright (c) 1998 by Bruce Schneier


In this issue:

      Side-Channel Attacks Against Cryptosystems
      News
      Counterpane Systems—Featured Research
      Risks of Key Recovery, Key Escrow, and Trusted Third Party Encryption


Side-Channel Attacks Against Cryptosystems

In the last few years, new kinds of cryptanalytic attack have begun to appear in the literature: attacks that target specific implementation details. The “timing attack” made a big press splash in 1995: RSA private keys could be recovered by measuring the relative times cryptographic operations took. This attack has been successfully implemented against smart cards and other security tokens, and against electronic commerce servers across the Internet.
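
To see why implementation details leak, consider a minimal sketch (a toy of my own construction, not Kocher's published attack): a naive square-and-multiply modular exponentiation does an extra multiply only for the 1 bits of the private exponent, so its total running time depends on the key. All names and parameters below are illustrative.

    import time

    def modexp_naive(base, exponent, modulus):
        # Left-to-right square-and-multiply; the key-dependent branch is the leak.
        result = 1
        for bit in bin(exponent)[2:]:
            result = (result * result) % modulus      # always square
            if bit == '1':                            # multiply only for 1 bits
                result = (result * base) % modulus
        return result

    def total_time(exponent, modulus=(1 << 512) - 1, trials=500):
        start = time.perf_counter()
        for message in range(2, trials + 2):
            modexp_naive(message, exponent, modulus)
        return time.perf_counter() - start

    sparse_key = 1 << 500             # 501-bit exponent, almost all zero bits
    dense_key = (1 << 501) - 1        # 501-bit exponent, all one bits
    print("sparse exponent:", total_time(sparse_key))
    print("dense exponent: ", total_time(dense_key))

The real attack is far more refined, recovering the exponent bit by bit from many precise measurements, but the crude comparison above shows the root cause: the time taken is a function of the secret key.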

Researchers have generalized these methods to include attacks on a system by measuring power consumption, radiation emissions, and other “side channels,” and have implemented them against a variety of public-key and symmetric algorithms in “secure” tokens. Related research has looked at fault analysis: deliberately introducing faults into cryptographic processors in order to determine the secret keys. The effects of this attack can be devastating.

What’s the big idea here?

There are two ways to look at a cryptographic primitive: block cipher, digital signature function, whatever. The first is as a chunk of math. The second is as a physical (or software) implementation of that math.

Traditionally, cryptanalysis has been directed solely against the math. Differential and linear cryptanalysis are good examples of this: high-powered mathematical tools that can be used to break different block ciphers.

On the other hand, timing attacks, power analysis, and fault analysis all make assumptions about the implementation, and use additional information garnered from attacking those implementations. Failure analysis assumes a one-bit feedback from the implementation—was the message successfully decrypted?—in order to break the underlying cryptographic primitive. Timing attacks assume that an attacker knows how long a particular encryption operation takes.

I like to think of these attacks as biological. There are some things you just can’t learn about an organism by taking it apart. Sometimes you have to look at the inputs and outputs. How does it move? What does it eat? If you thwack it in a particular way, how does it react? If you break it, what happens?

Call them side-channel attacks. Normal attacks look at the plaintext and the ciphertext and attempt to recover the key. Side-channel attacks also look at some other information—how the power consumption changes as the cipher executes, what the output looks like when you cut some wires—in an attempt to recover the key.
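
As a toy illustration of the idea (my own construction with an assumed leakage model, not code from any of the papers referenced below): suppose a device's power draw at one instant is roughly the Hamming weight of the intermediate value plaintext XOR key, plus noise. An attacker who sees only plaintexts and power samples can test each key guess against the measurements.

    import random

    SECRET_KEY = 0x5A          # the hypothetical key byte the attacker wants

    def hamming_weight(x):
        return bin(x).count("1")

    def leaky_device(plaintext_byte):
        # Simulated side channel: the power sample tracks the Hamming weight
        # of an intermediate value, plus measurement noise.
        intermediate = plaintext_byte ^ SECRET_KEY
        return hamming_weight(intermediate) + random.gauss(0, 0.5)

    traces = [(p, leaky_device(p))
              for p in (random.randrange(256) for _ in range(3000))]

    def correlation(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # The key guess whose predicted leakage best matches the measurements wins.
    samples = [s for _, s in traces]
    best_guess = max(range(256), key=lambda k: correlation(
        [hamming_weight(p ^ k) for p, _ in traces], samples))
    print(hex(best_guess))     # recovers 0x5a from the power samples alone

In a real cipher the targeted intermediate sits behind a nonlinear S-box, which separates the correct guess from near-misses much more sharply, but the principle is the same: the extra measurements are a function of the key.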

These attacks don’t necessarily generalize. A fault-analysis attack just isn’t possible against an implementation that doesn’t permit an attacker to create and exploit the required faults. But these attacks can be much more powerful. For example, differential fault analysis of DES requires between 50 and 200 ciphertext blocks (no plaintext) to recover a key. Contrast this with the best non-side-channel attack against DES, which requires just under 64 terabytes of plaintext and ciphertext encrypted under a single key.

Some researchers have claimed that this is cheating. True, but in real-world systems, attackers cheat. Their job is to recover the key, not to follow some rules of conduct. Prudent engineers of secure systems anticipate this and adapt to it. It is our belief that most operational cryptanalysis makes use of side-channel information. Sound as a side-channel—listening to the rotation of electromechanical rotor machines—was alluded to in The Codebreakers. TEMPEST is another side channel that can be very effective. And in his book Spycatcher, Peter Wright discussed data leaking onto a transmission line as a side channel used to break a French cryptographic device.

Defenses are hard. You can either reduce the amount of side-channel information that leaks, or make the leakage irrelevant. Both have problems, although researchers are working on them. This is a powerful attack, and it will be a while before there is a good theory of the defense. In the meantime, any system where a device is held by one person, and the secrets within the device are held by another, is at risk.
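
One hedged sketch of the "reduce the leakage" approach, under the same assumptions as the timing example above: perform the multiply on every bit and select the result without branching, so the operation count no longer depends on the key. This only illustrates the idea; real constant-time code must also account for data-dependent memory accesses, the compiler, and the hardware, and blinding takes the other route of making whatever still leaks useless to the attacker.

    def modexp_always_multiply(base, exponent, modulus, bits=512):
        # Square and multiply on every iteration; a branch-free select keeps
        # the wanted result, so the work done is independent of the key bits.
        result = 1
        for i in range(bits - 1, -1, -1):
            result = (result * result) % modulus
            multiplied = (result * base) % modulus
            bit = (exponent >> i) & 1
            result = bit * multiplied + (1 - bit) * result
        return result

    assert modexp_always_multiply(7, 0b1011, 1000) == pow(7, 0b1011, 1000)

Even this is only a sketch: in Python the big-integer multiplies themselves are not constant-time, which is exactly why a good theory of the defense is still lacking.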

Side-channel cryptanalysis: http://www.schneier.com/paper-side-channel.html

Differential-power analysis: http://www.cryptography.com/dpa/index.html

Industry reaction: http://www.news.com/News/Item/0,4,23025,00.html

Differential-fault analysis: http://www.cs.technion.ac.il/~biham/publications.html

Timing attack: http://www.cryptography.com/timingattack/


News

The L0pht, a hacker group from Boston, testifies before Congress: http://www.techweb.com/wire/story/TWB19980524S0001

Network Associates settles its patent dispute with RSA Data Security, Inc: http://www.news.com/News/Item/0,4,22613,00.html

Cryptography heats up on Capitol Hill:
http://www.news.com/News/Item/0,4,22657,00.html
http://www.nytimes.com/library/tech/98/06/cyber/…
http://www.zdnet.com/zdnn/special/crypto.html


Counterpane Systems—Featured Research

“Cryptanalytic Attacks on Pseudorandom Number Generators”

J. Kelsey, B. Schneier, D. Wagner, and C. Hall, Fast Software Encryption, Fifth International Workshop Proceedings (March 1998), Springer-Verlag, 1998, pp. 168-188.

Our work centered on analyzing pseudorandom number generators (PRNGs): the mechanisms used by real-world secure systems to generate cryptographic keys, initialization vectors, and other values assumed to be random. We argue that PRNGs are their own unique type of cryptographic primitive, and propose a model for analyzing them. We discuss possible attacks against this model, and cryptanalyze four real-world PRNGs: X9.17, RSAREF 2.0, DSA RNG, and CryptoLib.
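
As a toy example of the class of weakness such a model captures (a generator of my own invention, not one of the four analyzed in the paper): a PRNG that never mixes in fresh entropy is fully predictable from the moment its internal state is compromised.

    import hashlib

    class ToyPRNG:
        def __init__(self, seed: bytes):
            self.state = hashlib.sha256(seed).digest()

        def next_output(self) -> bytes:
            out = hashlib.sha256(b"out" + self.state).digest()
            # The state evolves deterministically; no new entropy is ever added.
            self.state = hashlib.sha256(b"next" + self.state).digest()
            return out

    victim = ToyPRNG(b"server boot-time seed")
    victim.next_output()                      # outputs the attacker never sees

    # State compromise: the attacker captures the internal state once...
    stolen_state = victim.state
    clone = ToyPRNG(b"irrelevant")
    clone.state = stolen_state

    # ...and can then predict every future "random" value the victim produces.
    assert clone.next_output() == victim.next_output()
    assert clone.next_output() == victim.next_output()

Periodically reseeding from a good entropy source limits how long such a compromise lasts, which is one of the design questions the paper's model is meant to expose.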

http://www.schneier.com/paper-prngs.html


Risks of Key Recovery, Key Escrow, and Trusted Third Party Encryption

A year ago, a group of cryptographers published a report outlining the risks of key escrow, key recovery, and trusted third-party encryption. (The differences among them are more marketing distinctions than anything else.) Our meta-conclusion was simple: The deployment of key recovery systems designed to facilitate surreptitious government access to encrypted data and communications introduces substantial risks and costs. These risks and costs may not be appropriate for many applications of encryption, and they must be more fully addressed as governments consider policies that would encourage ubiquitous key recovery.

Our report was designed to stimulate a public, technical debate and analysis that, in our judgment, must precede any responsible policy decision that could result in the wide-scale deployment of key recovery systems. While there are numerous and important economic, social, and political issues raised by key recovery, the report’s analysis was confined to the technical problems created by deployment of key recovery systems designed to meet government access specifications. As of mid-1998, no substantive response addressing these technical concerns has been offered.

So we re-released the report.

While efforts have been made over the last year to design key recovery systems for commercial purposes, they do not alleviate the concerns raised by deployment at the scale and in the manner required to meet government demands. The design of secure key recovery systems remains technically challenging, and the risks and costs of deploying key recovery systems are poorly understood. Most significantly, government demands for access place additional requirements on key recovery systems, including covert access, ubiquitous adoption, and rapid access to plaintext. There is good reason to believe that these additional requirements amplify the costs and risks of key recovery substantially.

Members of the law enforcement and intelligence communities continue to express concern about widespread use of unescrowed cryptography. At the same time, these communities have expressed increasing alarm over the vulnerability of “critical infrastructure.” But there is a significant risk that widespread insertion of government-access key recovery systems into the information infrastructure will exacerbate, not alleviate, the potential for crime and information terrorism. Increasing the number of people with authorized access to the critical infrastructure and to business data will increase the likelihood of attack, whether through technical means, by exploitation of mistakes, or through corruption. Furthermore, key recovery requirements, to the extent that they make encryption cumbersome or expensive, can have the effect of discouraging or delaying the deployment of cryptography in increasingly vulnerable computing and communications networks.

The technical concerns about key recovery and trusted third-party systems in 1998 remain largely unchanged from our 1997 analysis. We specifically do not address questions of how and whether key recovery might benefit law enforcement and whether there are alternatives to key recovery that might achieve equal or greater benefits. However, the predictable costs and risks of key recovery, particularly when deployed on the scale desired by law enforcement, are very substantial. The onus is on the advocates of key recovery to make the case that the benefits outweigh these substantial risks and costs.

Full Report: http://www.cdt.org/crypto/risks98/


CRYPTO-GRAM is a free monthly newsletter providing summaries, analyses, insights, and commentaries on cryptography and computer security.

To subscribe, visit http://www.schneier.com/crypto-gram.html or send a blank message to crypto-gram-subscribe@chaparraltree.com. Back issues are available at http://www.counterpane.com.

Please feel free to forward CRYPTO-GRAM to colleagues and friends who will find it valuable. Permission is granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Schneier is president of Counterpane Systems, the author of Applied Cryptography, and an inventor of the Blowfish, Twofish, and Yarrow algorithms. He served on the boards of the International Association for Cryptologic Research, EPIC, and VTW. He is a frequent writer and lecturer on cryptography.

Counterpane Systems is a five-person consulting firm specializing in cryptography and computer security. Counterpane provides expert consulting in design and analysis, implementation and testing, threat modeling, product research and forecasting, classes and training, intellectual property, and export consulting. Contracts range from short-term design evaluations and expert opinions to multi-year development efforts.

http://www.counterpane.com/
