WORD IN EDGEWISE: Scrambled Message

Key recovery is like trying to fit a square peg into a round hole. No matter how much you finagle it, it's simply not going to work.

  • Bruce Schneier
  • Information Security
  • October 19, 1998

In the September issue of Information Security, Commerce Undersecretary William Reinsch suggests that U.S. crypto export policy hinges on the concept of “balance” (Q&A: “Crypto’s Key Man”).

For key recovery policy to be successful, he argues, it must achieve a balance between privacy and access, between the needs of consumers and the requirements of the law-enforcement community.

For those who have followed the key recovery debate, Reinsch’s comments will have a familiar ring. Ever since the Clipper chip first made headlines in 1993, the crypto community has debated the notion of key recovery (or key escrow, or data recovery, or trusted third party, or any other marketing term used to describe the same concept).

Unfortunately, Reinsch’s comments demonstrate that today’s debate hasn’t changed much from that of five years ago. The issues and positions remain the same, and little progress has been made toward the type of compromise he describes. A policy of balance was impossible to achieve five years ago, and it remains so today. From his comments, it’s clear Mr. Reinsch doesn’t understand why.


Government proponents of key recovery argue that people who use encryption cannot be trusted to use it wisely—or, for that matter, legally. Hence, controls must be put in place to ensure that law enforcement can decrypt messages to crack down on illicit, encrypted communications. An added bonus is that companies will be able to decrypt the messages of errant employees.

Over the last year, a number of product vendors have tried to design commercial key recovery systems. None of these systems, however, was able to meet the scale and specifications required by federal policy. The design of secure key recovery systems remains technically challenging, and the risks and costs of deploying such systems are poorly understood. Most importantly, government demands for access place additional requirements on key recovery systems, including covert access, ubiquitous adoption and rapid access to plaintext. These requirements amplify the costs and risks of key recovery.

The law-enforcement and intelligence communities have outlined conflicting reasons why key recovery is necessary. On the one hand, they express concern about the widespread use of “unescrowed” cryptography. On the other, they express alarm about the vulnerability of the nation’s “critical infrastructure.” The problem is, the widespread insertion of government-access key recovery systems into the information infrastructure will exacerbate, not alleviate, the potential for crime and information terrorism. Increasing the number of people with authorized access to the critical infrastructure and business data will increase the likelihood of attack, whether through technical means, by exploitation of mistakes or through corruption. Furthermore, key recovery requirements, to the extent that they make encryption cumbersome or expensive, may actually discourage or delay the deployment of cryptography in increasingly vulnerable computing and communications networks. Even the National Security Agency (NSA), in an internal, unclassified report, concluded that the risks of key recovery far outweigh the advantages.

Cryptographers have no idea how to build a secure key recovery system. Moreover, we have no idea how to graft a secure key recovery mechanism onto existing systems. The “needs” of law enforcement are at odds with the requirements for secure system design. Our critical computer infrastructure is too important to put at risk through the mechanism of key recovery.

Cryptography does not give us new powers or new abilities. It takes existing business and social constructs—privacy, authenticity, fairness, anonymity, audit—and moves them onto computer networks. These constructs are essential to business, to democracy, to civilization. The cryptographic policy Reinsch advocates is a threat—not a cornerstone—to these fundamental principles.


Nitty Gritty

  • Key recovery means that users must rely on the key-recovery mechanism for security.
  • Key recovery means that the security of users’ data is not under their control.
  • Key recovery takes simple key-management problems and makes them complex.
  • Key recovery builds single points of failure for security systems.
  • Key recovery takes security out of the control of those who need to be protected.

For a report on the risks of key recovery, go to www.cdt.org/crypto/risks98.

