Locks and Full Disclosure

  • Bruce Schneier
  • IEEE Security & Privacy
  • March/April 2003

The full disclosure vs. bug secrecy debate is a lot larger than computer security. Blaze's paper on master-key locking systems in this issue is an illustrative case in point. It turns out that the ways we've learned to conceptualize security and attacks in the computer world are directly applicable to other areas of security—like door locks. But the most interesting part of this entire story is that the locksmith community went ballistic after learning what Blaze did.

The technique was known in the locksmithing community and in the criminal community for over a century, but it was never discussed in public; it remained folklore. Throughout that time, customers who bought these master-key systems were completely oblivious to the security risks. Locksmiths liked it this way, believing that keeping these sorts of vulnerabilities from the general population increased the security of the system.

The bug secrecy position is a lot easier to explain to a layman. If there's a vulnerability in a system, it's better not to make that vulnerability public. The bad guys will learn about it and use it, the argument goes. Last month's SQL Slammer is a case in point. If the hacker who wrote the worm hadn't had access to the public information about the SQL vulnerability, maybe he wouldn't have written the worm. The problem, according to this position, is more the information about the vulnerability and less the vulnerability itself.

This position ignores the fact that public scrutiny is the only reliable way to improve security. Several master key designs are immune to the 100-year-old attack that Blaze rediscovered. They're not common in the marketplace because customers don't understand the risks, and because locksmiths continue to knowingly sell a flawed security system. This is no different from the computer world. Before software vulnerabilities were routinely published, vendors would not bother spending the time and money to fix them, believing in the security of secrecy. And since customers didn't know any better, they bought these systems believing them to be secure. If we return to a world of bug secrecy in computers, we'll have 100-year-old vulnerabilities known by a few in the security community and by the hacker underground.

That's the other fallacy with the locksmiths' argument. Techniques like this are passed down as folklore in the criminal community as well as the locksmithing community. In 1994, a thief made his own master key to a series of safe-deposit boxes and stole $1.5 million in jewels. The same thing happens in the computer world. By the time a computer vulnerability is announced in the press and patched, it's already folklore in the hacker underground. Attackers don't abide by secrecy agreements.

This culture clash is happening in many areas of security. Attorney General Ashcroft is trying to keep details of many anti-terrorism countermeasures secret so as not to educate the terrorists. But at the same time, the people—to whom he is ultimately accountable—are not allowed to evaluate the countermeasures or comment on their efficacy. Security can't improve because there's no public debate or public education. Whatever attacks and defenses people learn will become folklore, never spoken about in the open but whispered from security engineer to security engineer and from terrorist to terrorist. And maybe in 100 years someone will publish an attack that some security engineers knew about, that terrorists and criminals had been exploiting for much of that time, but that the general public was blissfully unaware of.

Secrecy prevents people from assessing their own risk. For example, in the master key case, even if there weren’t more secure designs available, many customers might have decided not to use master keying if they knew how easy it was for an attacker to make his own master key.
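To make "how easy" concrete: Blaze's attack exploits the fact that in a master-keyed pin-tumbler lock, each pin stack has two shear points—one for the change key, one for the master—so an attacker with a legitimate change key can probe each pin position independently. Below is a minimal simulation of that idea (my own sketch, not Blaze's code; the lock model and parameter values are illustrative assumptions). The point it demonstrates is the complexity collapse: at most pins × (depths − 1) test keys instead of depths^pins guesses.

```python
DEPTHS = 10  # assumed cut depths per pin position
PINS = 5     # assumed number of pin positions

def opens(master_cuts, change_cuts, key):
    """Model of a master-keyed pin stack: the plug turns if every key cut
    matches either the change-key shear point or the master shear point."""
    return all(k in (m, c) for k, m, c in zip(key, master_cuts, change_cuts))

def recover_master(change_key, oracle):
    """Probe one pin at a time: vary a single position while holding the
    others at the change-key cuts. Any depth that opens the lock but
    differs from the change key must be the master cut at that position."""
    master = []
    for i in range(PINS):
        found = change_key[i]  # if no other depth works, master shares this cut
        for depth in range(DEPTHS):
            if depth == change_key[i]:
                continue
            probe = list(change_key)
            probe[i] = depth
            if oracle(probe):
                found = depth
                break
        master.append(found)
    return master

# Hypothetical lock: the attacker knows only `change` and can test keys.
master = [3, 1, 4, 1, 5]
change = [2, 7, 4, 8, 2]
recovered = recover_master(change, lambda key: opens(master, change, key))
print(recovered)  # the full master bitting, found in at most 45 probes
```

The per-pin independence is the whole attack: a brute-force search would face 10^5 combinations, but the oracle answers each position separately, so 45 filed test keys suffice in this model.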

I’d rather have as much information as I can to make an informed decision about security. I’d rather have the information I need to pressure vendors to improve security. I don’t want to live in a world where locksmiths can sell me a master key system that they know doesn’t work or where the government can implement security measures without accountability.
