Internet Shield: Secrecy and security

  • Bruce Schneier
  • SF Chronicle
  • March 2, 2003

There’s considerable confusion between the concepts of secrecy and security, and it is causing a lot of bad security and some surprising political arguments. Secrecy is not the same as security, and most of the time secrecy contributes to a false feeling of security instead of to real security.

Last month, the SQL Slammer worm ravaged the Internet, infecting, in some 15 minutes, about 13 root servers that direct information traffic, and thus disrupting services as diverse as the 911 network in Seattle and many of Bank of America’s 13,000 ATMs. The worm took advantage of a software vulnerability in a Microsoft database management program, one that allowed a malicious piece of software to take control of the computer.

This vulnerability had been made public six months earlier, when a respected British computer researcher published the code on the Web.

During the same month, an AT&T researcher published a paper revealing a vulnerability in master-key systems for door locks, the kind that let you have a key to your office while the janitor has a single key that opens every office. The gap in security is this: the system allows someone with only one office key, and access to the lock, to create a master key for himself. This vulnerability had been known in the locksmithing community for more than a century, but was never revealed to the general public.
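
The attack in that paper is cheap because each pin position of the lock can be probed on its own. Below is a toy Python simulation of the idea; it is only a sketch under simplified assumptions, with an invented number of pin positions and cut depths, and it models “cut a test key and try it in the lock” as a function call rather than actual metalwork.

```python
# Toy simulation of the master-key attack described above.
# Assumption: a simplified pin-tumbler lock where each pin position opens at either
# the legitimate change-key depth or the secret master-key depth. A "key" here is
# just a tuple of cut depths; real attackers cut physical test keys instead.

DEPTHS = range(10)      # possible cut depths at each pin position (invented)
POSITIONS = 6           # number of pin positions (invented)

def lock_opens(key, change, master):
    """The lock opens if every position matches either the change or the master depth."""
    return all(k in (c, m) for k, c, m in zip(key, change, master))

def forge_master(change, try_key):
    """Recover the master key using only a legitimate change key and access to the
    lock, modeled by try_key(candidate) -> bool."""
    master = []
    for pos in range(POSITIONS):
        found = change[pos]                 # if no other depth opens, master shares this cut
        for depth in DEPTHS:
            if depth == change[pos]:
                continue                    # skip our own cut; we want the other shear point
            candidate = list(change)
            candidate[pos] = depth          # vary exactly one position per test key
            if try_key(tuple(candidate)):
                found = depth               # the only other depth that opens is the master cut
                break
        master.append(found)
    return tuple(master)

if __name__ == "__main__":
    change = (3, 1, 4, 1, 5, 9)             # the key you legitimately hold (invented)
    master = (2, 7, 1, 8, 2, 8)             # the secret master key (invented)
    oracle = lambda k: lock_opens(k, change, master)
    recovered = forge_master(change, oracle)
    print("recovered master key:", recovered)
    assert recovered == master
```

With six positions and ten depths, the forger needs at most a few dozen test keys, rather than the million combinations a brute-force search would require.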

Many argue that secrecy is good for security; that both computer and lock vulnerabilities are better kept secret. Making the weak point public only helps the bad guys, the argument goes. Now that more burglars know about the lock vulnerability, maybe we’re more at risk. If the hacker who wrote the Sapphire worm (a.k.a. SQL Slammer) hadn’t had access to public information about the software’s vulnerability, maybe he wouldn’t have written the worm. According to this position, the problem is the information about the weak spot, not the weak spot itself.

This position ignores the fact that public scrutiny is the only reliable way to improve security, whether of the nation’s roads, bridges and ports or of our critically important computer networks. Several master-key designs are secure against this kind of key-copying attack, but they’re not widely used because customers don’t understand the risks they take by installing the old system, and because locksmiths continue to knowingly sell a flawed security system. It is no different in the computer world.

At the same time the SQL software vulnerability was publicized, Microsoft made a patch available to plug the security hole. But before software bugs were routinely published, software companies simply denied their existence and wouldn’t bother fixing them, believing in the security of secrecy. And because customers didn’t know any better, they bought these systems, believing them to be secure. If we return to the practice of keeping software bugs secret, we’ll have vulnerabilities known to a few in the security community and to much of the hacker underground.

That’s the other fallacy in the locksmiths’ argument. Techniques like this are passed down as folklore in the criminal community as well as in the locksmithing community. In 1994, a thief made his own master key to a series of hotel safe-deposit boxes and stole $1.5 million in jewels. The same thing happens in the computer world. By the time most computer vulnerabilities are announced in the press, they’re already folklore in the hacker underground. Attackers don’t abide by secrecy agreements.

HOMELAND SECURITY AT ISSUE

This clash between the secrecy and openness camps is happening in many areas of security. U.S. Attorney General John Ashcroft is trying to keep the details of many anti-terrorism countermeasures secret. Secret arrests are now permitted, and the criteria for those secret arrests are themselves secret. The standards behind the Department of Homeland Security’s color-coded terrorism threat levels are secret. The profiling information used to flag certain airline passengers is secret. Information about the infrastructure of plants and government buildings is secret.

This keeps terrorists in the dark, but at the same time the citizenry, to whom the government is ultimately accountable, is not allowed to evaluate the countermeasures or comment on their efficacy. Security can’t improve because there’s no public debate or public education. The nature of the attacks people learn to mount, and the defenses to counter them, will become folklore, never spoken about in the open but whispered from security engineer to security engineer and from terrorist to terrorist. And maybe in 100 years someone will publish the details of a method that some security engineers knew about, that terrorists and criminals had been exploiting for much of that time, but that the general public was blissfully unaware of.

Secrecy prevents people from assessing their own risk. In the master-key case, even if more secure designs weren’t available, many customers might have decided not to use master keying had they known how easy it is for an intruder to make his own master key. Ignorance is bliss, but bliss is not the same as security. It’s better to have as much information as possible to make informed security decisions.

I’d rather have the information I need to exert market pressure on vendors to improve security. I don’t want to live in a world where locksmiths can sell me a master-key system that they know doesn’t work or where the government can implement vulnerable security measures without accountability.

Categories: Computer and Information Security, Terrorism
