Schneier on Security
A blog covering security and security technology.
April 4, 2005
The Price of Restricting Vulnerability Information
Interesting law article:
There are calls from some quarters to restrict the publication of information about security vulnerabilities in an effort to limit the number of people with the knowledge and ability to attack computer systems. Scientists in other fields have considered similar proposals and rejected them, or adopted only narrow, voluntary restrictions. As in other fields of science, there is a real danger that publication restrictions will inhibit the advancement of the state of the art in computer security. Proponents of disclosure restrictions argue that computer security information is different from other scientific research because it is often expressed in the form of functioning software code. Code has a dual nature, as both speech and tool. While researchers readily understand the information expressed in code, code enables many more people to do harm more readily than with the non-functional information typical of most research publications. Yet, there are strong reasons to reject the argument that code is different, and that restrictions are therefore good policy. Code's functionality may help security as much as it hurts it and the open distribution of functional code has valuable effects for consumers, including the ability to pressure vendors for more secure products and to counteract monopolistic practices.
Posted on April 4, 2005 at 7:25 AM
The solution is simple enough:
Stop making mistakes.
No mistakes == No need for disclosure.
Until humans stop making mistakes disclosure is necessary to enlighten ourselves on our mistakes and strive for not making the same mistake again.
Bruce, this is your closest yet to actually endorsing virus writers. By saying "open distribution of functional code has the valuable effect...of counteract[ing] monopolistic practices" you could mean nothing else. News reports about vulnerabilities in Microsoft products damage the company's reputation and potentially affect its sales--that is true enough and I would agree that that is fair. Bad press does not require functional code to be openly distributed, however. The general public does not need to see proof of the vulnerabilities. Open distribution of functional code would not add to the negative publicity unless the code is used to create a virus or a worm. That is the manner by which it can, as you say, counteract monopolistic practices.
Computer security isn't the only information field with problems. Molecular biology has its problems too! Theoretically, it is possible to synthesize a virus from its genomic code. I do not believe that the variola virus genetic code has been removed from GenBank yet, but I am sure there is pressure to do so. Also, with the wholesale patenting of genes, unless you pay a fee to the patent holder, in some cases you are not allowed to test people for the presence of a gene that predisposes them to a cancer, even though the genetic sequence of the disease gene is public knowledge. Computer security can be fixed if programmers write better code and build safeguards into their systems. Public and personal genetic health is another matter entirely - one in which we do not have rights to our own genomes. Congress still has not addressed the issue of genetic discrimination, i.e., does the presence of a gene that predisposes you to a cancer mean that a health insurance company will refuse to fund your treatment because it is a pre-existing condition? What does this have to do with security? A lot. Some day, our genomic data will be part of our medical records. Even if you don't want to know if you'll get Huntington's disease, an insurance company might want that information so it will know whether to sell you a policy or not.
"Bruce, this is your closest yet to actually endorsing virus writers. By saying "open distribution of functional code has the valuable effect...of counteract[ing] monopolistic practices" you could mean nothing else."
Just an FYI: Please note the source that BS linked for the quote: Jennifer Stisa Granick.
I stand corrected. I didn't notice it was a citation. Yet one can reasonably interpret a citation without commentary as an endorsement of the view expressed. The author is basically saying that functional code should be distributed precisely so that exploits could be developed, with the justification that these exploits, somehow, extrajudicially, protect consumers from monopolies. Security research thus becomes a means to promote lawlessness.
"Even if you don't want to know if you'll get Huntington's disease, an insurance company might want that information so it will know whether to sell you a policy or not"
In X years from now that person is (potentially) dead or chronically ill. What are we as a society going to do about this?
I would expect that society/government will pay money to this person.
P.S."Even if you don't want to know if you'll get Huntington's disease, an insurance company might want that information so it will know whether to sell you a policy or not"
Does it mean that if I am a genetically healthy person, I pay less to the insurance company for the same level of protection?
Parents will have to start thinking about the insurance costs of their unborn children well before marriage (sounds rather cynical).
This is getting way off topic, but:
"Parents will have to start thinking about the insurance costs of their unborn children well before marriage."
Something similar is already happening. There is a recessive gene that, when a child gets two copies, causes them to waste away and die at about age 6 or 7. This gene is fairly common among Jews, who tend to intermarry. A rabbi who lost several children to this started up a genetic testing service aimed at the Jewish community:
Unmarried people would send a blood sample. They wouldn't get told whether they had the recessive gene, but when they start dating someone, they can write to the service with that person's name/identity. If the other person also has a sample on file, the service will tell the couple if they both have the recessive gene. (The service then expanded to cover other genetic conditions too.)
I read about this a few months ago, but I can't remember more details.
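The privacy property of the service described above (no individual is told their own carrier status, but a couple can jointly learn whether both are carriers) could be sketched roughly as follows. This is a hypothetical illustration; the class, method names, and API are mine, not the actual service's:

```python
# Minimal sketch of a carrier-matching service's privacy property.
# Hypothetical names and interface, for illustration only.
# Individual carrier status is stored but never disclosed; a query
# answers only the joint question for a couple.

class CarrierRegistry:
    def __init__(self):
        self._carrier = {}  # person ID -> True if carrier (kept private)

    def register(self, person_id, is_carrier):
        """Store a test result; the person is never told the result."""
        self._carrier[person_id] = is_carrier

    def couple_at_risk(self, id_a, id_b):
        """Answer only: do BOTH partners carry the recessive gene?

        Returns None if either partner has no sample on file, so a
        query never reveals one partner's individual status.
        """
        if id_a not in self._carrier or id_b not in self._carrier:
            return None
        return self._carrier[id_a] and self._carrier[id_b]

registry = CarrierRegistry()
registry.register("alice", True)
registry.register("bob", False)
registry.register("carol", True)

print(registry.couple_at_risk("alice", "bob"))    # False: not both carriers
print(registry.couple_at_risk("alice", "carol"))  # True: both are carriers
print(registry.couple_at_risk("alice", "dave"))   # None: no sample for dave
```

Note that even a "False" answer leaks a little (it rules out the both-carriers case), which is the trade-off such a service accepts.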
Chung Leong, I think a citation need not imply support for the cited position. I might, for instance, cite as provocative a novel defense of a position I don't personally hold.
With regard to the original topic, it's hard for me to see how one can be precise about a vulnerability without giving the code, or the moral equivalent thereof--a complete specification of what instructions need to be carried out, and under which conditions. One might argue that the latter is preferable because it has the same level of precision, but isn't ready to work "out of the box."
But it's hard to see how this helps matters in any but the shortest of runs. The specificity already permits anyone experienced with the system and its instructions to produce running code, and they aren't likely to keep that code secret for long. In the meantime, you've put up some barrier against legitimate researchers who might find a useful remedy. Not a very high barrier, perhaps, but one all the same.
In any case, I remember a talk on the life cycle of vulnerabilities and their exploits. Usually, exploit activity peaks well after the vulnerability is discovered, after the exploit is first coded up, and even after the patch has been developed and released. You can let the toothpaste out of the tube, it seems, but you can't make everyone brush. :-)
I found the following quote quite interesting. In particular, I had no idea about the mentioned study.
Customer pressure for better products is one important incentive for companies to create secure products. The ability of consumers to bring that pressure to bear must be backed, not undermined, by law. Varian argues that system design will not improve unless liability rules are structured such that the party who is best suited to manage risk bears the financial responsibility if security is breached. Varian points to Ross Anderson's study of fraud at automated teller machines. In the U.K., where errors are presumptively against bank customers, the machines are insecure and fraud is rampant. In the U.S., where errors are presumptively the fault of the bank, teller machines are far more secure and there is far less fraud. Liability rules can allocate the incentives for security to maximize benefit. However, liability cannot be imposed in the absence of information about insecurity. In a networked economy, it is all the more important for customers to be well informed about security.
It seems to me that a lot of these issues are really just symptoms of a strong anti-consumer movement that's been growing over the last decade or two. In a way, corporations are becoming citizens with rights that exceed those of regular people. Taken from that viewpoint, a lot of our security, privacy, speech and similar decisions make sense.
Yeah, we're waaaay off topic now, but speaking of open disclosure of information that may do more harm than good, I question the underpinning of your post:
"This gene is fairly common among the Jews, who tend to intermarry. A rabbi who lost several children to this started up a genetic testing service aimed at the Jewish community. [...] I read about this a few months ago, but I can't remember more details."
What do you mean by intermarry? A Jew marrying another Jew? That would be the opposite of intermarriage, at least by Jewish definition.
Could I see the reference or more details to this? Can you *disclose* a supporting source? Sounds like some fanatical right-wing quack theology to me.
First of all, it is common knowledge that intermarriage among Jews (Jews marrying non-Jews) today is quite high. The LA Times has conducted several studies (mainly comparing LA to the rest of the country) and found a sharp rise from the 1970s to the 1990s in Jewish intermarriage (Jews were defined as those with at least one Jewish parent who give their religion as Judaism, those with at least one Jewish parent, and Jews by choice). On top of those numbers, it should be obvious that immigrants of any/all types tend to intermarry, especially by the second and third generation. Israel itself is debating the rise in intermarriage that was seen with each rise in immigration.
Second, I hope you are not trying to quietly advocate some sort of opinion on the quasi-science of banning intermarriage or suggest that we consider a return to Jim Crow laws. Please realize that current American anti-intermarriage laws have been written without any scientific input or merit. In fact, you yourself credit a rabbi with developing a program to help prevent intermarriage, which sounds eerily similar to the ultra-religious anti-cousin-marriage lobbyists in the American Northeast around the turn of the last century who passed laws to prevent it. I'm not saying that recessive genes shouldn't be studied, or that science doesn't matter (e.g., if we should avoid recessive genes, then how can we think blue eyes are OK? Even stranger, why do American jurors statistically prefer them on the stand?).
I am saying quackery is not science, so don't be so easily fooled and/or try to pass it off as such, especially if you cannot disclose your sources.
Disclosure and public scrutiny makes for better security products. Take the example of how the federal government only accepts encryption standards that are subjected to public review rather than using closed, proprietary cryptosystems. This public scrutiny forces these cryptosystems to undergo examination by professionals. I think open source projects also enjoy the benefits of this public scrutiny. Proprietary technologies should also appreciate the value of public review and disclosure.
Brian Tung, you missed my point. At issue here isn't whether disclosure of code leads to exploits, but the author's position that it should be permitted because it does lead to exploits.
I give her credit for being honest. Absent clear and present threats, consumers would not view a more secure product as being superior to one that's less secure. Security researchers thus have to help the public see the light by assisting the development of exploits against the lesser product.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.