The Halfway House Between Science and Secrets
A recent National Research Council report recognizes that the 9/11 attacks provoked counter-productive security measures that stifle access to fruitful scientific research. Security expert Bruce Schneier talks with Science Progress about the science that makes us smarter and the security that makes us safer.
Listen to the Audio on ScienceProgress.org
Transcript
Earlier this month the National Research Council released a congressionally mandated report, ‘Science and Security in a Post-9/11 World,’ which recognizes that the 9/11 attacks provoked a misallocation of United States security resources and led to counter-productive security measures. The NRC warns that the widespread practice of labeling scientific research as ‘sensitive but unclassified’ has had grave consequences for our security and our economy. To encourage more sensible science-security policymaking, the NRC has recommended the creation of a new high-level Science and Security Commission to give scientists and government security officials a place to deliberate and negotiate security policies as they relate to science and engineering research.
To better understand the relationship between scientific research and national defense, Science Progress spoke with security technologist and author Bruce Schneier about why secrecy makes for bad policy in science and engineering, and whether a new institutionalized science-security dialogue would be helpful or merely theatrical.
Jonathan Pfeiffer, Science Progress: The National Research Council is concerned that the federal government has categorized too many scientific research results after 9/11 as ‘sensitive but unclassified.’ Can you explain what this term means and why this is a problem?
Bruce Schneier: It’s kind of a weasel term in the U.S. government and military. There is classified military information and there are government secrets, and there are levels of classification: confidential, secret, top secret, and code words above that. And there are special rules for dealing with that information. ‘Sensitive but unclassified’ is a halfway house between public information and classified information. It’s not really a secret, but someone somewhere doesn’t want someone else to know, so it becomes a gray area. The rules are a lot sloppier, there’s a lot more leeway, and more and more—not only in science, but everywhere—information that used to be given to the public as a matter of course becomes ‘sensitive but unclassified.’ It could be phone directories; it could be hours of operation for buildings; it could be locations of polling places. And a lot of scientific data, information, and knowledge—stuff that is used by the scientific community, used by businesses, used by everybody—gets stuck in this halfway house between secret and open. It’s a form of secrecy, and it’s a form of stifling information sharing. And where it affects scientists is that science thrives on information sharing. Science works because one person’s research becomes another person’s footnotes.
SP: You wrote in Beyond Fear that secrecy in science and engineering ‘stifles the cycle of innovation and invention that fuels the economic engine of the world’s open societies.’
Schneier: My research becomes data you use for your research. It becomes data someone else uses for their technology. It becomes stuff someone else uses for their products. And all of these feed on each other. Our society and our technology are great because of that openness. Whatever scientists and researchers do becomes fodder for this engine of more research. And when you start cutting off branches—when you start saying, ‘No, that’s too sensitive. Don’t make it public’—you’re stifling research, because now no one else can use that research. Then we can’t benefit from that research, and the benefits are often hidden. We don’t know which piece of research will fuel the next big advance in computers, medical devices, or transportation. So you can’t just say, ‘That doesn’t have application in anything except bad things, so we can make it secret.’ We don’t know that.
There’s a conceit here. The counter-argument is going to be: ‘If we make this research into explosives public, the bad guys will find out about it.’ But that’s true for everything in our society. Everything we do can be used by the good guys and the bad guys. We use cars to get around; bank robbers use cars to drive away. We use telephones; the mafia uses telephones. The reason society works is that the good uses outweigh the bad uses. Sure, you can ban telephones. Sure, you can ban automobiles. Sure, you can ban scientific research. But it doesn’t help, because the benefits of doing these things—of having these technologies, of making them open and available—greatly outweigh the disadvantages. That’s what we’re losing sight of. We’re trying to contain certain types of research because of near-term fears, but then we lose all the long-term benefits.
SP: The NRC is now recommending the full implementation of NSDD-189, Ronald Reagan’s 1985 directive to keep unclassified research results open and available to the maximum possible extent. Do you have any concerns about referring, in the world of post-9/11 policymaking, to Cold War-era policies?
Schneier: Well, the devil is in the details. That is a good document if it really does say that we should make research open and available. I don’t care when it was written, whether it was twenty, thirty, or fifty years ago. So no, I have no concern about that. We have to read the details to make sure there are no hidden gotchas, but in general, I have no concern.
SP: Do you think we now face science and security issues that policymakers in 1985 were not able to anticipate?
Schneier: I think not. I think the rhetoric that 9/11 changed everything is overrated. The issues of science and openness are just as important now as they were then. The threats and the risks are just as big—and as small—now as they were then. The same philosophy of openness that has served our country for over two hundred years should still be in effect. So, no, I don’t see any change in worldview that would make us have to reassess scientific openness policies from twenty or thirty years ago.
SP: Some scholars of global and international studies will say that the world is a different place now, though perhaps outside the issue of security. Are you concentrating only on security when you say there should be no change in worldview?
Schneier: The world is a different place. There’s a lot more globalization. One thing that was unanticipated is how much research has now moved outside the U.S.: because of secrecy concerns, because of weird laws prohibiting certain kinds of research, or because of problems with visas. A lot of really good research is now being done in Europe, Asia, Australia, and the Middle East. So these policies are hurting us in ways that will take decades to recover from, because we’re losing the scientific advantage our country had.
SP: You have argued before that the value of secrecy should be judged on a case-by-case basis. However, the federal government also needs broad principles and guidelines for regulating scientific publishing. What can the government do?
Schneier: The first principle is that openness should be assumed, and that we should strive for openness wherever possible. There are areas of research that are wholly military, like minefield detection, military missile technology, or nuclear weapons technology. And it makes sense that parts of that research should be classified and should be kept secret. The stuff on the edges, like research into nuclear power, which has both commercial and military applications, should be judged on a case-by-case basis with the bias toward making things open. So it’s only a big problem if you try to classify broad swaths of research. But if you realize that the things that should be kept secret are actually very narrow, then it becomes a much easier problem.
SP: People often assume that the security and science communities are divided by their goals: It is supposed that the science community generally wants everything to be open so that collaboration can flourish, and that the security community is keen to restrict publishing and even to restrict actual research when it could make terror more effective. Is this a valid assumption?
Schneier: Yeah, that’s probably true. Most security people have a fetish for secrecy. It’s a belief that secrecy will make them safer. It’s nonsense. It makes no sense. It’s not the way to play the game. You will find that those in security, especially in national security, want to make everything secret, in the thought that it will make us safer. And science is about openness; science is all about publishing. If you do the research and you don’t publish, you might as well not bother to do the research. You haven’t increased the wealth of human knowledge if you don’t publish.
SP: How effective and constructive right now is the dialogue between the science and security communities?
Schneier: The dialogue is pretty terrible. Right now, especially in politics, security is winning. Whether it’s right or wrong, whether it makes sense or not, security wins. We’re living in a world where common sense, where balanced reasoning, where doing what’s right tends not to win over fear, over paranoia, over security. The dialogue is terrible.
SP: The NRC is exploring ways to institutionalize the dialogue, so that persons with proper security clearances can get together to discuss sensitive issues in scientific research. You argued in Beyond Fear that institutions and bureaucracies often want to appear to be doing good things for security. Is this a problem for the institutionalization of the science-security dialogue?
Schneier: ‘Security theater’ is what I call security that doesn’t do anything but looks good. A lot of airport security is a great example: It doesn’t actually make us safer, but it looks like we’re doing something. The fear is that when policymakers try to do things in security, they have a predilection toward security theater, because it makes them look, to the public, to their constituents, to whomever, like they’re doing something. So there is a natural draw toward security theater—toward security measures that make a big press splash because they look good.
The fear is that in a dialogue between scientists and the security community, both sides will be drawn toward security theater. I don’t think that’s much of a risk, and even if it is, the value of the dialogue is much greater than that risk. Remember: The scientists have a different agenda. Their agenda is amassing human knowledge for progress and openness. So they will have a predilection toward openness, and not a predilection toward secrecy. So they’re going to look at security theater in the same way I do as a researcher: with derision, rather than as a way of reassuring the public that something is being done, even though it might not be effective. So I don’t think there is a big risk, and if there is, the value of an institutionalized dialogue here, I think, is enormous.