Entries Tagged "nuclear power"

Avian Flu and Disaster Planning

If an avian flu pandemic broke out tomorrow, would your company be ready for it?

Computerworld published a series of articles on that question last year, prompted by a presentation that the analyst firm Gartner gave at a conference last November. Among Gartner’s recommendations: “Store 42 gallons of water per data center employee—enough for a six-week quarantine—and don’t forget about food, medical care, cooking facilities, sanitation and electricity.”

And Gartner’s conclusion, over half a year later: Pretty much no organizations are ready.

This doesn’t surprise me at all. It’s not that organizations don’t spend enough effort on disaster planning, although that’s true; it’s that this really isn’t the sort of disaster worth planning for.

Disaster planning is critically important for individuals, families, organizations large and small, and governments. For the individual, it can be as simple as spending a few minutes thinking about how he or she would respond to a disaster. For example, I’ve spent a lot of time thinking about what I would do if I lost the use of my computer, whether by equipment failure, theft or government seizure. As a result, I have a pretty complex backup and encryption system, ensuring that 1) I’d still have access to my data, and 2) no one else would. On the other hand, I haven’t given any serious thought to family disaster planning, although others have.
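
To make the shape of such a system concrete, here is a minimal sketch in Python (the paths are hypothetical, and the third-party cryptography library is assumed, not my actual setup) of those two goals: keep a copy of the data, and keep it unreadable to anyone without the key.

import os
import tarfile
from cryptography.fernet import Fernet  # pip install cryptography

def make_encrypted_backup(src_dir, out_path, key):
    archive = out_path + ".tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src_dir)  # goal 1: a second copy of the data exists
    with open(archive, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())  # goal 2: unreadable without the key
    os.remove(archive)  # don't leave the plaintext archive behind
    with open(out_path, "wb") as f:
        f.write(ciphertext)

key = Fernet.generate_key()  # store the key anywhere but the machine being backed up
make_encrypted_backup("/home/me/documents", "/mnt/offsite/docs.enc", key)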

For an organization, disaster planning can be much more complex. What would it do in the case of fire, flood, earthquake, and so on? How would its business survive? The resultant disaster plan might include backup data centers, temporary staffing contracts, planned degradation of services, and a host of other products and services—and consultants to tell you how to use it all.

And anyone who does this kind of thing knows that planning isn’t enough: Testing your disaster plan is critical. Far too often the backup software fails when it has to do an actual restore, or the diesel-powered emergency generator fails to kick in. That’s also the flaw with the emergency kit suggestions I linked to above; if you don’t know how to use a compass or first-aid kit, having one in your car won’t do you much good.
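
That kind of restore drill is easy to automate. A minimal sketch, with hypothetical directory names: after restoring a backup into a scratch directory, check that every original file came back byte for byte.

import hashlib
import os

# Map each file under root to the SHA-256 hash of its contents.
def dir_hashes(root):
    hashes = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                hashes[os.path.relpath(path, root)] = hashlib.sha256(f.read()).hexdigest()
    return hashes

# A backup you have never restored is a hope, not a plan.
if dir_hashes("/home/me/documents") != dir_hashes("/tmp/restore-drill"):
    raise SystemExit("Restore drill failed: files differ or are missing.")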

But testing isn’t just valuable because it reveals practical problems with a plan. It also has enormous ancillary benefits for your organization in terms of communication and team building. There’s nothing like a good crisis to get people to rely on each other. Sometimes I think companies should forget about those team-building exercises that involve climbing trees and building fires, and instead pretend that a flood has taken out the primary data center.

It really doesn’t matter what disaster scenario you’re testing. The real disaster won’t be like the test, regardless of what you do, so just pick one and go. Whether you’re an individual trying to recover from a simulated virus attack, or an organization testing its response to a hypothetical shooter in the building, you’ll learn a lot about yourselves and your organization, as well as your plan.

There is a sweet spot, though, in disaster preparedness. Some disasters are too small or too common to worry about. (“We’re out of paper clips!? Call the Crisis Response Team together. I’ll get the Paper Clip Shortage Readiness Program Directive Manual Plan.”) And others are too large or too rare.

It makes no sense to plan for total annihilation of the continent, whether by nuclear or meteor strike: that’s obvious. But depending on the size of the planner, many other disasters are also too large to plan for. People can stockpile food and water to prepare for a hurricane that knocks out services for a few days, but not for a Katrina-like flood that knocks out services for months. Organizations can prepare for losing a data center due to a flood, fire, or hurricane, but not for a Black-Death-scale epidemic that would wipe out a third of the population. No one can fault bond trading firm Cantor Fitzgerald, which lost two thirds of its employees in the 9/11 attack on the World Trade Center, for not having a plan in place to deal with that possibility.

Another consideration is scope. If your corporate headquarters burns down, it’s actually a bigger problem for you than a citywide disaster that does much more damage. If the whole San Francisco Bay Area were taken out by an earthquake, customers of affected companies would be far more likely to forgive lapses in service, or would go the extra mile to help out. Think of the nationwide response to 9/11; the human “just deal with it” social structures kicked in, and we all muddled through.

In general, you can only reasonably prepare for disasters that leave your world largely intact. If a third of the country’s population dies, it’s a different world. The economy is different, the laws are different—the world is different. You simply can’t plan for it; there’s no way you can know enough about what the new world will look like. Disaster planning only makes sense within the context of existing society.

What all of this means is that any bird flu pandemic will very likely fall outside the corporate disaster-planning sweet spot. We’re just guessing on its infectiousness, of course, but (despite the alarmism from two and three years ago), likely scenarios are either moderate to severe absenteeism because people are staying home for a few weeks—any organization ought to be able to deal with that—or a major disaster of proportions that dwarf the concerns of any organization. There’s not much in between.

Honestly, if you think you’re heading toward a world where you need to stash six weeks’ worth of food and water in your company’s closets, do you really believe that it will be enough to see you through to the other side?

A blogger commented on what I said in one article:

Schneier is using what I would call the nuclear war argument for doing nothing. If there’s a nuclear war nothing will be left anyway, so why waste your time stockpiling food or building fallout shelters? It’s entirely out of your control. It’s someone else’s responsibility. Don’t worry about it.

Almost. Bird flu, pandemics, and disasters in general—whether man-made like 9/11, natural like bird flu, or a combination like Katrina—are definitely things we should worry about. The proper place for bird flu planning is at the government level. (These are also the people who should worry about nuclear and meteor strikes.) But real disasters don’t exactly match our plans, and we are best served by a bunch of generic disaster plans and a smart, flexible organization that can deal with anything.

The key is preparedness. Much more important than planning, preparedness is about setting up social structures so that people fall into doing something sensible when things go wrong. Think of all the wasted effort—and even more wasted desire—to do something after Katrina because there was no way for most people to help. Preparedness is about getting people to react when there’s a crisis. It’s something the military trains its soldiers for.

This advice holds true for organizations, families, and individuals as well. And remember, despite what you read about nuclear accidents, suicide terrorism, genetically engineered viruses and mutant man-eating badgers, you live in the safest society in the history of mankind.

This essay originally appeared on Wired.com.

EDITED TO ADD (8/1): A good rebuttal.

Posted on July 26, 2007 at 7:14 AM

How Australian Authorities Respond to Potential Terrorists

Watch the video of how the Australian authorities react when someone—dressed either as an American or Arab tourist—films the Sydney Harbour Bridge and a nuclear reactor.

The synopsis: The Arab is intercepted within three minutes both times, while the U.S. tourist is given instructions on how to get inside the nuclear facility.

Moral for terrorists: dress like an American.

By the way, Lucas Heights is a research reactor: it produces medical isotopes and supports research, and doesn’t generate power.

Posted on April 24, 2007 at 7:12 AM

Security Through Begging

From TechDirt:

Last summer, the surprising news came out that Japanese nuclear secrets leaked out, after a contractor was allowed to connect his personal virus-infested computer to the network at a nuclear power plant. The contractor had a file sharing app on his laptop as well, and suddenly nuclear secrets were available to plenty of kids just trying to download the latest hit single. It’s only taken about nine months for the government to come up with its suggestion on how to prevent future leaks of this nature: begging all Japanese citizens not to use file sharing systems—so that the next time this happens, there won’t be anyone on the network to download such documents.

Even if their begging works, it solves the wrong problem. Sad.

EDITED TO ADD (3/22): Another article.

Posted on March 20, 2006 at 2:01 PM

Jamming Aircraft Navigation Near Nuclear Power Plants

The German government wants to jam aircraft navigation equipment near nuclear power plants.

This certainly could help if terrorists want to fly an airplane into a nuclear power plant, but it feels like a movie-plot threat to me. On the other hand, this could make things significantly worse if an airplane flies near the nuclear power plant by accident. My guess is that the latter happens far more often than the former.

Posted on September 29, 2005 at 6:40 AM

Finding Nuclear Power Plants

Recently I wrote about the government requiring pilots not to fly near nuclear power plants, and then not telling them where those plants are, because of security concerns. Here’s a story about how someone found the exact location of the nuclear power plant in Oyster Creek, N.J., using only publicly available information.

But of course a terrorist would never be able to do that.

Posted on April 6, 2005 at 9:05 AM

The Silliness of Secrecy

This is a great article on some of the ridiculous effects of government secrecy. (Unfortunately, you have to register to read it.)

Ever since Sept. 11, 2001, the federal government has advised airplane pilots against flying near 100 nuclear power plants around the country or they will be forced down by fighter jets. But pilots say there’s a hitch in the instructions: aviation security officials refuse to disclose the precise location of the plants because they consider that “SSI”—Sensitive Security Information.

“The message is: ‘please don’t fly there, but we can’t tell you where there is,’” says Melissa Rudinger of the Aircraft Owners and Pilots Association, a trade group representing 60% of American pilots.

Determined to find a way out of the Catch-22, the pilots’ group sat down with a commercial mapping company, and in a matter of days plotted the exact geographical locations of the plants from data found on the Internet and in libraries. It made the information available to its 400,000 members on its Web site—until officials from the Transportation Security Administration asked them to take the information down. “Their concern was that [terrorists] mining the Internet could use it,” Ms. Rudinger says.

And:

For example, when a top Federal Aviation Administration official testified last year before the 9/11 commission, his remarks were broadcast live nationally. But when the administration included a transcript in a recent report on threats to commercial airliners, the testimony was heavily edited. “How do you redact something that is part of the public record?” asked Rep. Carolyn Maloney (D., N.Y.) at a recent hearing on the problems of government overclassification. Among the specific words blacked out was the seemingly innocuous phrase: “we are hearing this, this, this, this and this.”

Government officials could not explain why the words were withheld, other than to note that they were designated SSI.

Posted on March 24, 2005 at 9:48 AM
