Entries Tagged "national security policy"


New U.S. Government Cybersecurity Position

From InfoWorld:

The Department of Homeland Security Cybersecurity Enhancement Act, approved by the House Subcommittee on Economic Security, Infrastructure Protection and Cybersecurity, would create the position of assistant secretary for cybersecurity at DHS. The bill, sponsored by Representatives Mac Thornberry, a Texas Republican, and Zoe Lofgren, a California Democrat, would also make the assistant secretary responsible for establishing a national cybersecurity threat reduction program and a national cybersecurity training program….

The top cybersecurity official at DHS has been the director of the agency’s National Cyber Security Division, a lower-level position, and technology trade groups for several months have been calling for a higher-level position that could make cybersecurity a higher priority at DHS.

Sadly, this isn’t going to amount to anything. Yes, it’s good to have a higher-level official in charge of cybersecurity. But responsibility without authority doesn’t work. A bigger bully pulpit isn’t going to help without a coherent plan behind it, and we have none.

The absolute best thing the DHS could do for cybersecurity would be to coordinate the U.S. government’s enormous purchasing power and demand more secure hardware and software.

Here’s the text of the act, if anyone cares.

Posted on May 6, 2005 at 8:05 AM

The PITAC Report on CyberSecurity

I finally got around to reading the President’s Information Technology Advisory Committee (PITAC) report entitled “Cyber Security: A Crisis of Prioritization” (dated February 2005). The report looks at the current state of federal involvement in cybersecurity research, and makes recommendations for the future. It’s a good report, and one which the administration would do well to listen to.

The report’s recommendations are based on two observations: 1) cybersecurity research is primarily focused on current threats rather than long-term threats, and 2) there simply aren’t enough cybersecurity researchers, nor any good mechanism for producing them. The federal government isn’t doing enough to foster cybersecurity research, and the effects of this shortfall will be felt more in the long term than in the short term.

To remedy this problem, the report makes four specific recommendations (in much more detail than I summarize here). One, the government needs to increase funding for basic cybersecurity research. Two, the government needs to increase the number of researchers working in cybersecurity. Three, the government needs to better foster the transfer of technology from research to product development. And four, the government needs to improve its own cybersecurity coordination and oversight. Four good recommendations.

More specifically, the report lists ten technologies that need more research. They are (not in any priority order):

Authentication Technologies
Secure Fundamental Protocols
Secure Software Engineering and Software Assurance
Holistic System Security
Monitoring and Detection
Mitigation and Recovery Methodologies
Cyber Forensics
Modeling and Testbeds for New Technologies
Metrics, Benchmarks, and Best Practices
Non-Technology Issues that Can Compromise Cyber Security

It’s a good list, and I am especially pleased to see the tenth item—one that is usually forgotten. I would add something on the order of “Dynamic Cyber Security Systems”—I think we need serious basic research into how systems should react to new threats and how to update the security of already-fielded systems—but that’s all I would change.

The report itself is a bit repetitive, but it’s definitely worth skimming.

Posted on April 27, 2005 at 8:52 AM

Security Trade-Offs

An essay by an anonymous CSO. This is how it begins:

On any given day, we CSOs come to work facing a multitude of security risks. They range from a sophisticated hacker breaching the network to a common thug picking a lock on the loading dock and making off with company property. Each of these scenarios has a probability of occurring and a payout (in this case, a cost to the company) should it actually occur. To guard against these risks, we have a finite budget of resources in the way of time, personnel, money and equipment—poker chips, if you will.

If we’re good gamblers, we put those chips where there is the highest probability of winning a high payout. In other words, we guard against risks that are most likely to occur and that, if they do occur, will cost the company the most money. We could always be better, but as CSOs, I think we’re getting pretty good at this process. So lately I’ve been wondering—as I watch spending on national security continue to skyrocket, with diminishing marginal returns—why we as a nation can’t apply this same logic to national security spending. If we did this, the war on terrorism would look a lot different. In fact, it might even be over.

The whole thing is worth reading.
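To make the quoted “poker chips” logic concrete, here is a minimal sketch in Python of ranking risks by expected loss (probability times cost) and spending a fixed budget on the largest ones first. All of the risk names, probabilities, costs, and mitigation prices below are invented for illustration.

```python
budget = 500_000  # hypothetical annual security budget, in dollars

risks = [
    # (name, annual probability, cost to the company if it occurs, cost to mitigate)
    ("network breach by a skilled attacker", 0.05, 5_000_000, 200_000),
    ("laptop lost with customer data",       0.20,   750_000, 120_000),
    ("theft from the loading dock",          0.30,   100_000,  40_000),
]

# Put the chips where the expected loss (probability * cost) is largest.
risks.sort(key=lambda r: r[1] * r[2], reverse=True)

for name, prob, cost, mitigation in risks:
    expected_loss = prob * cost
    if mitigation <= budget and mitigation < expected_loss:
        budget -= mitigation
        decision = "mitigate"
    else:
        decision = "accept"
    print(f"{decision:8} {name}: expected loss ${expected_loss:,.0f}")
```

A real analysis would also weigh how much each countermeasure actually reduces the probability, but the ranking step is the point of the essay’s analogy.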

Posted on April 22, 2005 at 12:32 PM

Secrecy and Security

Nice op-ed on the security problems with secrecy.

Some information that previously was open no doubt needs to be classified now. Terrorism alters perspectives. But the terrorist threat also has provided cover for bureaucrats who instinctively opt for secrecy and public officials who would prefer to keep the public in the dark to avoid accountability.

Posted on April 7, 2005 at 9:40 AM

The Silliness of Secrecy

This is a great article on some of the ridiculous effects of government secrecy. (Unfortunately, you have to register to read it.)

Ever since Sept. 11, 2001, the federal government has advised airplane pilots against flying near 100 nuclear power plants around the country or they will be forced down by fighter jets. But pilots say there’s a hitch in the instructions: aviation security officials refuse to disclose the precise location of the plants because they consider that “SSI”—Sensitive Security Information.

“The message is: ‘Please don’t fly there, but we can’t tell you where there is,'” says Melissa Rudinger of the Aircraft Owners and Pilots Association, a trade group representing 60% of American pilots.

Determined to find a way out of the Catch-22, the pilots’ group sat down with a commercial mapping company, and in a matter of days plotted the exact geographical locations of the plants from data found on the Internet and in libraries. It made the information available to its 400,000 members on its Web site—until officials from the Transportation Security Administration asked them to take the information down. “Their concern was that [terrorists] mining the Internet could use it,” Ms. Rudinger says.

And:

For example, when a top Federal Aviation Administration official testified last year before the 9/11 commission, his remarks were broadcast live nationally. But when the administration included a transcript in a recent report on threats to commercial airliners, the testimony was heavily edited. “How do you redact something that is part of the public record?” asked Rep. Carolyn Maloney (D., N.Y.) at a recent hearing on the problems of government overclassification. Among the specific words blacked out were the seemingly innocuous phrase: “we are hearing this, this, this, this and this.”

Government officials could not explain why the words were withheld, other than to note that they were designated SSI.

Posted on March 24, 2005 at 9:48 AM

Sensitive Security Information (SSI)

For decades, the U.S. government has had systems in place for dealing with military secrets. Information is classified as either Confidential, Secret, Top Secret, or one of many “compartments” of information above Top Secret. Procedures for dealing with classified information were rigid: classified topics could not be discussed on unencrypted phone lines, classified information could not be processed on insecure computers, classified documents had to be stored in locked safes, and so on. The procedures were extreme because the assumed adversary was highly motivated, well-funded, and technically adept: the Soviet Union.

You might argue with the government’s decision to classify this and not that, or the length of time information remained classified, but if you assume the information needed to remain secret, then the procedures made sense.

In 1993, the U.S. government created a new classification of information—Sensitive Security Information—that was exempt from the Freedom of Information Act. The information under this category, as defined by a D.C. court, was limited to information related to the safety of air passengers. This was greatly expanded in 2002, when Congress deleted two words, “air” and “passengers,” and changed “safety” to “security.” Currently, there’s a lot of information covered under this umbrella.

The rules for SSI information are much more relaxed than the rules for traditional classified information. Before someone can have access to classified information, he must get a government clearance. Before someone can have access to SSI, he simply must sign an NDA. If someone discloses classified information, he faces criminal penalties. If someone discloses SSI, he faces civil penalties.

SSI can be sent unencrypted in e-mail; a simple password-protected file is enough. A person can take SSI home with him, read it on an airplane, and talk about it in public places. People entrusted with SSI information shouldn’t disclose it to those unauthorized to know it, but it’s really up to the individual to make sure that doesn’t happen. It’s really more like confidential corporate information than government military secrets.

The U.S. government really had no choice but to establish this classification level, given the kind of information they needed to work with. For example, the terrorist “watch” list is SSI. If the list falls into the wrong hands, it would be bad for national security. But think about the number of people who need access to the list. Every airline needs a copy, so they can determine if any of their passengers are on the list. That’s not just domestic airlines, but foreign airlines as well—including foreign airlines that may not agree with American foreign policy. Police departments, both within this country and abroad, need access to the list. My guess is that more than 10,000 people have access to this list, and there’s no possible way to give all of them a security clearance. Either the U.S. government relaxes the rules about who can have access to the list, or the list doesn’t get used in the way the government wants.

On the other hand, the threat is completely different. Military classification levels and procedures were developed during the Cold War, and reflected the Soviet threat. The terrorist adversary is much more diffuse, much less well-funded, much less technologically advanced. SSI rules really make more sense in dealing with this kind of adversary than the military rules.

I’m impressed with the U.S. government SSI rules. You can always argue about whether a particular piece of information needs to be kept secret, and how classifications like SSI can be used to conduct government in secret. But if you take secrecy as an assumption, SSI defines a reasonable set of secrecy rules against a new threat.

Background on SSI

TSA’s regulation on the protection of SSI

Controversies surrounding SSI

My essay explaining why secrecy is often bad for security

Posted on March 8, 2005 at 10:37 AM

GovCon

There’s a conference in Washington, DC, in March that explores technologies for intelligence and terrorism prevention.

The 4th Annual Government Convention on Emerging Technologies will focus on the impact of the Intelligence Reform and Terrorism Prevention Act signed into law by President Bush in December 2004.

The departments and agencies of the National Security Community are currently engaged in the most comprehensive transformation of policy, structure, doctrine, and capabilities since the National Security Act of 1947.

Many of the legal, policy, organizational, and cultural challenges to manage the National Security Community as an enterprise and provide a framework for fielding new capabilities are being addressed. However, there are many emerging technologies and commercial best practices available to help the National Security Community achieve its critical mission of keeping America safe and secure.

There’s a lot of interesting stuff on the agenda, including some classified sessions. I’m especially interested in this track:

Track Two: Attaining Tailored Persistence

Explore the technologies required to attain persistent surveillance and tailored persistence.

What does “persistent surveillance” mean, anyway?

Posted on February 3, 2005 at 9:07 AM
