Entries Tagged "national security policy"

New Cybersecurity Position at DHS

There’s a major reorganization going on at the Department of Homeland Security. One of the effects is the creation of a new post: assistant secretary for cyber and telecommunications security.

Honestly, it doesn’t matter where the nation’s cybersecurity chief sits in the organizational chart. If he has the authority to spend money and write regulations, he can do good. If he only has the power to suggest, plead, and cheerlead, he’ll be as frustrated as all the previous ones were.

Posted on July 20, 2005 at 7:44 AM

Billions Wasted on Anti-Terrorism Security

Recently there have been a bunch of news articles about how lousy counterterrorism security is in the United States, how billions of dollars have been wasted on security since 9/11, and how much of what was purchased doesn’t work as advertised.

The first is from the May 8 New York Times (available at the website for pay, but there are copies here and here):

After spending more than $4.5 billion on screening devices to monitor the nation’s ports, borders, airports, mail and air, the federal government is moving to replace or alter much of the antiterrorism equipment, concluding that it is ineffective, unreliable or too expensive to operate.

Many of the monitoring tools—intended to detect guns, explosives, and nuclear and biological weapons—were bought during the blitz in security spending after the attacks of Sept. 11, 2001.

In its effort to create a virtual shield around America, the Department of Homeland Security now plans to spend billions of dollars more. Although some changes are being made because of technology that has emerged in the last couple of years, many of them are planned because devices currently in use have done little to improve the nation’s security, according to a review of agency documents and interviews with federal officials and outside experts.

From another part of the article:

Among the problems:

  • Radiation monitors at ports and borders that cannot differentiate between radiation emitted by a nuclear bomb and naturally occurring radiation from everyday material like cat litter or ceramic tile.
  • Air-monitoring equipment in major cities that is only marginally effective because not enough detectors were deployed and were sometimes not properly calibrated or installed. They also do not produce results for up to 36 hours—long after a biological attack would potentially infect thousands of people.
  • Passenger-screening equipment at airports that auditors have found is no more likely than before federal screeners took over to detect whether someone is trying to carry a weapon or a bomb aboard a plane.
  • Postal Service machines that test only a small percentage of mail and look for anthrax but no other biological agents.

The Washington Post had a series of articles. The first lists some more problems:

  • The contract to hire airport passenger screeners grew to $741 million from $104 million in less than a year. The screeners are failing to detect weapons at roughly the same rate as shortly after the attacks.
  • The contract for airport bomb-detection machines ballooned to at least $1.2 billion from $508 million over 18 months. The machines have been hampered by high false-alarm rates.
  • A contract for a computer network called US-VISIT to screen foreign visitors could cost taxpayers $10 billion. It relies on outdated technology that puts the project at risk.
  • Radiation-detection machines worth a total of a half-billion dollars deployed to screen trucks and cargo containers at ports and borders have trouble distinguishing between highly enriched uranium and common household products. The problem has prompted costly plans to replace the machines.

The second is about border security.

And more recently, a New York Times article on how lousy port security is.

There are a lot of morals here: the problems of believing companies that have something to sell you, the difficulty of making technological security solutions work, the problems with making major security changes quickly, the mismanagement that comes from any large bureaucracy like the DHS, and the wastefulness of defending potential terrorist targets instead of broadly trying to deal with terrorism.

Posted on June 3, 2005 at 8:17 AM

New U.S. Government Cybersecurity Position

From InfoWorld:

The Department of Homeland Security Cybersecurity Enhancement Act, approved by the House Subcommittee on Economic Security, Infrastructure Protection and Cybersecurity, would create the position of assistant secretary for cybersecurity at DHS. The bill, sponsored by Representatives Mac Thornberry, a Texas Republican, and Zoe Lofgren, a California Democrat, would also make the assistant secretary responsible for establishing a national cybersecurity threat reduction program and a national cybersecurity training program….

The top cybersecurity official at DHS has been the director of the agency’s National Cyber Security Division, a lower-level position, and technology trade groups for several months have been calling for a higher-level position that could make cybersecurity a higher priority at DHS.

Sadly, this isn’t going to amount to anything. Yes, it’s good to have a higher-level official in charge of cybersecurity. But responsibility without authority doesn’t work. A bigger bully pulpit isn’t going to help without a coherent plan behind it, and we have none.

The absolute best thing the DHS could do for cybersecurity would be to coordinate the U.S. government’s enormous purchasing power and demand more secure hardware and software.

Here’s the text of the act, if anyone cares.

Posted on May 6, 2005 at 8:05 AM

The PITAC Report on CyberSecurity

I finally got around to reading the President’s Information Technology Advisory Committee (PITAC) report entitled “Cyber Security: A Crisis of Prioritization” (dated February 2005). The report looks at the current state of federal involvement in cybersecurity research, and makes recommendations for the future. It’s a good report, and one which the administration would do well to listen to.

The report’s recommendations are based on two observations: 1) cybersecurity research is primarily focused on current threats rather than long-term ones, and 2) there simply aren’t enough cybersecurity researchers, and no good mechanism for producing them. The federal government isn’t doing enough to foster cybersecurity research, and the effects of this shortfall will be felt more in the long term than in the short term.

To remedy this problem, the report makes four specific recommendations (in much more detail than I summarize here). One, the government needs to increase funding for basic cybersecurity research. Two, the government needs to increase the number of researchers working in cybersecurity. Three, the government needs to better foster the transfer of technology from research to product development. And four, the government needs to improve its own cybersecurity coordination and oversight. Four good recommendations.

More specifically, the report lists ten technologies that need more research. They are (not in any priority order):

  • Authentication Technologies
  • Secure Fundamental Protocols
  • Secure Software Engineering and Software Assurance
  • Holistic System Security
  • Monitoring and Detection
  • Mitigation and Recovery Methodologies
  • Cyber Forensics
  • Modeling and Testbeds for New Technologies
  • Metrics, Benchmarks, and Best Practices
  • Non-Technology Issues that Can Compromise Cyber Security

It’s a good list, and I am especially pleased to see the tenth item—one that is usually forgotten. I would add something on the order of “Dynamic Cyber Security Systems”—I think we need serious basic research in how systems should react to new threats and how to update the security of already-fielded systems—but that’s all I would change.

The report itself is a bit repetitive, but it’s definitely worth skimming.

Posted on April 27, 2005 at 8:52 AM

Security Trade-Offs

An essay by an anonymous CSO. This is how it begins:

On any given day, we CSOs come to work facing a multitude of security risks. They range from a sophisticated hacker breaching the network to a common thug picking a lock on the loading dock and making off with company property. Each of these scenarios has a probability of occurring and a payout (in this case, a cost to the company) should it actually occur. To guard against these risks, we have a finite budget of resources in the way of time, personnel, money and equipment—poker chips, if you will.

If we’re good gamblers, we put those chips where there is the highest probability of winning a high payout. In other words, we guard against risks that are most likely to occur and that, if they do occur, will cost the company the most money. We could always be better, but as CSOs, I think we’re getting pretty good at this process. So lately I’ve been wondering—as I watch spending on national security continue to skyrocket, with diminishing marginal returns—why we as a nation can’t apply this same logic to national security spending. If we did this, the war on terrorism would look a lot different. In fact, it might even be over.

The whole thing is worth reading.
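The essay’s poker analogy is just expected-value arithmetic: multiply each risk’s probability by its cost, rank the results, and put your limited budget behind the risks at the top of the list. Here’s a minimal sketch of that calculation in Python; every risk name, probability, and dollar figure below is a made-up placeholder for illustration, not a number from the essay.

# Rank security risks by expected loss (probability times cost),
# the same logic the CSO describes with poker chips.
# All values are hypothetical examples.
risks = [
    # (risk, annual probability of occurrence, cost to the company if it occurs)
    ("network breach by a sophisticated hacker", 0.05, 2_000_000),
    ("laptop theft",                             0.30,    50_000),
    ("loading-dock burglary",                    0.10,   100_000),
    ("insider data leak",                        0.02, 1_000_000),
]

# Sort descending by expected loss; spend the budget from the top down.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

for name, p, cost in ranked:
    print(f"{name:45s} expected annual loss: ${p * cost:>10,.0f}")

The hard part in practice is estimating the probabilities and costs, not the arithmetic; the essay’s complaint is that national security spending skips even this basic step.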

Posted on April 22, 2005 at 12:32 PM

Secrecy and Security

Nice op-ed on the security problems with secrecy.

Some information that previously was open no doubt needs to be classified now. Terrorism alters perspectives. But the terrorist threat also has provided cover for bureaucrats who instinctively opt for secrecy and public officials who would prefer to keep the public in the dark to avoid accountability.

Posted on April 7, 2005 at 9:40 AM

The Silliness of Secrecy

This is a great article on some of the ridiculous effects of government secrecy. (Unfortunately, you have to register to read it.)

Ever since Sept. 11, 2001, the federal government has advised airplane pilots against flying near 100 nuclear power plants around the country or they will be forced down by fighter jets. But pilots say there’s a hitch in the instructions: aviation security officials refuse to disclose the precise location of the plants because they consider that “SSI”—Sensitive Security Information.

“The message is: ‘Please don’t fly there, but we can’t tell you where there is,’” says Melissa Rudinger of the Aircraft Owners and Pilots Association, a trade group representing 60% of American pilots.

Determined to find a way out of the Catch-22, the pilots’ group sat down with a commercial mapping company, and in a matter of days plotted the exact geographical locations of the plants from data found on the Internet and in libraries. It made the information available to its 400,000 members on its Web site—until officials from the Transportation Security Administration asked them to take the information down. “Their concern was that [terrorists] mining the Internet could use it,” Ms. Rudinger says.

And:

For example, when a top Federal Aviation Administration official testified last year before the 9/11 commission, his remarks were broadcast live nationally. But when the administration included a transcript in a recent report on threats to commercial airliners, the testimony was heavily edited. “How do you redact something that is part of the public record?” asked Rep. Carolyn Maloney (D., N.Y.) at a recent hearing on the problems of government overclassification. Among the specific words blacked out was the seemingly innocuous phrase: “we are hearing this, this, this, this and this.”

Government officials could not explain why the words were withheld, other than to note that they were designated SSI.

Posted on March 24, 2005 at 9:48 AM

Sensitive Security Information (SSI)

For decades, the U.S. government has had systems in place for dealing with military secrets. Information is classified as either Confidential, Secret, Top Secret, or one of many “compartments” of information above Top Secret. Procedures for dealing with classified information were rigid: classified topics could not be discussed on unencrypted phone lines, classified information could not be processed on insecure computers, classified documents had to be stored in locked safes, and so on. The procedures were extreme because the assumed adversary was highly motivated, well-funded, and technically adept: the Soviet Union.

You might argue with the government’s decision to classify this and not that, or the length of time information remained classified, but if you assume the information needed to remain secret, then the procedures made sense.

In 1993, the U.S. government created a new classification of information—Sensitive Security Information—that was exempt from the Freedom of Information Act. The information under this category, as defined by a D.C. court, was limited to information related to the safety of air passengers. This was greatly expanded in 2002, when Congress deleted two words, “air” and “passengers,” and changed “safety” to “security.” Currently, there’s a lot of information covered under this umbrella.

The rules for SSI information are much more relaxed than the rules for traditional classified information. Before someone can have access to classified information, he must get a government clearance. Before someone can have access to SSI, he simply must sign an NDA. If someone discloses classified information, he faces criminal penalties. If someone discloses SSI, he faces civil penalties.

SSI can be sent unencrypted in e-mail; a simple password-protected file is enough. A person can take SSI home with him, read it on an airplane, and talk about it in public places. People entrusted with SSI shouldn’t disclose it to those unauthorized to know it, but it’s up to the individual to make sure that doesn’t happen. It’s really more like confidential corporate information than government military secrets.

The U.S. government really had no choice but to establish this classification level, given the kind of information it needed to work with. For example, the terrorist “watch” list is SSI. If the list falls into the wrong hands, it would be bad for national security. But think about the number of people who need access to the list. Every airline needs a copy, so it can determine if any of its passengers are on the list. That’s not just domestic airlines, but foreign airlines as well—including foreign airlines that may not agree with American foreign policy. Police departments, both within this country and abroad, need access to the list. My guess is that more than 10,000 people have access to this list, and there’s no possible way to give them all a security clearance. Either the U.S. government relaxes the rules about who can have access to the list, or the list doesn’t get used in the way the government wants.

On the other hand, the threat is completely different. Military classification levels and procedures were developed during the Cold War and reflected the Soviet threat. The terrorist adversary is much more diffuse, much less well-funded, and much less technologically advanced. SSI rules make more sense than the military rules for dealing with this kind of adversary.

I’m impressed with the U.S. government SSI rules. You can always argue about whether a particular piece of information needs to be kept secret, and how classifications like SSI can be used to conduct government in secret. But if you take secrecy as an assumption, SSI defines a reasonable set of secrecy rules against a new threat.

Background on SSI

TSA’s regulation on the protection of SSI

Controversies surrounding SSI

My essay explaining why secrecy is often bad for security

Posted on March 8, 2005 at 10:37 AM
