Entries Tagged "NSA"


NSA Helps Microsoft with Windows Vista

Is this a good idea or not?

For the first time, the giant software maker is acknowledging the help of the secretive agency, better known for eavesdropping on foreign officials and, more recently, U.S. citizens as part of the Bush administration’s effort to combat terrorism. The agency said it has helped in the development of the security of Microsoft’s new operating system — the brains of a computer — to protect it from worms, Trojan horses and other insidious computer attackers.

[…]

The NSA declined to comment on its security work with other software firms, but Sager said Microsoft is the only one “with this kind of relationship at this point where there’s an acknowledgment publicly.”

The NSA, which provided its service free, said it was Microsoft’s idea to acknowledge the spy agency’s role.

It’s called the “equities issue.” Basically, the NSA has two roles: eavesdrop on their stuff, and protect our stuff. When both sides use the same stuff — Windows Vista, for example — the agency has to decide whether to exploit vulnerabilities to eavesdrop on their stuff or close the same vulnerabilities to protect our stuff. In its partnership with Microsoft, it could have decided to go either way: to deliberately introduce vulnerabilities that it could exploit, or deliberately harden the OS to protect its own interests.

A few years ago I was ready to believe the NSA recognized we’re all safer with more secure general-purpose computers and networks, but in the post-9/11 take-the-gloves-off eavesdrop-on-everybody environment, I simply don’t trust the NSA to do the right thing.

“I kind of call it a Good Housekeeping seal” of approval, said Michael Cherry, a former Windows program manager who now analyzes the product for Directions on Microsoft, a firm that tracks the software maker.

Cherry says the NSA’s involvement can help counter the perception that Windows is not entirely secure and help create a perception that Microsoft has solved the security problems that have plagued it in the past. “Microsoft also wants to make the case that [the new Windows] is more secure than its earlier versions,” he said.

For some of us, the result is the exact opposite.

EDITED TO ADD (1/11): Another opinion.

Posted on January 9, 2007 at 12:43 PM

U.S. Government to Encrypt All Laptops

This is a good idea:

To address the issue of data leaks of the kind we’ve seen so often in the last year because of stolen or missing laptops, writes Saqib Ali, the Feds are planning to use Full Disk Encryption (FDE) on all Government-owned computers.

“On June 23, 2006 a Presidential Mandate was put in place requiring all agency laptops to fully encrypt data on the HDD. The U.S. Government is currently conducting the largest single side-by-side comparison and competition for the selection of a Full Disk Encryption product. The selected product will be deployed on millions of computers in the U.S. federal government space. This implementation will end up being the largest single implementation ever, and all of the information regarding the competition is in the public domain. The evaluation will come to an end in 90 days. You can view all the vendors competing and the list of requirements.”

Certainly, encrypting everything is overkill, but it’s much easier than figuring out what to encrypt and what not to. And I really like that there is an open competition to choose which encryption program to use. It’s certainly a high-stakes competition among the vendors, but one that is likely to improve the security of all products. I’ve long said that one of the best things the government can do to improve computer security is to use its vast purchasing power to pressure vendors to improve their security. I would expect the winner to make a lot of sales outside of the contract, and for the losers to correct their deficiencies so they’ll do better next time.

Side note: Key escrow is a requirement, something that makes sense in a government or corporate application:

Capable of secure escrow and recovery of the symetric [sic] encryption key
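To make the escrow-and-recovery requirement concrete, here is a toy sketch of my own (not drawn from the government’s requirements): the symmetric disk key is split into two random-looking shares, one held by the user and one by an escrow agent, and recovery requires XORing both together. Production FDE products use hardened key-wrapping schemes; this only illustrates the basic escrow idea.

```python
# Toy 2-of-2 key escrow: neither share alone reveals the disk key;
# recovery needs both. Illustrative only -- not a real FDE scheme.
import secrets

def escrow_split(key: bytes) -> tuple[bytes, bytes]:
    share_a = secrets.token_bytes(len(key))               # uniformly random share
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # key XOR share_a
    return share_a, share_b

def recover(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))

disk_key = secrets.token_bytes(32)  # 256-bit symmetric disk key
user_share, escrow_share = escrow_split(disk_key)
assert recover(user_share, escrow_share) == disk_key
```

Because each share is uniformly random on its own, the escrow agent learns nothing about the key until the user’s share is produced, which is why split-knowledge designs like this show up in corporate and government recovery policies.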

I wonder if the NSA is involved in the evaluation at all, and if its analysis will be made public.

Posted on January 3, 2007 at 2:00 PM

Fukuyama on Secrecy

From the New York Times:

All new threats entail huge uncertainties. Then, as now, there was a pronounced tendency to assume the worst, and for the government to claim enormous discretion in protecting the American public. The Bush administration has consistently argued that it needs to be protected from Congressional oversight and media scrutiny. An example is the National Security Agency’s warrantless surveillance of telephone traffic into and out of the United States. Rather than going to Congress and trying to negotiate changes to the law that regulates such activities, the administration simply grabbed that authority for itself, saying, in effect, “Trust us: if you knew what we know about the threat, you’d be perfectly happy to have us do what we’re doing.” In other areas, like the holding of prisoners in Guantanamo and interrogation methods used there and in the Middle East, one can only quote Moynihan on an earlier era: “As fears of Communist conspiracies and German subversion mounted, it was the U.S. government’s conduct that approached the illegal.”

Even if we do not at this juncture know the full scope of the threat we face from jihadist terrorism, it is certainly large enough to justify many changes in the way we conduct our lives, both at home and abroad. But the American government does have a track record in dealing with similar problems in the past, one suggesting that all American institutions — Congress, the courts, the news media — need to do their jobs in scrutinizing official behavior, and not take the easy way out of deferring to the executive. Past experience also suggests that the government would do far better to make public what it knows, as well as the limits of that knowledge, if we are to arrive at a balanced view of the challenges we face today.

Posted on October 12, 2006 at 6:54 AM

Indexes to NSA Publications Declassified and Online

In May 2003, Michael Ravnitzky submitted a Freedom of Information Act (FOIA) request to the National Security Agency for a copy of the index to their historical reports at the Center for Cryptologic History and the index to certain journals: the NSA Technical Journal and the Cryptographic Quarterly. These journals had been mentioned in the literature but are not available to the public. Because he thought NSA might be reluctant to release the bibliographic indexes, he also asked for the table of contents to each issue.

The request took more than three years for them to process and declassify — sadly, not atypical — and during the process they asked if he would accept the indexes in lieu of the tables of contents pages: specifically, the cumulative indices that included all the previous material in the earlier indices. He agreed, and got them last month. The results are here.

This is just a sampling of some of the article titles from the NSA Technical Journal:

“The Arithmetic of a Generation Principle for an Electronic Key Generator” · “CATNIP: Computer Analysis – Target Networks Intercept Probability” · “Chatter Patterns: A Last Resort” · “COMINT Satellites – A Space Problem” · “Computers and Advanced Weapons Systems” · “Coupon Collecting and Cryptology” · “Cranks, Nuts, and Screwballs” · “A Cryptologic Fairy Tale” · “Don’t Be Too Smart” · “Earliest Applications of the Computer at NSA” · “Emergency Destruction of Documents” · “Extraterrestrial Intelligence” · “The Fallacy of the One-Time-Pad Excuse” · “GEE WHIZZER” · “The Gweeks Had a Gwoup for It” · “How to Visualize a Matrix” · “Key to the Extraterrestrial Messages” · “A Mechanical Treatment of Fibonacci Sequences” · “Q.E.D. – 2 Hours, 41 Minutes” · “SIGINT Implications of Military Oceanography” · “Some Problems and Techniques in Bookbreaking” · “Upgrading Selected US Codes and Ciphers with a Cover and Deception Capability” · “Weather: Its Role in Communications Intelligence” · “Worldwide Language Problems at NSA”

In the materials the NSA provided, they also included indices to two other publications: Cryptologic Spectrum and Cryptologic Almanac.

The Cryptologic Quarterly and NSA Technical Journal indices are organized by title, author, and keyword. The Cryptologic Spectrum index is organized by author, title, and issue.

Consider these bibliographic tools as stepping stones. If you want an article, send a FOIA request for it. Send a FOIA request for a dozen. There’s a lot of stuff here that would help elucidate the early history of the agency and some interesting cryptographic topics.

Thanks, Mike, for doing this work.

Posted on September 26, 2006 at 12:58 PM

Terrorists, Data Mining, and the Base Rate Fallacy

I have already explained why NSA-style wholesale surveillance data-mining systems are useless for finding terrorists. Here’s a more formal explanation:

Floyd Rudmin, a professor at a Norwegian university, applies the mathematics of conditional probability, known as Bayes’ Theorem, to demonstrate that the NSA’s surveillance cannot successfully detect terrorists unless both the percentage of terrorists in the population and the accuracy rate of their identification are far higher than they are. He correctly concludes that “NSA’s surveillance system is useless for finding terrorists.”

The surveillance is, however, useful for monitoring political opposition and stymieing the activities of those who do not believe the government’s propaganda.

And here’s the analysis:

What is the probability that people are terrorists given that NSA’s mass surveillance identifies them as terrorists? If the probability is zero (p=0.00), then they certainly are not terrorists, and NSA was wasting resources and damaging the lives of innocent citizens. If the probability is one (p=1.00), then they definitely are terrorists, and NSA has saved the day. If the probability is fifty-fifty (p=0.50), that is the same as guessing the flip of a coin. The conditional probability that people are terrorists, given that the NSA surveillance system says they are, had better be very near to one (p=1.00) and very far from zero (p=0.00).

The mathematics of conditional probability was figured out by the English logician Thomas Bayes. If you Google “Bayes’ Theorem”, you will get more than a million hits. Bayes’ Theorem is taught in all elementary statistics classes. Everyone at NSA certainly knows Bayes’ Theorem.

To know if mass surveillance will work, Bayes’ theorem requires three estimations:

  1. The base-rate for terrorists, i.e., what proportion of the population are terrorists;
  2. The accuracy rate, i.e., the probability that real terrorists will be identified by NSA;
  3. The misidentification rate, i.e., the probability that innocent citizens will be misidentified by NSA as terrorists.

No matter how sophisticated and super-duper are NSA’s methods for identifying terrorists, no matter how big and fast are NSA’s computers, NSA’s accuracy rate will never be 100% and their misidentification rate will never be 0%. That fact, plus the extremely low base-rate for terrorists, means it is logically impossible for mass surveillance to be an effective way to find terrorists.

I will not put Bayes’ computational formula here. It is available in all elementary statistics books and is on the web should any readers be interested. But I will compute some conditional probabilities that people are terrorists given that NSA’s system of mass surveillance identifies them to be terrorists.

The US Census shows that there are about 300 million people living in the USA.

Suppose that there are 1,000 terrorists there as well, which is probably a high estimate. The base-rate would be 1 terrorist per 300,000 people. In percentages, that is .00033%, which is way less than 1%. Suppose that NSA surveillance has an accuracy rate of .40, which means that 40% of real terrorists in the USA will be identified by NSA’s monitoring of everyone’s email and phone calls. This is probably a high estimate, considering that terrorists are doing their best to avoid detection. There is no evidence thus far that NSA has been so successful at finding terrorists. And suppose NSA’s misidentification rate is .0001, which means that .01% of innocent people will be misidentified as terrorists, at least until they are investigated, detained and interrogated. Note that .01% of the US population is 30,000 people.

With these suppositions, the probability that people are terrorists given that NSA’s system of surveillance identifies them as terrorists is only p=0.0132, which is near zero, very far from one. Ergo, NSA’s surveillance system is useless for finding terrorists.

Suppose that NSA’s system is more accurate than .40, let’s say, .70, which means that 70% of terrorists in the USA will be found by mass monitoring of phone calls and email messages. Then, by Bayes’ Theorem, the probability that a person is a terrorist if targeted by NSA is still only p=0.0228, which is near zero, far from one, and useless.

Suppose that NSA’s system is really, really good, with an accuracy rate of .90 and a misidentification rate of .00001, which means that only 3,000 innocent people are misidentified as terrorists. With these suppositions, the probability that people are terrorists given that NSA’s system of surveillance identifies them as terrorists is only p=0.2308, which is far from one and well below flipping a coin. NSA’s domestic monitoring of everyone’s email and phone calls is useless for finding terrorists.
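Rudmin’s three scenarios are easy to check. The sketch below is a straightforward transcription of Bayes’ Theorem into Python; the scenario values are the ones quoted above, not real NSA figures.

```python
# Bayes' Theorem: P(terrorist | flagged), given a base rate, an accuracy
# rate (true-positive rate), and a misidentification (false-positive) rate.
def posterior(base_rate, accuracy, misid_rate):
    true_pos = accuracy * base_rate           # flagged and actually a terrorist
    false_pos = misid_rate * (1 - base_rate)  # flagged but innocent
    return true_pos / (true_pos + false_pos)

base = 1 / 300_000  # 1,000 terrorists among 300 million people

print(round(posterior(base, 0.40, 0.0001), 4))   # 0.0132
print(round(posterior(base, 0.70, 0.00001), 4))
print(round(posterior(base, 0.90, 0.00001), 4))  # 0.2308
```

The first and third lines reproduce the p=0.0132 and p=0.2308 figures from the scenarios above; even with generous assumptions, the tiny base rate keeps the posterior far below a coin flip.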

As an exercise to the reader, you can use the same analysis to show that data mining is an excellent tool for finding stolen credit cards, or stolen cell phones. Data mining is by no means useless; it’s just useless for this particular application.
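To see the point of that exercise, run the same calculation with a hypothetical fraud base rate. The 1-in-1,000 figure and the detector rates below are illustrative assumptions of mine, not numbers from the article: the identical detector that is useless for finding terrorists becomes useful when the thing you are looking for is common.

```python
# Same Bayes calculation, two base rates (hypothetical numbers throughout).
def posterior(base_rate, accuracy, misid_rate):
    true_pos = accuracy * base_rate
    false_pos = misid_rate * (1 - base_rate)
    return true_pos / (true_pos + false_pos)

terrorism = posterior(1 / 300_000, 0.90, 0.0001)  # ~0.03: nearly every flag is an innocent
card_fraud = posterior(1 / 1_000, 0.90, 0.0001)   # ~0.90: most flags are real fraud
```

The detector's quality is identical in both lines; only the base rate changes, and that alone moves the posterior from near zero to near one.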

Posted on July 10, 2006 at 7:15 AM

NSA Combing Through MySpace

No surprise.

New Scientist has discovered that the Pentagon’s National Security Agency, which specialises in eavesdropping and code-breaking, is funding research into the mass harvesting of the information that people post about themselves on social networks. And it could harness advances in internet technology – specifically the forthcoming “semantic web” championed by the web standards organisation W3C – to combine data from social networking websites with details such as banking, retail and property records, allowing the NSA to build extensive, all-embracing personal profiles of individuals.

Posted on June 15, 2006 at 6:13 AM
