Crypto-Gram

April 15, 2009

by Bruce Schneier
Chief Security Technology Officer, BT
schneier@schneier.com
http://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <http://www.schneier.com/crypto-gram-0904.html>. These same essays appear in the “Schneier on Security” blog: <http://www.schneier.com/>. An RSS feed is available.


In this issue:
      Fourth Annual Movie-Plot Threat Contest
      Who Should Be in Charge of U.S. Cybersecurity?
      News
      Privacy and the Fourth Amendment
      Schneier News
      The Definition of “Weapon of Mass Destruction”
      Stealing Commodities
      Comments from Readers


Fourth Annual Movie-Plot Threat Contest

Let’s face it, the War on Terror is a tired brand. There just isn’t enough action out there to scare people. If this keeps up, people will forget to be scared. And then both the terrorists and the terror-industrial complex lose. We can’t have that.

We’re going to help revive the fear. There’s plenty to be scared about, if only people would just think about it in the right way. In this Fourth Movie-Plot Threat Contest, the object is to find an existing event somewhere in the industrialized world—Third World events are just too easy—and provide a conspiracy theory to explain how the terrorists were really responsible.

The goal here is to be outlandish but plausible, ridiculous but possible, and—if it were only true—terrifying. Entries should be formatted as a news story, and are limited to 150 words (I’m going to check this time) because fear needs to be instilled in a population with short attention spans. Submit your entry, by the end of the month, in comments to the blog post.

Submit your entry here:
https://www.schneier.com/blog/archives/2009/04/…

An example from The Onion:
http://www.theonion.com/content/cartoon/feb-23-2009

The First Movie-Plot Threat Contest:
https://www.schneier.com/blog/archives/2006/04/…
https://www.schneier.com/blog/archives/2006/06/…

The Second Movie-Plot Threat Contest:
https://www.schneier.com/blog/archives/2007/04/…
https://www.schneier.com/blog/archives/2007/06/…
https://www.schneier.com/blog/archives/2007/06/…

The Third Movie-Plot Threat Contest:
https://www.schneier.com/blog/archives/2008/04/…
https://www.schneier.com/blog/archives/2008/05/…
https://www.schneier.com/blog/archives/2008/05/…


Who Should Be in Charge of U.S. Cybersecurity?

U.S. government cybersecurity is an insecure mess, and fixing it is going to take considerable attention and resources. Trying to make sense of this, President Barack Obama ordered a 60-day review of government cybersecurity initiatives. Meanwhile, the U.S. House Subcommittee on Emerging Threats, Cybersecurity, Science and Technology is holding hearings on the same topic.

One of the areas of contention is who should be in charge. The FBI, DHS and DoD—specifically, the NSA—all have interests here. Earlier this month, Rod Beckstrom resigned from his position as director of the DHS’s National Cybersecurity Center, warning of a power grab by the NSA.

Putting national cybersecurity in the hands of the NSA is an incredibly bad idea. An entire parade of people, ranging from former FBI director Louis Freeh to Scott Charney, Microsoft’s Vice President for Trustworthy Computing and a former Justice Department computer crime chief, have told Congress the same thing at this month’s hearings.

Cybersecurity isn’t a military problem, or even a government problem—it’s a universal problem. All networks, military, government, civilian and commercial, use the same computers, the same networking hardware, the same Internet protocols and the same software packages. We all are the targets of the same attack tools and tactics. It’s not even that government targets are somehow more important; these days, most of our nation’s critical IT infrastructure is in commercial hands. Government-sponsored Chinese hackers go after both military and civilian targets.

Some have said that the NSA should be in charge because it has specialized knowledge. Earlier this month, Director of National Intelligence Admiral Dennis Blair made this point, saying “There are some wizards out there at Ft. Meade who can do stuff.” That’s probably not true, but if it is, we’d better get them out of Ft. Meade as soon as possible—they’re doing the nation little good where they are now.

Not that government cybersecurity failings require any specialized wizardry to fix. GAO reports indicate that government problems include insufficient access controls, a lack of encryption where necessary, poor network management, failure to install patches, inadequate audit procedures, and incomplete or ineffective information security programs. These aren’t super-secret NSA-level security issues; these are the same managerial problems that every corporate CIO wrestles with.

We’ve all got the same problems, so solutions must be shared. If the government has any clever ideas to solve its cybersecurity problems, certainly a lot of us could benefit from those solutions. If it has an idea for improving network security, it should tell everyone. The best thing the government can do for cybersecurity world-wide is to use its buying power to improve the security of the IT products everyone uses. If it imposes significant security requirements on its IT vendors, those vendors will modify their products to meet those requirements. And those same products, now with improved security, will become available to all of us as the new standard.

Moreover, the NSA’s dual mission of providing security and conducting surveillance means it has an inherent conflict of interest in cybersecurity. Inside the NSA, this is called the “equities issue.” During the Cold War, it was easy; the NSA used its expertise to protect American military information and communications, and eavesdropped on Soviet information and communications. But what happens when both the good guys the NSA wants to protect, and the bad guys the NSA wants to eavesdrop on, use the same systems? They all use Microsoft Windows, Oracle databases, Internet email, and Skype. When the NSA finds a vulnerability in one of those systems, does it alert the manufacturer and fix it—making both the good guys and the bad guys more secure? Or does it keep quiet about the vulnerability and not tell anyone—making it easier to spy on the bad guys but also keeping the good guys insecure? Programs like the NSA’s warrantless wiretapping have already created additional vulnerabilities in our domestic telephone networks.

Testifying before Congress earlier this month, former DHS National Cyber Security division head Amit Yoran said “the intelligence community has always and will always prioritize its own collection efforts over the defensive and protection mission of our government’s and nation’s digital systems.”

Maybe the NSA could convince us that it’s putting cybersecurity first, but its culture of secrecy will mean that any decisions it makes will be suspect. Under current law, extended by the Bush administration’s extravagant invocation of the “state secrets” privilege when charged with statutory and constitutional violations, the NSA’s activities are not subject to any meaningful public oversight. And the NSA’s tradition of military secrecy makes it harder for it to coordinate with other government IT departments, most of which don’t have clearances, let alone coordinate with local law enforcement or the commercial sector.

We need transparent and accountable government processes, using commercial security products. We need government cybersecurity programs that improve security for everyone. The NSA certainly has an advisory and a coordination role in national cybersecurity, and perhaps a more supervisory role in DoD cybersecurity—both offensive and defensive—but it should not be in charge.

A copy of this essay, with all embedded links, is here:
https://www.schneier.com/blog/archives/2009/04/…

A version of this essay appeared on The Wall Street Journal website.
http://online.wsj.com/article/SB123844579753370907.html


News

Privacy in Google Latitude: good news.
http://blog.wired.com/business/2009/03/…

Leaving infants in the car. It happens, and sometimes they die.
https://www.schneier.com/blog/archives/2009/03/…

Interesting piece of cryptographic history: a cipher designed by Robert Patterson and sent to Thomas Jefferson in 1801.
https://www.schneier.com/blog/archives/2009/03/…

The Bayer company is refusing to talk about a fatal accident at a West Virginia plant, citing a 2002 terrorism law.
http://pubs.acs.org/cen/news/87/i11/8711news6.html
The meeting has been rescheduled. No word on how forthcoming Bayer will be.
http://www.csb.gov/index.cfm?…

Research on fingerprinting paper:
http://www.freedom-to-tinker.com/blog/felten/…
http://citp.princeton.edu/pub/paper09oak.pdf

Blowfish on the television series 24, again:
https://www.schneier.com/blog/archives/2009/03/…

Interesting analysis of why people steal rare books.
http://www.ft.com/cms/s/2/…

Last month, I linked to a catalog of NSA video courses from 1991. Here’s an update, with new information (the FOIA redactions were appealed).
http://www.governmentattic.org/2docs/…

You just can’t make this stuff up: a UK bomb squad is called in because someone saw a plastic replica of the Holy Hand Grenade of Antioch, from the movie Monty Python and the Holy Grail.
https://www.schneier.com/blog/archives/2009/03/…

Interesting research in explosives detection.
http://www.aip.org/press_release/…

A Psychology Today article on fear and the availability heuristic:
http://blogs.psychologytoday.com/blog/…

From Kentucky: I think this is the first documented case of election fraud in the U.S. using electronic voting machines (there have been lots of documented cases of errors and voting problems, but this one involves actual maliciousness). Lots of details; well worth reading.
https://www.schneier.com/blog/archives/2009/03/…

Sniffing keyboard keystrokes with a laser:
http://news.zdnet.com/2100-9595_22-280184.html

Where you stand matters in surviving a suicide bombing.
http://www.sciencedaily.com/releases/2009/03/…
Presumably they also discovered where the attacker should stand to be as lethal as possible, but there’s no indication they published those results.

An impressive solar plasma movie-plot threat.
http://www.newscientist.com/article/…

Security fears drive Iran to Linux:
http://www.theage.com.au/articles/2004/09/21/…

A gorilla detector, from Muppet Labs.
http://www.youtube.com/watch?v=4QrelL9fOjY

Bob Blakley makes an interesting point about what he calls “the zone of essential risk”: “if you conduct medium-sized transactions rarely, you’re in trouble. The transactions are big enough so that you care about losses, you don’t have enough transaction volume to amortize those losses, and the cost of insurance or escrow is high enough compared to the value of your transactions that it doesn’t make economic sense to protect yourself.”
http://notabob.blogspot.com/2009/03/…

Massive Chinese espionage network discovered:
https://www.schneier.com/blog/archives/2009/03/…

Thefts at the Museum of Bad Art:
http://en.wikipedia.org/wiki/Museum_Of_Bad_Art
Be sure to notice the camera:
http://en.wikipedia.org/wiki/File:MOBAcamera.JPG

Here’s a story about a very expensive series of false positives. The German police spent years and millions of dollars tracking a mysterious killer whose DNA had been found at the scenes of six murders. Finally they realized they were tracking a worker at the factory that assembled the prepackaged swabs used for DNA testing.
http://scienceblogs.com/authority/2009/03/…
This story could be used as justification for a massive DNA database. After all, if that factory worker had his or her DNA in the database, the police would have quickly realized what the problem was.

Identifying people using anonymous social networking data:
https://www.schneier.com/blog/archives/2009/04/…

What to fear: a great rundown of the statistics.
http://www.counterpunch.org/goekler03242009.html

Crypto puzzle and NSA problem:
http://www.cryptosmith.com/archives/565

Clever social networking identity theft scams:
https://www.schneier.com/blog/archives/2009/04/…

Police powers and the UK government in the 1980s:
https://www.schneier.com/blog/archives/2009/04/…

Research into preserving P2P privacy:
http://www.physorg.com/news158419063.html

Fact-free article about foreign companies hacking the U.S. power grid suggests we panic. My guess is that it was deliberately planted by someone looking for leverage in the upcoming budget battle.
https://www.schneier.com/blog/archives/2009/04/…

Here’s a tip: when walking around in public with secret government documents, put them in an envelope. Don’t carry them in the open where people can read (and photograph) them.
https://www.schneier.com/blog/archives/2009/04/…

Details of the arrests made in haste after the above disclosure:
http://www.timesonline.co.uk/tol/news/uk/…

It is a measure of our restored sanity that no one has called the TSA about Tweenbots:
http://www.tweenbots.com/

How to write a scary cyberterrorism story. From Foreign Policy.
http://neteffect.foreignpolicy.com/posts/2009/04/11/…


Privacy and the Fourth Amendment

In the United States, the concept of “expectation of privacy” matters because it’s the constitutional test, based on the Fourth Amendment, that governs when and how the government can invade your privacy.

Based on the 1967 Katz v. United States Supreme Court decision, this test actually has two parts. First, the government’s action can’t contravene an individual’s subjective expectation of privacy; and second, that expectation of privacy must be one that society in general recognizes as reasonable. That second part isn’t based on anything like polling data; it is more of a normative idea of what level of privacy people should be allowed to expect, given the competing importance of personal privacy on one hand and the government’s interest in public safety on the other.

The problem is that, in today’s information society, this test will rapidly leave us with no privacy at all.

In Katz, the Court ruled that the police could not eavesdrop on a phone call without a warrant: Katz expected his phone conversations to be private, and this expectation resulted from a reasonable balance between personal privacy and societal security. Given the NSA’s large-scale warrantless eavesdropping, and the previous administration’s continual insistence that it was necessary to keep America safe from terrorism, is it still reasonable to expect that our phone conversations are private?

Between the NSA’s massive internet eavesdropping program and Gmail’s content-dependent advertising, does anyone actually expect their e-mail to be private? Between calls for ISPs to retain user data and companies serving content-dependent web ads, does anyone expect their web browsing to be private? Between the various kinds of computer-infecting malware and world governments increasingly demanding to see laptop data at borders, hard drives are barely private. I certainly don’t believe that my SMSs, any of my telephone data, or anything I say on LiveJournal or Facebook—regardless of the privacy settings—is private.

Aerial surveillance, data mining, automatic face recognition, terahertz radar that can “see” through walls, wholesale surveillance, brain scans, RFID, “life recorders” that save everything: Even if society still has some small expectation of digital privacy, that will change as these and other technologies become ubiquitous. In short, the problem with a normative expectation of privacy is that it changes with perceived threats, technology and large-scale abuses.

Clearly, something has to change if we are to be left with any privacy at all. Three legal scholars have written law review articles that wrestle with the problems of applying the Fourth Amendment to cyberspace and to our computer-mediated world in general.

George Washington University’s Daniel Solove, who blogs at Concurring Opinions, has tried to capture the Byzantine complexities of modern privacy. He points out, for example, that the following privacy violations—all real—are very different: A company markets a list of 5 million elderly incontinent women; reporters deceitfully gain entry to a person’s home and secretly photograph and record the person; the government uses a thermal sensor device to detect heat patterns in a person’s home; and a newspaper reports the name of a rape victim. Going beyond simple definitions, such as the divulging of a secret, Solove has developed a taxonomy of privacy and of the harms that result from its violation.

His 16 categories are: surveillance, interrogation, aggregation, identification, insecurity, secondary use, exclusion, breach of confidentiality, disclosure, exposure, increased accessibility, blackmail, appropriation, distortion, intrusion and decisional interference. Solove’s goal is to provide a coherent and comprehensive understanding of what is traditionally an elusive and hard-to-explain concept: privacy violations. (This taxonomy is also discussed in Solove’s book, Understanding Privacy.)

Orin Kerr, also a law professor at George Washington University, and a blogger at Volokh Conspiracy, has attempted to lay out general principles for applying the Fourth Amendment to the internet. First, he points out that the traditional inside/outside distinction—the police can watch you in a public place without a warrant, but not in your home—doesn’t work very well with regard to cyberspace. Instead, he proposes a distinction between content and non-content information: the body of an e-mail versus the header information, for example. The police should be required to get a warrant for the former, but not for the latter. Second, he proposes that search warrants should be written for particular individuals and not for particular internet accounts.
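
To make Kerr’s content/non-content distinction concrete, here is a minimal Python sketch. It is my illustration, not Kerr’s; the sample message, and the choice of which headers count as non-content, are invented for the example.

# A minimal sketch of Kerr's content/non-content distinction, using
# Python's standard-library email parser. The sample message and the
# list of headers treated as non-content are assumptions for
# illustration only.
from email import message_from_string

raw_email = (
    "From: alice@example.com\n"
    "To: bob@example.com\n"
    "Subject: Lunch on Friday?\n"
    "Date: Wed, 15 Apr 2009 10:00:00 -0400\n"
    "\n"
    "Bob: let's meet at noon. I'll bring the documents we discussed.\n"
)

msg = message_from_string(raw_email)

# Non-content ("envelope") information: who talked to whom, and when.
# Under Kerr's proposal, police could collect this without a warrant.
non_content = {name: msg[name] for name in ("From", "To", "Date")}

# Content: the body of the message. (The Subject line is one place
# where the boundary between the two categories starts to blur.)
content = msg.get_payload()

print("Non-content:", non_content)
print("Content:", content)

Under Kerr’s proposal, the police would need a warrant for the second value but not the first—though note how much the first still reveals about who is talking to whom.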

Meanwhile, Jed Rubenfeld of Yale Law School has tried to reinterpret the Fourth Amendment not in terms of privacy, but in terms of security. Pointing out that the whole “expectations” test is circular—what the government does affects what the government can do—he redefines everything in terms of security: the security that our private affairs are private.

This security is violated when, for example, the government makes widespread use of informants, or engages in widespread eavesdropping—even if no one’s privacy is actually violated. This neatly bypasses the whole individual privacy versus societal security question—a balancing that the individual usually loses—by framing both sides in terms of personal security.

I have issues with all of these articles. Solove’s taxonomy is excellent, but the sense of outrage that accompanies a privacy violation—”How could they know/do/say that!?”—is itself an important part of the harm. The non-content information that Kerr believes should be collectible without a warrant can be very private and personal: URLs reveal a lot about a person, and it’s possible to figure out what content someone browsed just from the sizes of the encrypted SSL responses. Also, the ease with which the government can collect all of it—the calling and called parties of every phone call in the country—makes the balance very different. I believe these need to be protected with a warrant requirement. Rubenfeld’s reframing is interesting, but the devil is in the details. Reframing privacy in terms of security still results in a balancing of competing rights. I’d rather take the approach of stating the—obvious to me—individual and societal value of privacy, and giving privacy its rightful place as a fundamental human right. (There’s additional commentary on Rubenfeld’s thesis at Ars Technica.)
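
To illustrate the SSL point: even when traffic is encrypted, response sizes leak information. Here is a toy Python sketch, entirely my own invention with made-up page sizes, of how an eavesdropper could match observed ciphertext lengths against a catalog of known pages:

# Toy illustration of traffic-analysis fingerprinting: an eavesdropper
# who knows the approximate transfer sizes of candidate pages can often
# guess what was browsed from encrypted response lengths alone. The
# catalog and the observed sizes below are invented for this example.
page_sizes = {
    "news-frontpage": 145200,
    "medical-faq": 88431,
    "support-forum": 52977,
    "bank-login": 23104,
}

TOLERANCE = 0.02  # allow 2% slack for headers, padding, and the like

def guess_page(observed_size):
    """Return the cataloged page whose size best matches the observation."""
    best_page, best_error = "unknown", TOLERANCE
    for page, size in page_sizes.items():
        error = abs(observed_size - size) / size
        if error < best_error:
            best_page, best_error = page, error
    return best_page

# The eavesdropper sees only encrypted byte counts, yet learns a lot:
for observed in (88950, 23150, 145000):
    print(observed, "->", guess_page(observed))

Real attacks are more sophisticated, taking object counts and timing into account, but the principle is the same: encryption hides what was said, not how much.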

The trick here is to realize that a normative definition of the expectation of privacy doesn’t need to depend on threats or technology, but rather on what we—as society—decide it should be. Sure, today’s technology makes it easier than ever to violate privacy. But it doesn’t necessarily follow that we have to violate privacy. Today’s guns make it easier than ever to shoot virtually anyone for any reason. That doesn’t mean our laws have to change.

No one knows how this will shake out legally. These three articles are from law professors; they’re not judicial opinions. But clearly something has to change, and ideas like these may someday form the basis of new Supreme Court decisions that bring legal notions of privacy into the 21st century.

A copy of this essay, with all embedded links, is here:
https://www.schneier.com/blog/archives/2009/03/…

This essay originally appeared on Wired.com.
http://www.wired.com/politics/security/commentary/…


Schneier News

I was interviewed on Federal News Radio about insider threats:
http://www.federalnewsradio.com/index.php?…

I’m speaking at the Taiwan Information Security Center on April 17 in Taipei:
http://forum.twisc.ncku.edu.tw/dm.html

I’ll be on the Cryptographers’ Panel at the RSA Conference on April 21 in San Francisco:
http://www.rsaconference.com/2009/US/Home.aspx

I’ll be the keynote speaker at the IPSI Research Symposium on May 6 in Toronto:
http://www.ipsi.utoronto.ca/events/…

I’m speaking at the International Workshop on Coding and Cryptography on May 12 in Lofthus, Norway:
http://www.selmer.uib.no/WCC2009/callWCC2009.pdf

I’m giving the keynote speech on Day 2 of the European OWASP Application Security Conference, May 14 in Krakow, Poland:
http://www.owasp.org/index.php/AppSecEU09

And I’m giving the keynote speech at CONfidence on May 15 in Krakow, Poland:
http://2009.confidence.org.pl/


The Definition of “Weapon of Mass Destruction”

At least, according to U.S. law:

18 U.S.C. 2332a
(2) the term “weapon of mass destruction” means—
(A) any destructive device as defined in section 921 of this title;
(B) any weapon that is designed or intended to cause death or serious bodily injury through the release, dissemination, or impact of toxic or poisonous chemicals, or their precursors;
(C) any weapon involving a biological agent, toxin, or vector (as those terms are defined in section 178 of this title); or
(D) any weapon that is designed to release radiation or radioactivity at a level dangerous to human life;

18 U.S.C. 921
(4) The term “destructive device” means—
(A) any explosive, incendiary, or poison gas—
(i) bomb,
(ii) grenade,
(iii) rocket having a propellant charge of more than four ounces,
(iv) missile having an explosive or incendiary charge of more than one-quarter ounce,
(v) mine, or
(vi) device similar to any of the devices described in the preceding clauses;
(B) any type of weapon (other than a shotgun or a shotgun shell which the Attorney General finds is generally recognized as particularly suitable for sporting purposes) by whatever name known which will, or which may be readily converted to, expel a projectile by the action of an explosive or other propellant, and which has any barrel with a bore of more than one-half inch in diameter; and
(C) any combination of parts either designed or intended for use in converting any device into any destructive device described in subparagraph (A) or (B) and from which a destructive device may be readily assembled.

The term “destructive device” shall not include any device which is neither designed nor redesigned for use as a weapon; any device, although originally designed for use as a weapon, which is redesigned for use as a signaling, pyrotechnic, line throwing, safety, or similar device; surplus ordnance sold, loaned, or given by the Secretary of the Army pursuant to the provisions of section 4684 (2), 4685, or 4686 of title 10; or any other device which the Attorney General finds is not likely to be used as a weapon, is an antique, or is a rifle which the owner intends to use solely for sporting, recreational or cultural purposes.

This is a very broad definition, and one that involves the intention of the weapon’s creator as well as the details of the weapon itself.

In an e-mail, Ohio State University Professor John Mueller commented to me:

“As I understand it, not only is a grenade a weapon of mass destruction, but so is a maliciously-designed child’s rocket even if it doesn’t have a warhead. On the other hand, although a missile-propelled firecracker would be considered a weapon of mass destruction if its designers had wanted to think of it as a weapon, it would not be so considered if it had previously been designed for use as a weapon and then redesigned for pyrotechnic use or if it was surplus and had been sold, loaned, or given to you (under certain circumstances) by the Secretary of the Army.

“It also means that we are coming up on the 25th anniversary of the Reagan administration’s long-misnamed WMD-for-Hostages deal with Iran.

“Bad news for you, though. You’ll have to amend that line you like using in your presentations about how all WMD in all of history have killed fewer people than OIF (or whatever), since all artillery, and virtually every muzzle-loading military long arm for that matter, legally qualifies as an WMD. It does make the bombardment of Ft. Sumter all the more sinister. To say nothing of the revelation that The Star Spangled Banner is in fact an account of a WMD attack on American shores.”

Amusing, to be sure, but there’s something important going on. The U.S. government has passed specific laws about “weapons of mass destruction,” because they’re particularly scary and damaging. But by generalizing the definition of WMDs, those who write the laws greatly broaden their applicability. And I have to wonder how many of those who vote in favor of the laws realize how general they really are, or—if they do know—vote for them anyway because they can’t be seen to be “soft” on WMDs.

It reminds me of those provisions of the USA PATRIOT Act—and other laws—that created police powers to be used for “terrorism and other crimes.”

Prosecutions based on this unreasonable definition:
http://www.ph2dot1.com/2008/04/…


Stealing Commodities

Before his arrest, Tom Berge stole lead roof tiles from several buildings in south-east England, including the Honeywood Museum in Carshalton, the Croydon parish church, and the Sutton high school for girls. He then sold those tiles to scrap metal dealers.

As a security expert, I find this story interesting for two reasons. First, amongst increasingly ridiculous attempts to ban, or at least censor, Google Earth, lest it help the terrorists, here is an actual crime that relied on the service: Berge needed Google Earth for reconnaissance.

But more interesting is the discrepancy between the value of the lead tiles to the original owners and to the thief. The Sutton school had to spend £10,000 to buy new lead tiles; the Croydon church had to repair extensive water damage after the theft. But Berge received only £700 a ton from London scrap metal dealers.

This isn’t an isolated story; the same dynamic is in play with other commodities as well.

There is an epidemic of copper wiring thefts worldwide; copper is being stolen out of telephone and power stations—and off poles in the streets—and thieves have killed themselves because they didn’t understand the dangers of high voltage. Homeowners are returning from holiday to find the copper pipes stolen from their houses. In 2001, scrap copper was worth 70 cents per pound. In April 2008, it was worth $4.

Gasoline siphoning became more common as pump prices rose. And used restaurant grease, formerly either given away or sold for pennies to farmers, is being stolen from restaurant parking lots and turned into biofuels. Newspapers and other recyclables are stolen from curbs, and trees are stolen and resold as Christmas trees.

Iron fences have been stolen from buildings and houses, manhole covers have been stolen from the middle of streets, and aluminum guard rails have been stolen from roadways. Steel is being stolen for scrap, too. In 2004 in Ukraine, thieves stole an entire steel bridge.

These crimes are particularly expensive to society because the replacement cost is much higher than the thief’s profit. A manhole cover is worth $5-$10 as scrap, but it costs $500 to replace, including labor. A thief may take $20 worth of copper from a construction site, but do $10,000 in damage in the process. And even if the thieves don’t get to the copper or steel, the increased threat means more money being spent on security to protect those commodities in the first place.
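
The arithmetic makes the point starkly. A back-of-the-envelope Python calculation using the figures above (the scrap midpoint is my own rounding):

# Societal cost per dollar of thief profit, from the essay's figures.
examples = {
    # item: (thief's take in dollars, society's replacement cost)
    "manhole cover": (7.50, 500),  # $5-$10 scrap, midpoint used
    "construction-site copper": (20.00, 10000),
}

for item, (gain, cost) in examples.items():
    print("%s: $%.2f gained, $%d lost, ratio %d:1"
          % (item, gain, cost, round(cost / gain)))

Society loses roughly 67 dollars per dollar the manhole thief gains, and 500 per dollar for the copper thief.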

Security can be viewed as a tax on the honest, and these thefts demonstrate that our taxes are going up. And unlike many taxes, we don’t benefit from their collection. The cost to society of retrofitting manhole covers with locks, or replacing them with less resalable alternatives, is high; but there is no benefit other than reducing theft.

These crimes are a harbinger of the future: evolutionary pressure on our society, if you will. Criminals are often referred to as social parasites; they leech off society but provide no useful benefit. But they are an early warning system of societal changes. Unfettered by laws or moral restrictions, they can be the first to respond to changes that the rest of society will be slower to pick up on. In fact, currently there’s a reprieve. Scrap metal prices are all down from last year’s—copper is currently $1.62 per pound, and lead is half what Berge got—and thefts are down along with them.

We’ve designed much of our infrastructure around the assumptions that commodities are cheap and theft is rare. We don’t protect transmission lines, manhole covers, iron fences, or lead flashing on roofs. But if commodity prices really are headed for new higher stable points, society will eventually react and find alternatives for these items—or find ways to protect them. Criminals were the first to point this out, and will continue to exploit the system until it restabilizes.

A copy of this essay, with all embedded links, is here:
https://www.schneier.com/blog/archives/2009/04/…

A version of this essay originally appeared in The Guardian.
http://www.guardian.co.uk/technology/2009/apr/02/…


Comments from Readers

There are hundreds of comments—many of them interesting—on these topics on my blog. Search for the story you want to comment on, and join in.

http://www.schneier.com/


Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers “Schneier on Security,” “Beyond Fear,” “Secrets and Lies,” and “Applied Cryptography,” and an inventor of the Blowfish, Twofish, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT BCSG, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.

Copyright (c) 2009 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.