Crypto-Gram

April 15, 2010

by Bruce Schneier
Chief Security Technology Officer, BT
schneier@schneier.com
http://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <http://www.schneier.com/crypto-gram-1004.html>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively comment section. An RSS feed is available.


In this issue:
      Privacy and Control
      New York and the Moscow Subway Bombing
      News
      Fifth Annual Movie-Plot Threat Contest
      New Book: Cryptography Engineering
      Schneier News
      Should the Government Stop Outsourcing Code Development?

Privacy and Control

In January, Facebook Chief Executive Mark Zuckerberg declared the age of privacy to be over. A month earlier, Google Chief Eric Schmidt expressed a similar sentiment. Add Scott McNealy’s and Larry Ellison’s comments from a few years earlier, and you’ve got a whole lot of tech CEOs proclaiming the death of privacy—especially when it comes to young people.

It’s just not true. People, including the younger generation, still care about privacy. Yes, they’re far more public on the Internet than their parents: writing personal details on Facebook, posting embarrassing photos on Flickr, and having intimate conversations on Twitter. But they take steps to protect their privacy and vociferously complain when they feel it’s been violated. They’re not technically sophisticated about privacy and make mistakes all the time, but that’s mostly the fault of companies and Web sites that try to manipulate them for financial gain.

To the older generation, privacy is about secrecy. And, as the Supreme Court said, once something is no longer secret, it’s no longer private. But that’s not how privacy works, and it’s not how the younger generation thinks about it. Privacy is about control. When your health records are sold to a pharmaceutical company without your permission; when a social-networking site changes your privacy settings to make what used to be visible only to your friends visible to everyone; when the NSA eavesdrops on everyone’s e-mail conversations—your loss of control over that information is the issue. We may not mind sharing our personal lives and thoughts, but we want to control how, where and with whom. A privacy failure is a control failure.

People’s relationship with privacy is socially complicated. Salience matters: People are more likely to protect their privacy if they’re thinking about it, and less likely to if they’re thinking about something else. Social-networking sites know this, constantly reminding people about how much fun it is to share photos and comments and conversations while downplaying the privacy risks. Some sites go even further, deliberately hiding information about how little control—and privacy—users have over their data. We all give up our privacy when we’re not thinking about it.

Group behavior matters; we’re more likely to expose personal information when our peers are doing it. We object more to losing privacy than we value its return once it’s gone. Even if we don’t have control over our data, an illusion of control reassures us. And we are poor judges of risk. All sorts of academic research backs up these findings.

Here’s the problem: The very companies whose CEOs eulogize privacy make their money by controlling vast amounts of their users’ information. Whether through targeted advertising, cross-selling, or simply convincing their users to spend more time on their site and sign up their friends, more information, shared in more ways and more publicly, means more profits. This means these companies are motivated to continually ratchet down the privacy of their services, while at the same time pronouncing privacy erosion inevitable and giving users the illusion of control.

You can see these forces in play with Google’s launch of Buzz. Buzz is a Twitter-like chatting service, and when Google launched it in February, the defaults were set so people would follow the people they corresponded with frequently in Gmail, with the list publicly available. Yes, users could change these options, but—and Google knew this—changing options is hard and most people accept the defaults, especially when they’re trying out something new. People were upset that their previously private e-mail contacts list was suddenly public. A Federal Trade Commission commissioner even threatened penalties. And though Google changed its defaults, resentment remained.

Facebook tried a similar control grab when it changed people’s default privacy settings last December to make them more public. While users could, in theory, keep their previous settings, it took an effort. Many people just wanted to chat with their friends and clicked through the new defaults without realizing it.

Facebook has a history of this sort of thing. In 2006 it introduced News Feeds, which changed the way people viewed information about their friends. There was no true privacy change: users couldn’t see any more information than before. The change was in control—or arguably, just in the illusion of control. Still, there was a large uproar. And Facebook is doing it again; last month, the company announced new privacy changes that will make it easier for it to collect location data on users and sell that data to third parties.

With all this privacy erosion, those CEOs may actually be right—but only because they’re working to kill privacy. On the Internet, our privacy options are limited to the options those companies give us and how easy they are to find. We have Gmail and Facebook accounts because that’s where we socialize these days, and it’s hard—especially for the younger generation—to opt out. As long as privacy isn’t salient, and as long as these companies are allowed to forcibly change social norms by limiting options, people will increasingly get used to less and less privacy. There’s no malice on anyone’s part here; it’s just market forces in action. If we believe privacy is a social good, something necessary for democracy, liberty and human dignity, then we can’t rely on market forces to maintain it. Broad legislation protecting personal privacy by giving people control over their personal data is the only solution.

This essay originally appeared on Forbes.com.
http://www.forbes.com/2010/04/05/…

Zuckerberg on privacy:
http://www.guardian.co.uk/technology/2010/jan/11/…

Schmidt on privacy:
http://gawker.com/5419271/…

McNealy on privacy:
http://www.wired.com/politics/law/news/1999/01/17538

Ellison on privacy:
http://www.businessweek.com/bwdaily/dnflash/oct2001/…

Danah Boyd on privacy and younger people:
http://www.danah.org/papers/talks/2010/SXSW2010.html

The Supreme Court on privacy and secrecy:
http://www.rbs2.com/privacy.htm

Privacy and salience:
http://www.computer.org/cms/Computer.org/…

Social networking sites downplaying privacy concerns:
http://www.schneier.com/essay-278.html

Sites that make misleading privacy claims:
http://www.schneier.com/essay-276.html

Humans are poor judges of risk:
http://www.schneier.com/essay-162.html

Academic research on how people make privacy decisions:
http://www.heinz.cmu.edu/~acquisti/…

Google’s Buzz:
http://news.cnet.com/8301-31322_3-10451428-256.html
http://www.businessinsider.com/…
http://finapps.forbes.com/finapps/jsp/finance/…
http://www.lightbluetouchpaper.org/2010/02/12/…
http://www.computerworld.com/s/article/9172079/…

Facebook’s privacy problems:
http://www.eff.org/deeplinks/2009/12/…

Facebook News Feeds:
http://blog.facebook.com/blog.php?post=2207967130
http://www.facebook.com/group.php?gid=2208288769

Facebook’s latest privacy changes:
http://blog.facebook.com/blog.php?post=376904492130

The value of privacy:
http://www.schneier.com/essay-114.html

Privacy legislation:
https://www.schneier.com/blog/archives/2006/02/…

Google responds:
http://www.forbes.com/2010/04/12/…

Another essay on the topic:
http://www.secureconsulting.net/2009/05/…


New York and the Moscow Subway Bombing

People intent on preventing a Moscow-style terrorist attack against the New York subway system are proposing a range of expensive new underground security measures, some temporary and some permanent.

They should save their money—and instead invest every penny they’re considering pouring into new technologies into intelligence and old-fashioned policing.

Intensifying security at specific stations only works against terrorists who aren’t smart enough to move to another station. Cameras are useful only if all the stars align: The terrorists happen to walk into the frame, the video feeds are being watched in real time and the police can respond quickly enough to be effective. They’re much more useful after an attack, to figure out who pulled it off.

Installing biological and chemical detectors requires similarly implausible luck—plus a terrorist plot that includes the specific biological or chemical agent that is being detected.

What all these misguided reactions have in common is that they’re based on “movie-plot threats”: overly specific attack scenarios. They fill our imagination vividly, in full color with rich detail. Before long, we’re envisioning an entire story line, with or without Bruce Willis saving the day. And we’re scared.

It’s not that movie-plot threats are not worth worrying about. It’s that each one—Moscow’s subway attack, the bombing of the Oklahoma City federal building, etc.—is too specific. These threats are infinite, and the bad guys can easily switch among them.

New York has thousands of possible targets, and there are dozens of possible tactics. Implementing security against movie-plot threats is only effective if we correctly guess which specific threat to protect against. That’s unlikely.

A far better strategy is to spend our limited counterterrorism resources on investigation and intelligence—and on emergency response. These measures don’t hinge on any specific threat; they don’t require us to guess the tactic or target correctly. They’re effective in a variety of circumstances, even nonterrorist ones.

The result may not be flashy or outwardly reassuring—as are pricey new scanners in airports. But the strategy will save more lives.

The 2006 arrest of the liquid bombers—who planned to detonate liquid explosives aboard airliners traveling from England to North America—serves as an excellent example. The plotters were arrested in their London apartments, and their attack was foiled before they ever got to the airport.

It didn’t matter if they were using liquids or solids or gases. It didn’t even matter if they were targeting airports or shopping malls or theaters. It was a straightforward, although hardly simple, matter of following leads.

Gimmicky security measures are tempting—but they’re distractions we can’t afford. The Christmas Day bomber chose his tactic because it would circumvent last year’s security measures, and the next attacker will choose his tactic—and target—according to similar criteria. Spend money on cameras and guards in the subways, and the terrorists will simply modify their plot to render those countermeasures ineffective.

Humans are a species of storytellers, and the Moscow story has obvious parallels in New York. When we read the word “subway,” we can’t help but think about the system we use every day. This is a natural response, but it doesn’t make for good public policy. We’d all be safer if we rose above the simple parallels and the need to calm our fears with expensive and seductive new technologies—and countered the threat the smart way.

This essay originally appeared in the New York Daily News.
http://www.nydailynews.com/opinions/2010/04/07/…


News

Interesting research on security questions:
http://www.lightbluetouchpaper.org/2010/03/04/…
I’ve written about this problem:
http://www.schneier.com/essay-081.html
xkcd on the secret question:
http://xkcd.com/565/

Nice casino hack, attacking “software that controlled remote betting machines on live roulette wheels.”
https://www.schneier.com/blog/archives/2010/03/…

Disabling cars by remote control: who didn’t see this coming?
http://www.wired.com/threatlevel/2010/03/…

Research on security trade-offs and sacred values:
http://www.scientificamerican.com/article.cfm?…

Here’s how to bring lots of liquid onto a plane at Schiphol Airport. This would worry me, if the liquid ban weren’t already useless.
https://www.schneier.com/blog/archives/2010/03/…

Even more on the al-Mabhouh assassination:
https://www.schneier.com/blog/archives/2010/03/…
Lots of interesting discussion in the comments.

PDF is now the most common malware vector, dethroning MS Word.
http://www.theregister.co.uk/2010/03/09/…
http://www.f-secure.com/weblog/archives/00001676.html

Back door in the software that monitors an Energizer battery charger.
https://www.schneier.com/blog/archives/2010/03/…

A security analysis of electronic health records from British Columbia. While this report is from Canada, the same issues apply to any electronic patient record system in the U.S. What I find really interesting is that the Canadian government actually conducted a security analysis of the system, rather than just maintaining that everything would be fine. I wish the U.S. would do something similar.
http://www.vancouversun.com/health/…
http://www.bcauditor.com/files/publications/2010/…

Here’s why there are dead people’s names on the no-fly list: “If a person on the no-fly list dies, his name could stay on the list so that the government can catch anyone trying to assume his identity.” But since a terrorist might assume *anyone’s* identity, by the same logic we should put everyone on the no-fly list. That issue aside, it’s an interesting article on how the no-fly list works.
http://abcnews.go.com/print?id=10058645

Real-world movie plot attack by acrobatic thieves:
http://www.nj.com/news/index.ssf/2010/03/…
Similar heists:
http://www.engadget.com/2007/06/22/…
http://online.wsj.com/article/…

Natural language shellcode:
http://www.cs.jhu.edu/~sam/ccs243-mason.pdf

How to become a nuclear power. It’s sarcastic, yet a bit too close to the truth.
http://www.ip-global.org/archiv/exclusive/view/…

Side-channel attacks on encrypted web traffic. We already know that eavesdropping on an SSL-encrypted web session can leak a lot of information about the person’s browsing habits. Since different pages produce requests and downloads of different sizes, an eavesdropper can sometimes infer which links the person clicked on and what pages he’s viewing. This paper extends that work considerably. A toy sketch of the basic idea follows the link below.
https://www.schneier.com/blog/archives/2010/03/…
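
To make the idea concrete, here is a toy Python sketch of size-based page fingerprinting. Everything in it (the fingerprint table, the sizes, the matching tolerance) is invented for illustration and is not from the paper; real attacks are considerably more sophisticated.

  # Hypothetical sketch: guess which page an SSL-protected user fetched,
  # using only the ciphertext lengths visible on the wire.
  FINGERPRINTS = {  # page -> expected response sizes, in bytes
      "/inbox":   [4312, 1538, 92810],
      "/account": [3990, 720, 15222],
      "/logout":  [1100],
  }

  def guess_page(observed_sizes, tolerance=64):
      """Return the page whose known size sequence best matches the
      observed one, or None if nothing matches."""
      best_page, best_score = None, 0
      for page, sizes in FINGERPRINTS.items():
          if len(sizes) != len(observed_sizes):
              continue  # different number of objects: can't be this page
          score = sum(1 for known, seen in zip(sizes, observed_sizes)
                      if abs(known - seen) <= tolerance)
          if score > best_score:
              best_page, best_score = page, score
      return best_page

  print(guess_page([4300, 1550, 92790]))  # prints "/inbox"

The point is simply that encryption hides the content of each transfer but not its length, and lengths alone can identify pages.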

Modern photocopy machines contain hard drives that often have scans of old documents. This matters when an office disposes of an old copier. It also matters if you make your copies at a commercial copy center like Kinko’s.
http://www.thestar.com/news/gta/article/…

A potential new forensic technique: identifying people by their unique bacteria.
http://news.sciencemag.org/sciencenow/2010/03/…
http://www.pnas.org/content/early/2010/03/01/…

The amazing story of Gerald Blanchard, master thief.
http://www.wired.com/magazine/2010/03/…

Nice essay by Jeremy Clarkson on security guards.
http://www.timesonline.co.uk/tol/comment/columnists/…
Another Clarkson essay, this one on security theater.
http://www.timesonline.co.uk/tol/comment/columnists/…

According to new research, leaders are better liars.
http://www4.gsb.columbia.edu/ideasatwork/feature/…

Nearly half the security cameras in the New York City subways don’t work, yet crime is at record lows.
https://www.schneier.com/blog/archives/2010/03/…

Terrorists using explosive breast implants. Inexplicably, this is not an April Fool’s joke.
https://www.schneier.com/blog/archives/2010/04/…

The DHS Cybersecurity Awareness Campaign Challenge is a little hokey, but better them than the NSA.
https://www.schneier.com/blog/archives/2010/04/…

The iPhone Secret Decoder Ring will protect your secrets from your kid sister, unless she’s smarter than that. Looks cool, though.
http://dscape-llc.com/products/view/id/1

Report from the House of Lords in the UK: “Protecting Europe Against Large-Scale Cyber-Attacks.”
http://www.publications.parliament.uk/pa/ld200910/…
http://www.publications.parliament.uk/pa/ld200910/…

A camera that detects when it’s being watched: by binoculars, sniper scopes, cameras, and even human eyeballs.
http://nexgadget.com/2010/03/22/…

How to bypass the chain on hotel-room doors.
http://blackbag.nl/?p=1315
http://www.youtube.com/watch?v=7INIRLe7x0Y

Cryptography broken on American military attack video.
https://www.schneier.com/blog/archives/2010/04/…

Air marshals are being arrested faster than they are making arrests.
http://duncan.house.gov/2009/06/22062009.shtml

New cryptanalysis of the proprietary encryption algorithm used in the Digital Enhanced Cordless Telecommunications (DECT) standard for cordless phones.
https://dedected.org/trac/raw-attachment/wiki/…
http://www.theregister.co.uk/2010/02/08/…

Does dueling have a rational economic basis?
https://www.schneier.com/blog/archives/2010/04/…
See comments for some good rebuttals.

An NYU student has been reverse-engineering facial recognition algorithms to devise makeup patterns that confuse the software.
http://ahprojects.com/c/itp/thesis
http://ahprojects.com/blog/122

Governments can buy commercial hardware to implement man-in-the-middle attacks against SSL.
https://www.schneier.com/blog/archives/2010/04/…

Nice analysis by John Mueller and Mark G. Stewart on terrorist attacks and comparable risks:
http://www.foreignaffairs.com/articles/66186/…
John Adams argues that our irrationality about comparative risks depends on the type of risk:
http://www.socialaffairsunit.org.uk/blog/archives/…

Chris Hoofnagle’s paper on the externalities involved in issuing credit and its effects on identity theft:
https://www.schneier.com/blog/archives/2010/04/…

Matt Blaze comments on the afterword he wrote for “Applied Cryptography” fifteen years ago:
http://www.crypto.com/blog/afterword

Storing cryptographic keys with invisible tattoos, for use in implantable medical devices.
http://research.microsoft.com/pubs/122137/healthsec.pdf

Security for implantable medical devices:
http://www.secure-medicine.org/IMD-CHI2010.pdf


Fifth Annual Movie-Plot Threat Contest

Once upon a time, men and women throughout the land lived in fear. This caused them to do foolish things that made them feel better temporarily, but didn’t make them any safer. Gradually, some people became less fearful, and less tolerant of the foolish things they were told to submit to. The lords who ruled the land tried to revive the fear, but with less and less success. Sensible men and women from all over the land were peering behind the curtain, and seeing that the emperor had no clothes.

Thus it came to pass that the lords decided to appeal to the children. If the children could be made more fearful, then their fathers and mothers might also become more fearful, and the lords would remain lords, and all would be right with the order of things. The children would grow up in fear, and thus become accustomed to doing what the lords said, further allowing the lords to remain lords. But to do this, the lords realized they needed Frightful Fables and Fear-Mongering Fairytales to tell the children at bedtime.

Your task, ye Weavers of Tales, is to create a fable or fairytale suitable for instilling the appropriate level of fear in children so they grow up appreciating all the lords do to protect them.

That’s this year’s contest. I’m looking for entries in the form of a fairytale or fable.

Make your submissions short and sweet: 400 words or less. Imagine that someone will be illustrating this story for young children. Submit your entry in comments; deadline is May 1. Feel free to post ideas and suggestions in comments as well, although only actual stories will count as submissions. I’ll choose several semifinalists, and then you all will vote for the winner. The prize is a signed copy of my latest book, Cryptography Engineering. And if anyone seriously wants to illustrate this, please contact me directly—or just go for it and post a link.

Thank you to loyal reader—and frequent reader of my draft essays—”grenouille,” who suggested this year’s contest.

And good luck!

Post, read, and comment on entries here:
https://www.schneier.com/blog/archives/2010/04/…

The First Movie-Plot Threat Contest rules and winner.
https://www.schneier.com/blog/archives/2006/04/…
https://www.schneier.com/blog/archives/2006/06/…

The Second Movie-Plot Threat Contest rules, semifinalists, and winner.
https://www.schneier.com/blog/archives/2007/04/…
https://www.schneier.com/blog/archives/2007/06/…
https://www.schneier.com/blog/archives/2007/06/…

The Third Movie-Plot Threat Contest rules, semifinalists, and winner.
https://www.schneier.com/blog/archives/2008/04/…
https://www.schneier.com/blog/archives/2008/05/…
https://www.schneier.com/blog/archives/2008/05/…

The Fourth Movie-Plot Threat Contest rules and winner.
https://www.schneier.com/blog/archives/2009/04/…
https://www.schneier.com/blog/archives/2009/05/…


New Book: Cryptography Engineering

I have a new book, sort of. Cryptography Engineering is really the second edition of Practical Cryptography. Niels Ferguson and I wrote Practical Cryptography in 2003. Tadayoshi Kohno did most of the update work—and added exercises to make it more suitable as a textbook—and is the third author on Cryptography Engineering. (I didn’t like that Wiley changed the title; I think it’s too close to Ross Anderson’s excellent Security Engineering.)

Cryptography Engineering is a techie book; it’s for practitioners who are implementing cryptography or for people who want to learn more about the nitty-gritty of how cryptography works and what the implementation pitfalls are. If you’ve already bought Practical Cryptography, there’s no need to upgrade unless you’re actually using it.

Here’s what’s new: We revised the introductory materials in Chapter 1 to help readers better understand the broader context for computer security, with some explicit exercises to help readers develop a security mindset. We updated the discussion of AES in Chapter 3; rather than speculating on algebraic attacks, we now talk about the recent successful (theoretical, not practical) attacks against AES. Chapter 4 used to recommend nonce-based encryption schemes. We now find these schemes problematic, and instead recommend randomized encryption schemes, like CBC mode. We updated the discussion of hash functions in Chapter 5; we discuss new results against MD5 and SHA1, and allude to the new SHA3 candidates (but say it’s too early to start using them). In Chapter 6, we no longer talk about UMAC, and instead talk about CMAC and GMAC. We revised Chapters 8 and 15 to talk about some recent implementation issues to be aware of. For example, we now talk about the cold boot attacks and the challenges of generating randomness in VMs. In Chapter 19, we discuss online certificate verification.
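
For readers wondering what “randomized encryption” means in practice, here is a minimal sketch of AES-CBC with a fresh random IV per message. It’s written against the pyca/cryptography Python library; the library choice is mine, not the book’s, and a real system would also authenticate the ciphertext (for example, with CMAC), which is omitted here.

  import os
  from cryptography.hazmat.primitives import padding
  from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

  def encrypt_cbc(key: bytes, plaintext: bytes) -> bytes:
      # The fresh random IV is what makes the scheme randomized: the
      # same plaintext encrypts differently every time.
      iv = os.urandom(16)
      padder = padding.PKCS7(128).padder()  # CBC needs whole blocks
      padded = padder.update(plaintext) + padder.finalize()
      enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
      return iv + enc.update(padded) + enc.finalize()

  def decrypt_cbc(key: bytes, blob: bytes) -> bytes:
      iv, ciphertext = blob[:16], blob[16:]
      dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
      padded = dec.update(ciphertext) + dec.finalize()
      unpadder = padding.PKCS7(128).unpadder()
      return unpadder.update(padded) + unpadder.finalize()

  key = os.urandom(32)
  c1 = encrypt_cbc(key, b"attack at dawn")
  c2 = encrypt_cbc(key, b"attack at dawn")
  assert c1 != c2  # randomized: identical messages don't repeat on the wire
  assert decrypt_cbc(key, c1) == b"attack at dawn"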

Signed copies are available. See the bottom of the book’s webpage for details.

http://www.schneier.com/book-ce.html


Schneier News

I am delivering the keynote at InfoSec World in Orlando on April 20th:
http://www.misti.com/default.asp?…

I am speaking at the SwissICT Symposium in Interlaken, Switzerland, on May 10th:
http://www.swissict.ch/symposium2010.html

I’m participating in a debate called “The Cyber War Threat Has Been Grossly Exaggerated” in Washington, DC on June 8th:
http://intelligencesquaredus.org/index.php/debates/…

I was interviewed on Second Life with James Fallows:
http://www.blogtalkradio.com/virtuallyspeaking/2010/…
iTunes podcast:
http://itunes.apple.com/us/podcast/…?

An eerily accurate Schneier blogging template:
https://www.schneier.com/blog/archives/2010/03/…

Last month at the RSA Conference, I gave a talk titled “Security, Privacy, and the Generation Gap.” It was pretty good, but it was the first time I gave that talk in front of a large audience—and its newness showed. Earlier this month, I gave the same talk again, at the CACR Higher Education Security Summit at Indiana University. It was much, much better the second time around, and there’s a video available.
http://www.indiana.edu/~video/stream/…

CRN Magazine named me as one of its security superstars of 2010.
http://www.crn.com/security/223100880


Should the Government Stop Outsourcing Code Development?

Information technology is increasingly everywhere, and it’s the same technologies everywhere. The same operating systems are used in corporate and government computers. The same software controls critical infrastructure and home shopping. The same networking technologies are used in every country. The same digital infrastructure underpins the small and the large, the important and the trivial, the local and the global; the same vendors, the same standards, the same protocols, the same applications.

With all of this sameness, you’d think these technologies would be designed to the highest security standard, but they’re not. They’re designed to the lowest standard or, at best, somewhere in the middle. They’re designed sloppily, in an ad hoc manner, with efficiency in mind. Security is a requirement, more or less, but it’s a secondary priority: far less important than functionality, and the first thing compromised when schedules get tight.

Should the government—ours, someone else’s?—stop outsourcing code development? That’s the wrong question to ask. Code isn’t magically more secure when it’s written by someone who receives a government paycheck than when it’s written by someone who receives a corporate paycheck. It’s not magically less secure when it’s written by someone who speaks a foreign language, or is paid by the hour instead of by salary. Writing all your code in-house isn’t even a viable option anymore; we’re all stuck with software written by who-knows-whom in who-knows-which-country. And we need to figure out how to get security from that.

The traditional solution has been defense in depth: layering one mediocre security measure on top of another mediocre security measure. So we have the security embedded in our operating system and applications software, the security embedded in our networking protocols, and our additional security products such as antivirus and firewalls. We hope that whatever security flaws—either found and exploited, or deliberately inserted—there are in one layer are counteracted by the security in another layer, and that when they’re not, we can patch our systems quickly enough to avoid serious long-term damage. That is a lousy solution when you think about it, but we’ve been more-or-less managing with it so far.

Bringing all software—and hardware, I suppose—development in-house under some misconception that proximity equals security is not a better solution. What we need is to improve the software development process, so we can have some assurance that our software is secure—regardless of what coder, employed by what company, and living in what country, writes it. The key word here is “assurance.”

Assurance is less about developing new security techniques than about using the ones we already have. It’s all the things described in books on secure coding practices. It’s what Microsoft is trying to do with its Security Development Lifecycle. It’s the Department of Homeland Security’s Build Security In program. It’s what every aircraft manufacturer goes through before it fields a piece of avionics software. It’s what the NSA demands before it purchases a piece of security equipment. As an industry, we know how to provide security assurance in software and systems. But most of the time, we don’t care; commercial software, as insecure as it is, is good enough for most purposes.

Assurance is expensive, in terms of money and time, for both the process and the documentation. But the NSA needs assurance for critical military systems and Boeing needs it for its avionics. And the government needs it more and more: for voting machines, for databases entrusted with our personal information, for electronic passports, for communications systems, for the computers and systems controlling our critical infrastructure. Assurance requirements should be more common in government IT contracts.

The software used to run our critical infrastructure—government, corporate, everything—isn’t very secure, and there’s no hope of fixing it anytime soon. Assurance is really our only option to improve this, but it’s expensive and the market doesn’t care. Government has to step in and spend the money where its requirements demand it, and then we’ll all benefit when we buy the same software.

Critical software:
http://www.schneier.com/essay-140.html

Assurance:
http://www.schneier.com/essay-286.html

This essay first appeared in Information Security, as the second part of a point-counterpoint with Marcus Ranum. You can read Marcus’s essay there as well.
http://searchsecurity.techtarget.com/…


Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers “Schneier on Security,” “Beyond Fear,” “Secrets and Lies,” and “Applied Cryptography,” and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT BCSG, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.

Copyright (c) 2010 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.