May 15, 2003
by Bruce Schneier
A free monthly newsletter providing summaries, analyses, insights, and commentaries on computer security and cryptography.
Back issues are available at <http://www.schneier.com/crypto-gram.html>. To subscribe, visit <http://www.schneier.com/crypto-gram.html> or send a blank message to email@example.com.
Copyright (c) 2003 by Counterpane Internet Security, Inc.
In the long-titled "Report of the Director of the Administrative Office of the United States Courts on Applications for Orders Authorizing or Approving the Interception of Wire, Oral, or Electronic Communications," we find the following interesting quote:
"Public Law 106-197 amended 18 U.S.C. 2519(2)(b) in 2001 to require that reporting should reflect the number of wiretap applications granted in which encryption was encountered and whether such encryption prevented law enforcement officials from obtaining the plain text of communications intercepted pursuant to the court orders. In 2002, no federal wiretap reports indicated that encryption was encountered. State and local jurisdictions reported that encryption was encountered in 16 wiretaps terminated in 2002; however, in none of these cases was encryption reported to have prevented law enforcement officials from obtaining the plain text of communications intercepted. In addition, state and local jurisdictions reported that encryption was encountered in 18 wiretaps that were terminated in calendar year 2001 or earlier, but were reported for the first time in 2002; in none of these cases did encryption prevent access to the plain text of communications intercepted." (Pages 10-11.)
Two points immediately stand out:
1) Encryption of phone communications is very uncommon. Sixteen cases of encryption out of 1,358 wiretaps is a little more than one percent. Almost no suspected criminals use voice encryption.
2) Encryption of phone conversations isn't very effective. Every time law enforcement encountered encryption, they were able to bypass it. I assume that local law enforcement agencies don't have the means to brute-force DES keys (for example). My guess is that the voice encryption was relatively easy to bypass.
These two points can be easily explained by the fact that telephones are closed devices. Users can't download software onto them like they can on computers. No one can write a free encryption program for phones. Even software manufacturers will find it more expensive to sell an added feature for a phone system than for a computer system.
This means that telephone security is a narrow field. Encrypted phones are expensive. Encrypted phones are designed and manufactured by companies who believe in secrecy. Telephone encryption is closed from scrutiny; the software is not subject to peer review. It should come as no surprise that the result is a poor selection of expensive, lousy telephone security products.
For decades, the debate about whether openness helps or hurts security has continued. It's obvious to us security people that secrecy hurts security, but it's so counterintuitive to the general population that we continually have to defend our position. This wiretapping report provides hard evidence that a closed security design methodology -- the "trust us because we know these things" way of building security products -- doesn't work. The U.S. government hasn't encountered a telephone encryption product that they couldn't easily break.
My essay on secrecy and security:
Crypto-Gram is currently in its sixth year of publication. Back issues cover a variety of security-related topics, and can all be found on <http://www.schneier.com/crypto-gram.html>. These are a selection of articles that appeared in this calendar month in other years.
Secrecy, Security, and Obscurity
Fun with Fingerprint Readers
What Military History Can Teach Computer Security, Part 2
The Futility of Digital Copy Protection
Safe Personal Computing
Computer Security: Will We Ever Learn?
Trusted Client Software
The IL*VEYOU Virus (Title bowdlerized to foil automatic e-mail filters.)
The Internationalization of Cryptography
The British discovery of public-key cryptography
The UK is considering e-voting:
Funny password story. And people wonder why security is so hard....
Three interesting essays by Andrew Odlyzko.
Cyberterrorism, reality and hype:
Another article about people mistakenly on the terrorist "watch list." The problem is what I wrote about last month: there is an incentive for law enforcement to put people on this list, but no incentive for them to take people off. So the harassment continues.
New Web site/blog/whatever. Fun reading.
Howard Schmidt resigns as the White House Cybersecurity Advisor, and joins eBay as the VP of Security. I'm not sure what this means for government computer security, but my guess is that it's not good.
The April Fool's RFC from two years ago. I don't think I ever linked to it. (It would be funnier if it weren't so true.)
Interview with Paul Kocher. Good points about copy protection.
Security problem at Apple.com. These kinds of problems are pretty common, and are generally the result of programmers taking clever shortcuts when designing Web forms and other interactive features. It's just easy to store data in the URL, and who thinks about security?
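As a hypothetical sketch of this class of bug (the details of the Apple.com problem are not described above), consider a checkout form that embeds a sensitive value, such as a price, in the URL's query string and then trusts whatever comes back. Anyone who edits the URL controls that value. The fix is to carry only an opaque identifier in the URL and look the real value up server-side. All names here are illustrative:

```python
# Illustrative sketch: why storing trusted data in the URL backfires.
from urllib.parse import urlencode, parse_qs

# Insecure pattern: the checkout handler believes the price that the
# form itself placed in the query string.
def insecure_checkout(query_string):
    params = parse_qs(query_string)
    return float(params["price"][0])  # attacker-controlled value

# Safer pattern: the URL carries only an opaque item ID; the
# authoritative price lives server-side.
CATALOG = {"sku-1001": 49.99}

def secure_checkout(query_string):
    params = parse_qs(query_string)
    return CATALOG[params["item"][0]]  # URL price, if any, is ignored

honest = urlencode({"item": "sku-1001", "price": "49.99"})
tampered = urlencode({"item": "sku-1001", "price": "0.01"})

print(insecure_checkout(tampered))  # 0.01 -- the clever shortcut backfires
print(secure_checkout(tampered))    # 49.99 -- the server-side lookup holds
```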
A Korean group is suing Microsoft for damages caused by the SQL Slammer worm. This will be an interesting case to follow, and probably a harbinger of things to come.
A majority of cybercrime losses are due to data theft. I've been saying this for a while now.
Counterpane is exhibiting at and participating in several upcoming events: ISACA's North America CACS conference in Houston (May 18-20) and Cyber Security 2003 in Sacramento (May 20).
Bruce Schneier recently received a Lifetime Achievement Award from SC Magazine.
Store owners want their salespeople to ring up a sale and provide a receipt, because that practice also generates an internal register receipt and makes it harder for salespeople to steal from the register: It produces an accurate audit trail. Honest salespeople don't care one way or another, and in stores where returns are not common -- such as fast-food restaurants or convenience stores -- neither do the customers. A common security practice is to put a sign on the register that says: "Your purchase free if I fail to give a receipt." What that sign does is give the customer an interest in paying attention to whether or not she gets a receipt, and in immediately reporting an employee who doesn't give her one (by demanding her purchase free). It enlists her as a security agent to defend against employee theft. The customer has the capability to perform this security function, and the sign gives her the incentive.
A common security countermeasure against spam is to use unique e-mail addresses when signing up for things. If someone uses a different e-mail address every time he gets an Amazon account, signs up for a mailing list, or sends off for information, he gets two security benefits. One, he can track who sells his e-mail address to whom. And two, he can turn e-mail addresses off when they get too well known. For someone who has a large number of e-mail addresses available and can point them all to a single e-mail address, it's a quick and easy security countermeasure. I've done it myself.
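A minimal sketch of this countermeasure, assuming a mail setup that supports "plus" subaddressing (where user+anything@domain is delivered to user's mailbox, as sendmail and many mail providers allow). The names and domain are illustrative, not from the original:

```python
# Sketch: per-service e-mail addresses for tracking who leaked yours.

def tagged_address(user, domain, service):
    """Generate a unique address to hand out to exactly one service."""
    return f"{user}+{service}@{domain}"

def leaked_by(incoming_to, user, domain):
    """Given the To: address on a spam message, recover which service
    the address was originally given to (or None if it isn't ours)."""
    local, _, addr_domain = incoming_to.partition("@")
    if addr_domain != domain or "+" not in local:
        return None
    handed_out_user, _, tag = local.partition("+")
    return tag if handed_out_user == user else None

addr = tagged_address("alice", "example.com", "amazon")
print(addr)  # alice+amazon@example.com

# Spam later arrives addressed to that unique string:
print(leaked_by("alice+amazon@example.com", "alice", "example.com"))  # amazon
```

The same scheme also supports the second benefit described above: once a tag is burned, mail to that one address can be filtered or rejected without touching the real mailbox.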
That's our policy, and I know we stick to it. But if that's true, how did his unique e-mail address get onto spam lists? We Googled for the unique e-mail address he'd used, and found that he'd posted a message to a mailing list in which he accidentally used it. Some spam harvester must have recently found that. He'd even recognized his mistake at the time -- only because someone asked if the string "counterpane" in his address had anything to do with me -- but of course four years later it's hard to remember.
The address involved was of the format firstname.lastname@example.org, which provides a fairly obvious potential for framing. There are 63 addresses of the form "counterpane@foo" in my subscription logs. I'll bet at least some of those people have a corresponding "amazon@foo" for their Amazon accounts. Good thing I don't want to make Amazon look bad....
From: "Ian C. Blenke" <ian@blenke.com>
While the "slashdot spam" scenario for postal abuse has become feasible recently, abusing phone service victims with automated fax systems has been possible for quite some time. Try googling for "request catalog fax" or "request whitepaper fax".
From: "Stéphane Doyon" <s.doyon@videotron.ca>
> Individual catalog companies can protect themselves by
I would like to point out however that this technique is very frustrating for blind people like me. Yet another barrier to Web accessibility. (I don't order paper catalogs much of course, but I was blocked by this technique once or twice for other transactions...)
From: "Steven M. Bellovin" <smb@research.att.com>
>A couple of weeks ago I was listening to a baseball game on the
I suspect that that is a valid security measure, albeit not because of terrorism. They're trying to deny people small, heavy objects that they can throw easily -- a problem that has happened. (A few years ago, when John Rocker was persona non grata among New York Mets fans, batteries were the weapon of choice.)
Not understanding the threat model can make lots of security risks seem absurd. Imagine what someone who had never heard of timing attacks would think of the RSA blinding step.
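For readers who haven't seen it, the RSA blinding step Bellovin mentions looks bizarre without its threat model: before the private-key exponentiation, the input is multiplied by a random r raised to the public exponent, and the factor of r is divided out afterward. The point is that the value actually exponentiated is unpredictable to an attacker, so the operation's timing no longer correlates with the attacker-chosen ciphertext. A toy illustration with textbook-RSA numbers (n = 3233 = 61 * 53 -- not secure parameters):

```python
# Toy sketch of RSA blinding as a timing-attack countermeasure.
import random
from math import gcd

n, e, d = 3233, 17, 2753  # tiny keypair: e*d = 1 mod lcm(60, 52)

def blinded_decrypt(c):
    r = random.randrange(2, n - 1)
    while gcd(r, n) != 1:               # r must be invertible mod n
        r = random.randrange(2, n - 1)
    blinded = (c * pow(r, e, n)) % n    # randomize the actual input
    s = pow(blinded, d, n)              # timing now independent of c
    return (s * pow(r, -1, n)) % n      # unblind: s = c^d * r mod n

m = 65
c = pow(m, e, n)
print(blinded_decrypt(c) == m)  # True: same answer as plain pow(c, d, n)
```

Stripped of context, the random multiply-then-divide looks like pointless extra work -- which is exactly the letter's point about not understanding the threat model.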
From: Erwann Abalea <erwann.abalea@certplus.com>
This kind of security measure has already been applied in France, and maybe in England, for soccer competitions. The problem is really about security and not about the control of a market, since drinks are not sold in those stadiums. The fact is that some "hooligans" use bottles, cans, or anything solid to hit other people, or throw these objects on the playfield to hurt the players or the referees.
From: Jon Woodcock <jpwoodcock@yahoo.co.uk>
Your baseball piece brought to mind another abuse of the terrorism argument to justify actions motivated by a personal agenda. The Mayor of Chicago recently arranged for the runway of a local airfield he doesn't like to be dug up in the middle of the night -- his justification, "Homeland Security." See the saga unfold at <http://www.aopa.org/whatsnew/newsitems/2003/...>.
From: "Vladimir G. Ivanovic" <vladimir@acm.org>
After reading the first 60 or so pages of the report, it strikes me that the Committee's recommendations might be effective against yesterday's security threats (airplanes-as-missiles). Yesterday's threats succeeded because the security procedures in place at the time were designed to be effective against an even older threat (hijackings). Something is wrong here...
If I were a terrorist, I would not use box cutters. I'd try something different. In fact, why target airplanes at all? A cruise liner, a stadium, a bridge during rush-hour traffic, a nice chemical plant, all would make spectacular headlines.
From: Nathan Rosenblum <flanders@murf.to>
>Saudi terrorist sympathizers learn computer security at
Characterizing Mr. al-Hussayen as a "terrorist sympathizer" is inaccurate. At most, Sami is suspected of being involved with organizations that contribute to terrorism. It should be noted that the actual charges against him relate to visa violations stemming from the fact that he allegedly did not report membership in an organization while applying for a visa. Additionally, he is accused of working for the IANA while studying at the University of Idaho (persons holding student visas are not permitted to engage in activity unrelated to their academic pursuits).
While I have difficulty believing that my former colleague knowingly aided a terror-supporting organization through the alleged financial transfers or through Web site development, it certainly is not impossible. Still, I feel that it would be more responsible to prepend "suspected" to "terrorist sympathizer." Indeed, even if Sami al-Hussayen is convicted of the charges against him and is forced to leave with his family for Saudi Arabia, it will still not be possible to characterize him as a "terrorist sympathizer" on that basis; Sami will not be tried on terror-related charges.
CRYPTO-GRAM is a free monthly newsletter providing summaries, analyses, insights, and commentaries on computer security and cryptography. Back issues are available on <http://www.schneier.com/crypto-gram.html>.
To subscribe, visit <http://www.schneier.com/crypto-gram.html> or send a blank message to email@example.com. To unsubscribe, visit <http://www.schneier.com/crypto-gram-faq.html>.
Please feel free to forward CRYPTO-GRAM to colleagues and friends who will find it valuable. Permission is granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Schneier is founder and CTO of Counterpane Internet Security Inc., the author of "Secrets and Lies" and "Applied Cryptography," and an inventor of the Blowfish, Twofish, and Yarrow algorithms. He is a member of the Advisory Board of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on computer security and cryptography.
Counterpane Internet Security, Inc. is the world leader in Managed Security Monitoring. Counterpane's expert security analysts protect networks for Fortune 1000 companies world-wide.
Copyright (c) 2003 by Counterpane Internet Security, Inc.