Crypto-Gram

June 15, 2011

by Bruce Schneier
Chief Security Technology Officer, BT
schneier@schneier.com
http://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <http://www.schneier.com/crypto-gram-1106.html>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively comment section. An RSS feed is available.


In this issue:
      New Siemens SCADA Vulnerabilities Kept Secret
      Yet Another Way to Avoid TSA’s Full-Body Scanners
      News
      Keeping Sensitive Information Out of the Hands of Terrorists Through Self-Restraint
      Man-in-the-Middle Attack Against the MCAT Exam
      Schneier News
      Open-Source Software Feels Insecure

New Siemens SCADA Vulnerabilities Kept Secret

SCADA systems—computer systems that control industrial processes—are one of the ways a computer hack can directly affect the real world. Here, the fears multiply. It’s not bad guys deleting your files, or getting your personal information and taking out credit cards in your name; it’s bad guys spewing chemicals into the atmosphere and dumping raw sewage into waterways. It’s Stuxnet: centrifuges spinning out of control and destroying themselves. Never mind how realistic the threat is; it’s scarier.

Last week, NSS Labs researcher Dillon Beresford was successfully pressured by the Department of Homeland Security not to disclose details “before Siemens could patch the vulnerabilities.”

Beresford wouldn’t say how many vulnerabilities he found in the Siemens products, but said he gave the company four exploit modules to test. He believes that at least one of the vulnerabilities he found affects multiple SCADA-system vendors, which share “commonality” in their products. Beresford wouldn’t reveal more details, but says he hopes to do so at a later date.

We’ve been living with full disclosure for so long that many people have forgotten what life was like before it was routine.

Before full disclosure was the norm, researchers would discover vulnerabilities in software and send details to the software companies—who would ignore them, trusting in the security of secrecy. Some would go so far as to threaten the researchers with legal action if they disclosed the vulnerabilities.

Later on, researchers announced that particular vulnerabilities existed, but did not publish details. Software companies would then call the vulnerabilities “theoretical” and deny that they actually existed. Of course, they would still ignore the problems, and occasionally threaten the researcher with legal action. Then, of course, some hacker would create an exploit using the vulnerability—and the company would release a really quick patch, apologize profusely, and then go on to explain that the whole thing was entirely the fault of the evil, vile hackers.

I wrote that in 2007. Siemens is doing it right now:

Beresford expressed frustration that Siemens appeared to imply the flaws in its SCADA systems gear might be difficult for a typical hacker to exploit because the vulnerabilities unearthed by NSS Labs “were discovered while working under special laboratory conditions with unlimited access to protocols and controllers.”

There were no “‘special laboratory conditions’ with ‘unlimited access to the protocols,'” Beresford wrote Monday about how he managed to find flaws in Siemens PLC gear that would allow an attacker to compromise them. “My personal apartment on the wrong side of town where I can hear gunshots at night hardly defines a special laboratory.” Beresford said he purchased the Siemens controllers with funding from his company and found the vulnerabilities, which he says hackers with bad intentions could do as well.

That’s precisely the point. Me again from 2007:

Unfortunately, secrecy *sounds* like a good idea. Keeping software vulnerabilities secret, the argument goes, keeps them out of the hands of the hackers…. But that assumes that hackers can’t discover vulnerabilities on their own, and that software companies will spend time and money fixing secret vulnerabilities. Both of those assumptions are false. Hackers have proven to be quite adept at discovering secret vulnerabilities, and full disclosure is the only reason vendors routinely patch their systems.

With the pressure off, Siemens is motivated to deal with the PR problem and ignore the underlying security problem.

http://www.wired.com/threatlevel/2011/05/…
The history of full disclosure.
https://www.schneier.com/blog/archives/2007/01/…

Siemens pressuring Beresford.
http://www.networkworld.com/news/2011/…


Yet Another Way to Avoid TSA’s Full-Body Scanners

Last night, at the Third EPIC Champion of Freedom Awards Dinner, we gave an award to Susie Castillo, whose blog post and video about her treatment at the hands of the TSA have inspired thousands to complain about the agency and its treatment of travelers.

Sitting with her at dinner, I learned yet another way to evade the TSA’s full-body scanners: carry a small pet. She regularly travels with her small dog, and has found that she is always directed away from the full-body scanners and through the magnetometers. I suspect that the difficulty of keeping the dog still is why the TSA makes that determination. (The carrier, of course, goes through the x-ray machine.)

I’m not sure what the TSA is going to do now that I’ve publicized this unpublished exception. Those of you who travel with small pets: please let me know what happens.

(For those of you who are appalled that I could give the terrorists ideas on how to evade the full-body scanners, there are already so many ways that one more can’t hurt.)

http://www.susiecastillo.net//2011/4/25/…


News

An FBI surveillance device, designed to be attached to a car, has been taken apart and analyzed.
http://www.ifixit.com/Teardown/…
A recent ruling by the 9th U.S. Circuit Court of Appeals affirms that it’s legal for law enforcement to secretly place a tracking device on your car without a warrant, even if it’s parked in a private driveway.
http://www.executivegov.com/2010/08/…

Scanning fingerprints from six feet away. No information on how accurate it is, but it’ll only get better.
http://www.technologyreview.com/biomedicine/27052/

Bin Laden maintained computer security with an air gap, according to the Associated Press. I’m impressed. It’s hard to maintain this kind of COMSEC discipline.
https://www.schneier.com/blog/archives/2011/05/…

I haven’t written about Dropbox’s security problems; too busy with the book. But here’s an excellent summary article from The Economist.
http://www.economist.com/s/babbage/2011/05/…
The meta-issue is pretty simple. If you expect a cloud provider to do anything more interesting than simply store your files for you and give them back to you at a later date, they are going to have to have access to the plaintext. For most people—Gmail users, Google Docs users, Flickr users, and so on—that’s fine. For some people, it isn’t. Those people should probably encrypt their files themselves before sending them into the cloud.
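
For those who want to do that, here is a minimal sketch of client-side encryption before upload, using the third-party Python “cryptography” package. The file names are only examples, and a real setup would also have to manage and back up the key.

    # Minimal sketch: encrypt a file locally before handing it to any
    # cloud storage provider.  Requires: pip install cryptography
    from cryptography.fernet import Fernet

    def encrypt_file(plain_path, encrypted_path, key):
        f = Fernet(key)  # authenticated symmetric encryption (AES plus HMAC)
        with open(plain_path, "rb") as src:
            ciphertext = f.encrypt(src.read())
        with open(encrypted_path, "wb") as dst:
            dst.write(ciphertext)

    if __name__ == "__main__":
        key = Fernet.generate_key()  # keep this key out of the cloud
        encrypt_file("taxes-2011.pdf", "taxes-2011.pdf.enc", key)
        # Upload only the .enc file.  The provider never sees the plaintext;
        # it also can't index, preview, or deduplicate it.
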
Another security issue with Dropbox:
http://dereknewton.com/2011/04/…

NIST has released “BIOS Protection Guidelines.”
http://csrc.nist.gov/publications/nistpubs/800-147/…
http://www.phrack.com/issues.html?issue=66&id=7

For years, an employee of Cubic Corp—the company that makes the automatic fare card systems for most of the subway systems around the world—forged and then sold monthly passes for the Boston MBTA system. The scheme was discovered by accident. “Cubic Transportation Systems said in a written statement that it is cooperating with authorities. ‘Our company has numerous safeguards designed to prevent fraudulent production or distribution of Charlie Tickets,’ the statement said, referring to the monthly MBTA passes.” It always amuses me when companies pretend the obvious isn’t true in their press releases. “Someone completely broke our system.” “Say that we have a lot of security.” “But it didn’t work.” “Say it anyway; the press will just blindly report it.” To be fair, we don’t—and probably will never—know how this proprietary system was broken. In this case, an insider did it. But did that insider just have access to the system specifications, or was access to blank ticket stock or specialized equipment necessary as well?
https://www.schneier.com/blog/archives/2011/05/…

TSA-style security is now so normal that it’s part of a Disney ride at Walt Disney World in Orlando.
http://www.mouseplanet.com/9624/…

The Centers for Disease Control and Prevention weigh in on preparations for the zombie apocalypse.
http://emergency.cdc.gov/socialmedia/zombies_blog.asp

Blackhole Exploit Kit is now available as a free download.
http://www.theregister.co.uk/2011/05/24/…

Proposed new rules for automobile black boxes in the U.S.
http://www.wired.com/autopia/2011/05/…

It’s amusing to watch the presidential limo immobilized by a steep grade at the U.S. embassy in Dublin. (You’ll get a glimpse of how thick the car doors are toward the end of the video.) It was a spare; the president wasn’t riding in it at the time.
https://www.schneier.com/blog/archives/2011/05/…
Related: a video of President Bush’s limo breaking down in Rome:
http://www.youtube.com/watch?v=iX4-a7VBr4c#t=1m28s

Elcomsoft has cracked Apple’s iOS 4 hardware encryption. Note that they didn’t break AES-256; they figured out how to extract the keys from the hardware (iPhones, iPads). The company “will be releasing the product implementing this functionality for the exclusive use of law enforcement, forensic and intelligence agencies.”
http://.crackpassword.com/2011/05/…
http://.crackpassword.com/2011/05/…

Cyber criminals are getting aggressive with their social engineering tactics.
https://www.schneier.com/blog/archives/2011/05/…

Lockheed Martin hack linked to RSA’s SecurID breach.
http://www.reuters.com/article/2011/05/27/…
http://www.nytimes.com/2011/05/28/business/28hack.html
http://www.rawstory.com/rs/2011/05/27/…
http://www.theregister.co.uk/2011/05/27/…

The U.S. seems to have a secret stealth helicopter. That’s what the U.S. destroyed after a malfunction in Pakistan during the bin Laden assassination. (For helicopters, “stealth” is less concerned with radar signatures and more concerned with acoustical quiet.) There was some talk about Pakistan sending it to China, but they’re returning it to the U.S. I presume that the Chinese got everything they needed quickly.
http://www.nytimes.com/2011/05/06/world/asia/…

A four-volume history of counterintelligence, “CI Reader: An American Revolution Into the New Millennium,” published by the U.S. Office of the National Counterintelligence Executive. (No, I’ve never heard of them, either.)
https://www.schneier.com/blog/archives/2011/06/…

Reporters have been calling me pretty much constantly about spear phishing attacks against Gmail accounts, but I can’t figure out why in the world this is news.
http://www.pcmag.com/article2/0,2817,2386287,00.asp
http://www.schneier.com/essay-227.html
http://www.google.com/hostednews/ap/article/…
http://www.washingtonpost.com/business/…
Attacks from China—old news.
http://www.schneier.com/essay-227.html
Attacks from China against Google—old news.
http://www.schneier.com/essay-306.html
Attacks from China against Google Gmail accounts—old news.
https://www.schneier.com/blog/archives/2010/02/…
Spear phishing attacks from China against senior government officials—old news.
https://www.schneier.com/blog/archives/2009/03/…
There’s even a WikiLeaks cable about this stuff.
https://www.schneier.com/blog/archives/2011/04/…

Daniel Solove on the security vs. privacy debate.
http://www.salon.com/news/politics/war_room/2011/05/…

World War II Tunny cryptanalysis machine rebuilt at Bletchley Park.
http://www.theregister.co.uk/2011/05/26/…

Redaction failures are so common that I stopped blogging about them years ago. This is the first analysis I have seen of technical redaction failures.
http://freedom-to-tinker.com//tblee/…
And here’s the NSA on how to redact.
https://www.schneier.com/blog/archives/2006/02/…

MI6 hacked into an online al-Qaeda magazine and replaced bomb-making instructions with a cupcake recipe. It’s a more polite hack than subtly altering the recipe so it blows up during the making process. (I’ve been told, although I don’t know for sure, that the 1971 “Anarchist Cookbook” has similarly flawed recipes.)
http://www.telegraph.co.uk/news/uknews/…

Tennessee makes password sharing illegal. Of course it won’t work. “State lawmakers in country music’s capital have passed a groundbreaking measure that would make it a crime to use a friend’s login—even with permission—to listen to songs or watch movies from services such as Netflix or Rhapsody.”
http://news.yahoo.com/s/ap/20110601/ap_on_hi_te/…

According to some random news reports, 25% of U.S. criminal hackers are police informants. I have no idea if it’s true, but if I were the FBI I would want everyone to believe that it’s true.
http://www.guardian.co.uk/technology/2011/jun/06/…

Interesting research: Kirill Levchenko, et al. (2011), “Click Trajectories: End-to-End Analysis of the Spam Value Chain,” IEEE Symposium on Security and Privacy 2011, Oakland, California, 24 May 2011. “95% of spam-advertised pharmaceutical, replica and software products are monetized using merchant services from just a handful of banks.” This points to a fruitful avenue to reduce spam: go after the banks.
http://cseweb.ucsd.edu/~savage/papers/Oakland11.pdf
http://www.informationweek.com/news/security/client/…

Here’s a potential new airport screening technology. I know nothing about it.
http://isconimaging.com/technology.htm
http://wholebodyimagingfacts.com/?p=201

A good rant by Patrick Gray on why we secretly love LulzSec.
http://risky.biz/lulzsec

A good rant by Robert Cringely on why we openly hate RSA.
http://www.cringely.com/2011/06/when-engineers-lie/

Adam Shostack’s rant about Patrick Gray’s rant.
http://newschoolsecurity.com/2011/06/…

Why is it so difficult to trace cyber attacks? I’ve been asked this question by countless reporters in the past couple of weeks. Here’s a good explanation. Shorter answer: it’s easy to spoof source location, and it’s easy to hijack unsuspecting middlemen and use them as proxies.
http://www.scientificamerican.com/article.cfm?…
No, mandating attribution won’t solve the problem. Any Internet design will necessarily include anonymity.
http://www.schneier.com/essay-308.html
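
To make the spoofing point concrete, here is a minimal sketch using the Python scapy library: a single ICMP echo request whose source address is simply forged (both addresses are documentation-reserved examples). It needs raw-socket privileges, and many networks now filter spoofed packets on the way out, but nothing in the packet itself tells the recipient who really sent it.

    # Minimal sketch: a packet that lies about where it came from.
    # Requires scapy (pip install scapy) and root privileges for raw sockets.
    from scapy.all import IP, ICMP, send

    forged = IP(src="203.0.113.7", dst="198.51.100.1") / ICMP()
    send(forged, verbose=False)  # the recipient sees 203.0.113.7 as the source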

Status report on the war on photography: Morgan Leigh Manning, “Less than Picture Perfect: The Legal Relationship between Photographers’ Rights and Law Enforcement,” Tennessee Law Review, Vol. 78, p. 105, 2010.
http://papers.ssrn.com/sol3/papers.cfm?…

The non-anonymity of “fill-in-the-bubble” forms.
http://www.freedom-to-tinker.com//wclarkso/…

Malware in Google’s Android:
http://www.theregister.co.uk/2011/06/13/…


Keeping Sensitive Information Out of the Hands of Terrorists Through Self-Restraint

In my forthcoming book (available February 2012), I talk about various mechanisms for societal security: how we as a group protect ourselves from the “dishonest minority” within us. I have four types of societal security systems:

  • moral systems—any internal rewards and punishments;
  • reputational systems—any informal external rewards and punishments;
  • rule-based systems—any formal system of rewards and punishments (mostly punishments)—laws, mostly;
  • technological systems—everything like walls, door locks, cameras, and so on.

We spend most of our effort in the third and fourth categories. I am spending a lot of time researching how the first two categories work.

Given that, I was very interested to see an article by Dallas Boyd in “Homeland Security Affairs”: “Protecting Sensitive Information: The Virtue of Self-Restraint,” in which he basically argues that, out of moral responsibility (he calls it “civic duty”), people should not publish information that terrorists could use. Ignore for a moment the debate about whether publishing information that could give terrorists ideas is actually a bad idea; I don’t think it is. What Boyd is proposing is interesting: he specifically says that censorship is bad and won’t work, and wants to see voluntary self-restraint along with public shaming of offenders.

As an alternative to formal restrictions on communication, professional societies and influential figures should promote voluntary self-censorship as a civic duty. As this practice is already accepted among many scientists, it may be transferrable to members of other professions. As part of this effort, formal channels should be established in which citizens can alert the government to vulnerabilities and other sensitive information without exposing it to a wide audience. Concurrent with this campaign should be the stigmatization of those who recklessly disseminate sensitive information. This censure would be aided by the fact that many such people are unattractive figures whose writings betray their intellectual vanity. The public should be quick to furnish the opprobrium that presently escapes these individuals.

I don’t think it will work, and I don’t even think it’s possible in this international day and age, but it’s interesting to read the proposal.

Protecting Sensitive Information: The Virtue of Self-Restraint:
http://www.hsaj.org/?fullarticle=7.1.10

More articles:
http://yro.slashdot.org/story/11/05/27/2324227/…
http://www.fas.org//secrecy/2011/05/…


Man-in-the-Middle Attack Against the MCAT Exam

In Applied Cryptography, I wrote about the “Chess Grandmaster Problem,” a man-in-the-middle attack. Basically, Alice plays chess remotely with two grandmasters. She plays Grandmaster 1 as white and Grandmaster 2 as black, and simply replays each grandmaster’s moves into the other game, convincing both of them that she’s a grandmaster in the process.
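
Here is a minimal sketch of the relay, just to make the structure of the attack explicit. The receive_move and send_move functions are hypothetical stubs for whatever channel Alice uses to play each remote opponent.

    # Minimal sketch of the Chess Grandmaster relay (a man-in-the-middle).
    # receive_move/send_move are hypothetical stubs for Alice's two games.

    def receive_move(player):
        """Hypothetical: block until `player` sends their next move."""
        raise NotImplementedError

    def send_move(player, move):
        """Hypothetical: forward `move` to `player` as if it were Alice's own."""
        raise NotImplementedError

    def relay_game(gm_playing_white, gm_playing_black, max_moves=200):
        # Alice is black against gm_playing_white and white against
        # gm_playing_black; she never has to choose a move herself.
        for _ in range(max_moves):
            white_move = receive_move(gm_playing_white)
            send_move(gm_playing_black, white_move)  # replayed as Alice's white move
            black_move = receive_move(gm_playing_black)
            send_move(gm_playing_white, black_move)  # replayed as Alice's black reply

Alice’s chess skill is irrelevant; she only copies moves across, which is essentially what the MCAT scheme below does with questions and answers.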

Detecting these sorts of man-in-the-middle attacks is difficult, and involves things like synchronous clocks, complex cryptographic protocols, or—more practically—proctors. Proctors, of course, can be fooled. Here’s a real-world attempt at this type of attack against the MCAT medical-school admissions test.

Police allege he used a pinhole camera and wireless technology to transmit images of the questions on a computer screen back to his co-conspirator, Ruben, at the University of British Columbia.

Investigators believe Ruben then tricked three other students, who thought they were taking a multiple choice test for a job to be an MCAT tutor, into answering the questions.

The answers were then transmitted back by phone to Rezazadeh-Azar, as he continued on with the test in Victoria, police allege.

http://www.cbc.ca/news/canada/british-columbia/…
And as long as we’re on the topic, we can think about all the ways to hack this system of remote exam proctoring via webcam.
http://www.ao.uiuc.edu/support/source/…


Schneier News

I’m speaking at Computers, Freedom, and Privacy in Washington DC on June 16.
http://www.cfp.org/2011/wiki/index.php/Main_Page


Open-Source Software Feels Insecure

At first glance, this seems like a particularly dumb opening line of an article:

Open-source software may not sound compatible with the idea of strong cybersecurity, but….

But it’s not. Open source does sound like a security risk. Why would you want the bad guys to be able to look at the source code? They’ll figure out how it works. They’ll find flaws. They’ll—in extreme cases—sneak back-doors into the code when no one is looking.

Of course, these statements rely on the erroneous assumptions that security vulnerabilities are easy to find, that proprietary source code makes them harder to find, and that secrecy is somehow aligned with security. I’ve written about this several times in the past, and there’s no need to rewrite the arguments again.

Still, we have to remember that the popular wisdom is that secrecy equals security, and open-source software doesn’t sound compatible with the idea of strong cybersecurity.

http://www.innovationnewsdaily.com/…

Me on open-source security:
http://www.schneier.com/essay-056.html
http://www.schneier.com/crypto-gram-0205.html#1
http://www.schneier.com/…


Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers “Schneier on Security,” “Beyond Fear,” “Secrets and Lies,” and “Applied Cryptography,” and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT BCSG, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.

Copyright (c) 2011 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.