November 15, 2012

by Bruce Schneier
Chief Security Technology Officer, BT

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <>.

You can read this issue on the web at <>. These same essays and news items appear in the "Schneier on Security" blog at <>, along with a lively comment section. An RSS feed is available.

In this issue:
      Stoking Cyber Fears
      Hacking TSA PreCheck
      News
      Encryption in Cloud Computing
      Schneier News
      The Risks of Trusting Experts

Stoking Cyber Fears

A lot of the debate around President Obama's cybersecurity initiative centers on how much of a burden it would be on industry, and how that should be financed. As important as that debate is, it obscures some of the larger issues surrounding cyberwar, cyberterrorism, and cybersecurity in general.

It's difficult to have any serious policy discussion amid the fear mongering. Secretary Panetta's recent comments are just the latest; search the Internet for "cyber 9/11," "cyber Pearl Harbor," "cyber Katrina," or -- my favorite -- "cyber Armageddon."

There's an enormous amount of money and power that results from pushing cyberwar and cyberterrorism: power within the military, the Department of Homeland Security, and the Justice Department; and lucrative government contracts supporting those organizations. As long as cyber remains a prefix that scares, it'll continue to be used as a bugaboo.

But while scare stories are more movie-plot than actual threat, there are real risks. The government is continually poked and probed in cyberspace, by attackers ranging from kids playing politics to sophisticated national intelligence gathering operations. Hackers can do damage, although nothing like the cyberterrorism rhetoric would lead you to believe. Cybercrime continues to rise, and still poses real risks to those of us who work, shop, and play on the Internet. And cyberdefense needs to be part of our military strategy.

Industry has definitely not done enough to protect our nation's critical infrastructure, and the federal government may need to become more involved. This should come as no surprise; the economic externalities in cybersecurity are so great that even the freest free market would fail.

For example, the owner of a chemical plant will protect that plant from cyber attack up to the value of that plant to the owner; the residual risk to the community around the plant will remain. Politics will color how government involvement looks: market incentives, regulation, or outright government takeover of some aspects of cybersecurity.

None of this requires heavy-handed regulation. Over the past few years we've heard calls for the military to better control Internet protocols; for the United States to be able to "kill" all or part of the Internet, or to cut itself off from the greater Internet; for increased government surveillance; and for limits on anonymity. All of those would be dangerous, and would make us less secure.

The world's first military cyberweapon, Stuxnet, was used by the United States and Israel against Iran.

In all of this government posturing about cybersecurity, the biggest risk is a cyberwar arms race; and that's where remarks like Panetta's lead us. Increased government spending on cyberweapons and cyberdefense, and an increased militarization of cyberspace, are both expensive and destabilizing. Fears lead to weapons buildups, and weapons beg to be used.

I would like to see less fear mongering, and more reasoned discussion about the actual threats and reasonable countermeasures. Pushing the fear button benefits no one.

This essay originally appeared in the New York Times "Room for Debate" blog.

Here are the other essays on the topic.

Hacking TSA PreCheck

I have a hard time getting worked up about this story:

I have X'd out any information that you could use to change my reservation. But it's all there: PNR, seat assignment, flight number, name, etc. But what is interesting is the bolded three on the end. This is the TSA Pre-Check information. The number means the number of beeps. 1 beep no Pre-Check, 3 beeps yes Pre-Check. On this trip as you can see I am eligible for Pre-Check. Also this information is not encrypted in any way.
What terrorists or really anyone can do is use a website to decode the barcode and get the flight information, put it into a text file, change the 1 to a 3, then use another website to re-encode it into a barcode. Finally, using a commercial photo-editing program or any program that can edit graphics, replace the barcode in their boarding pass with the new one they created. Even more scary is that people can do this to change names. So if they have a fake ID they can use this method to make a valid boarding pass that matches their fake ID. The really scary part is that this will get past the TSA document checker, because the scanners the TSA use are just barcode decoders; they don't check against the real-time information. So the TSA document checker will not pick up on the alterations. This means, as long as they sub in a 3, they can always use the Pre-Check line.
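The quoted attack works only because the barcode payload carries no integrity protection: it's plain text, and any one-character edit produces an equally "valid" barcode. Here's a minimal sketch of how a keyed MAC over the payload would let a scanner detect the 1-to-3 edit offline, with no real-time database lookup. The payload format, field layout, and shared key below are hypothetical illustrations, not the real boarding-pass format:

```python
import hashlib
import hmac

SECRET_KEY = b"shared-airline-tsa-key"  # hypothetical shared secret

def sign(payload: str) -> str:
    """Append a truncated HMAC-SHA256 tag to the barcode payload."""
    tag = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()[:16]
    return payload + "|" + tag

def verify(signed: str) -> bool:
    """Recompute the tag over the payload and compare in constant time."""
    payload, _, tag = signed.rpartition("|")
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(tag, expected)

# A simplified, hypothetical payload ending in the screening flag:
# "1" = regular screening, "3" = PreCheck.
real = sign("DOE/JOHN ABC123 UA1234 22C 1")
forged = real.replace(" 1|", " 3|")  # the attacker's one-character edit

# verify(real)   -> True
# verify(forged) -> False: the tag no longer matches the edited payload
```

The point is that detecting tampering doesn't require encrypting anything or checking a reservation database; it only requires that the scanners hold a verification key.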

What a dumb way to design the system. It would be easier -- and far more secure -- if the boarding pass checker just randomly chose 10%, or whatever percentage they want, of PreCheck passengers to send through regular screening. Why go to the trouble of encoding it in the barcode and then reading it?
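That random-selection alternative is trivial to implement, and it consults no per-passenger data at all -- so there is nothing on the boarding pass to forge. A sketch, with the 10% figure as the example rate:

```python
import random

def needs_regular_screening(rate: float = 0.10) -> bool:
    """Decide, independently for each PreCheck passenger, whether to
    divert them to regular screening. The decision uses no passenger
    data, so altering a boarding pass cannot influence it."""
    return random.random() < rate

# Over many passengers, roughly `rate` of them get diverted.
diverted = sum(needs_regular_screening() for _ in range(100_000))
```

An attacker who forges the PreCheck flag still faces the same 10% chance of full screening as everyone else, which is the property the barcode-encoded flag fails to provide.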

And -- of course -- this means that you can still print your own boarding pass.

On the other hand, I think the PreCheck level of airport screening is what everyone should get, and that the no-fly list and the photo ID check add nothing to security. So I don't feel any less safe because of this vulnerability.

Still, I am surprised. Is this the same in other countries? Lots of countries scan my boarding pass before allowing me through security: France, the Netherlands, the UK, Japan, even Uruguay at Montevideo Airport when I flew out of there yesterday. I always assumed that those systems were connected to the airlines' reservation databases. Does anyone know?

Print your own boarding pass:


News

Interesting paper: "Before We Knew It: An Empirical Study of Zero-Day Attacks in the Real World," by Leyla Bilge and Tudor Dumitras.

There's a new report from the Presidential Commission for the Study of Bioethical Issues. It's called "Privacy and Progress in Whole Genome Sequencing." The Commission described the rapid advances underway in the field of genome sequencing, but also noted growing concerns about privacy and security. The report lists twelve recommendations to improve current practices and to help safeguard privacy and security, including using deidentification wherever possible.

An analysis of how Bitcoin is actually used: "Quantitative Analysis of the Full Bitcoin Transaction Graph," by Dorit Ron and Adi Shamir.
Commentary on the paper:

noPhoto attaches to car license plates. It reacts to a camera flash, and jams the image with a bright light.
The website makes the point that this is legal, but that can't last.

Weaponizing office supplies:

Peter Swire and Yianni Lagos have pre-published a law journal article on the risks of data portability. It specifically addresses an EU data protection regulation, but the security discussion is more general. "...Article 18 poses serious risks to a long-established E.U. fundamental right of data protection, the right to security of a person's data. Previous access requests by individuals were limited in scope and format. By contrast, when an individual's lifetime of data must be exported 'without hindrance,' then one moment of identity fraud can turn into a lifetime breach of personal data." They have a point. If you're going to allow users to download all of their data with one command, you might want to double- and triple-check that command. Otherwise it's going to become an attack vector for identity theft and other malfeasance.

Sony PlayStation 3 master key leaked:

Protecting (and collecting) the DNA of world leaders:

Detecting fake hurricane photographs: a short tutorial.

Dan Ariely on dishonesty:

Rap News on Internet surveillance:

Really nice profile of Peter Neumann in the "New York Times." It includes a discussion of the Clean Slate program.

Interesting "This American Life" show on loopholes. The first part is about getting around the Church's ban against suicide. The second part is about an interesting insurance scheme.

I've written about the ineffectiveness of airport security patdowns before, but not half as well as this story:

World War II encoded message found attached to dead carrier pigeon's leg.

Commentary on New Jersey's decision to allow voting by e-mail for the recent presidential election.

This new vulnerability against industrial control systems doesn't look good. These are often called SCADA vulnerabilities, although it isn't SCADA that's involved here. They're against programmable logic controllers (PLCs): the same industrial controllers that Stuxnet attacked.

There's a three-rotor Enigma machine up for auction. It's expensive, but it's in complete working order. They're also auctioning off a complete set of rotors; those are even rarer than the machines -- which are often missing their rotors.

Regulations as a Prisoner's Dilemma:
This is the sort of thing I wrote about in my latest book.

New SSL vulnerability.

Two great concepts: micromorts and microlives.

Gary McGraw on national cybersecurity:

How terrorist groups disband:

From the Department of Homeland Security, a handy list of 19 suspicious behaviors that could indicate that a hotel guest is actually a terrorist. I myself have done several of these.
More generally, this is another example of why all the "see something, say something" campaigns fail: "If you ask amateurs to act as front-line security personnel, you shouldn't be surprised when you get amateur security."

Mother fairy wrens teach their chicks passwords while they're still in their eggs to tell them from cuckoo impostors.
It's worth noting that this is primarily of use to the chicks' parents, so they know not to expend time and energy on the impostor cuckoo chick. Cuckoo chicks, as part of *their* evolutionary adaptation, kick the real chicks out of the nest, so they're lost in any case. It's the fact that the signal allows the parents to identify impostors and start a new brood that's of evolutionary advantage.

Dan Boneh of Stanford University is offering a free online cryptography course. The course runs for six weeks, and has five to seven hours of coursework per week. It just started last week.
A second part will be starting in January.

Were the keys to the Crown Jewels stolen?

Webmail as a dead drop:

The terrorist risk of food trucks:

Encryption in Cloud Computing

This article makes the important argument that encryption -- where the user and not the cloud provider holds the keys -- is critical to protect cloud data. The problem is, it upsets cloud providers' business models:

In part it is because encryption with customer controlled keys is inconsistent with portions of their business model. This architecture limits a cloud provider's ability to data mine or otherwise exploit the users' data. If a provider does not have access to the keys, they lose access to the data for their own use. While a cloud provider may agree to keep the data confidential (i.e., they won't show it to anyone else) that promise does not prevent their own use of the data to improve search results or deliver ads. Of course, this kind of access to the data has huge value to some cloud providers and they believe that data access in exchange for providing below-cost cloud services is a fair trade.
Also, providing onsite encryption at rest options might require some providers to significantly modify their existing software systems, which could require a substantial capital investment.

That second reason is actually very important, too. A lot of cloud providers don't just store client data, they do things with that data. If the user encrypts the data, it's an opaque blob to the cloud provider -- and a lot of cloud services would be impossible.
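To make the "opaque blob" point concrete, here is a toy sketch of client-side encryption using only Python's standard library. The HMAC-based keystream is illustrative only -- a real system would use an authenticated cipher such as AES-GCM -- but the architecture is the one that matters: the client keeps the key, and the provider stores bytes it cannot read:

```python
import hashlib
import hmac
import os

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data against an HMAC-SHA256 keystream.
    XOR is symmetric, so the same call encrypts and decrypts."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# The client generates and holds the key; the provider only ever
# receives (nonce, blob).
key = os.urandom(32)
nonce = os.urandom(16)
document = b"quarterly sales figures"

blob = keystream_xor(key, nonce, document)   # what gets uploaded
restored = keystream_xor(key, nonce, blob)   # decrypted locally
```

Without `key`, the provider cannot search, index, or mine `blob` -- which is exactly why this architecture conflicts with business models built on access to user data.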

Lots of companies are trying really hard to solve parts of this problem, but a truly optimal solution still eludes us.


Schneier News

Is anyone out there interested in buying a pile of copies of my "Liars and Outliers" for a giveaway and book signing at the RSA Conference? I can guarantee enormous crowds at your booth for as long as there are books to give away. This could also work for an after-hours event. Please let me know. I can get you a great bulk order price with my publisher.

I updated a 2006 essay of mine on the security issues around sports doping.

I am speaking at Tietoturva 2013 in Helsinki on November 27:

I am speaking at IA12: Securing Opportunities in Cyberspace in London on December 3.

I am speaking at Impacts and Risks of Artificial General Intelligence (AGI-Impacts) in Oxford on December 10.

The Risks of Trusting Experts

I'm not sure what to think about this story:

Six Italian scientists and an ex-government official have been sentenced to six years in prison over the 2009 deadly earthquake in L'Aquila.
A regional court found them guilty of multiple manslaughter.
Prosecutors said the defendants gave a falsely reassuring statement before the quake, while the defence maintained there was no way to predict major quakes.
The 6.3 magnitude quake devastated the city and killed 309 people.

These were all members of the National Commission for the Forecast and Prevention of Major Risks, and some of Italy's most prominent and internationally respected seismologists and geological experts. Basically, the problem was that they failed to hedge their bets against the earthquake. In a press conference just before the earthquake, they incorrectly assured locals that there was no danger. This, according to the court, was equivalent to manslaughter.

No, it doesn't make any sense.

David Rothery, of the UK's Open University, said earthquakes were "inherently unpredictable".
"The best estimate at the time was that the low-level seismicity was not likely to herald a bigger quake, but there are no certainties in this game," he said.

Even the defendants were confused:

Another, Enzo Boschi, described himself as "dejected" and "desperate" after the verdict was read.
"I thought I would have been acquitted. I still don't understand what I was convicted of."

I do. He was convicted because the public wanted revenge -- and the scientists were their most obvious targets.

Needless to say, this is having a chilling effect on scientists talking to the public. Enzo Boschi, president of Italy's National Institute of Geophysics and Volcanology (INGV) in Rome, said: "When people, when journalists, asked my opinion about things, I used to tell them, but no more. Scientists have to shut up." Also, as part of their conviction, those scientists are prohibited from ever holding public office again.

From a security perspective, this seems like the worst possible outcome. The last thing we want of our experts is for them to refuse to give us the benefits of their expertise.

To be fair, the verdict isn't final. There are always appeals in Italy, and at least one level of appeal is certain in this case. Everything might be overturned, but I'm sure the chilling effect will remain, regardless.

As someone who constantly makes predictions about security that could potentially affect the livelihood and lives of those who listen to them, this really made me stop and think. Could I be arrested, or sued, for telling people that this particular security product is effective when in fact it is not? I am forever minimizing the risks of terrorism in general and airplane terrorism in particular. Sooner or later, there will be another terrorist event. Will that make me guilty of manslaughter as well? Italy is a long way away, but everything I write on the Internet reaches there.

Here is an article in "New Scientist" that gives the prosecutor's side of things. According to the prosecutor, this case was not about prediction. It was about communication. It wasn't about the odds of the quake, it was about how those odds were communicated to the public.

Oddly enough, there is a large amount of case law in this area, with weathermen as the target. This two-part article, "Bad Weather? Then Sue the Weatherman," is fascinating.

Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers "Liars and Outliers," "Beyond Fear," "Secrets and Lies," and "Applied Cryptography," and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.

Copyright (c) 2012 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.