Crypto-Gram

September 15, 2012

by Bruce Schneier
Chief Security Technology Officer, BT
schneier@schneier.com
http://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <http://www.schneier.com/crypto-gram-1209.html>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively comment section. An RSS feed is available.


In this issue:
      The Importance of Security Engineering
      Security at the 9/11 WTC Memorial
      News
      Poll: Americans Like the TSA
      Five “Neglects” in Risk Management
      Schneier News
      Is iPhone Security Really this Good?


The Importance of Security Engineering

In May, neuroscientist and popular author Sam Harris and I debated the issue of profiling Muslims at airport security. We each wrote essays, then went back and forth on the issue. I don’t recommend reading the entire discussion; we spent 14,000 words talking past each other. But what’s interesting is how our debate illustrates the differences between a security engineer and an intelligent layman. Harris was uninterested in the detailed analysis required to understand a security system and unwilling to accept that security engineering is a specialized discipline with a body of knowledge and relevant expertise. He trusted his intuition.

Many people have researched how intuition fails us in security: Paul Slovic and Bill Burns on risk perception, Daniel Kahneman on cognitive biases in general, Rick Wash on folk computer-security models. I’ve written about the psychology of security, and Daniel Gardner has written more. Basically, our intuitions are based on things like antiquated fight-or-flight models, and these increasingly fail in our technological world.

This problem isn’t unique to computer security, or even security in general. But this misperception about security matters now more than it ever has. We’re no longer asking people to make security choices only for themselves and their businesses; we need them to make security choices as a matter of public policy. And getting it wrong has increasingly bad consequences.

Computers and the Internet have collided with public policy. The entertainment industry wants to enforce copyright. Internet companies want to continue freely spying on users. Law enforcement wants its own laws imposed on the Internet: laws that make surveillance easier, prohibit anonymity, mandate the removal of objectionable images and texts, and require ISPs to retain data about their customers’ Internet activities. Militaries want laws regarding cyber weapons, laws enabling wholesale surveillance, and laws mandating an Internet kill switch. “Security” is now a catch-all excuse for all sorts of authoritarianism, as well as for boondoggles and corporate profiteering.

Cory Doctorow recently spoke about the coming war on general-purpose computing. I talked about it in terms of the entertainment industry, and Jonathan Zittrain discussed it more generally, but Doctorow sees it as a much broader issue. Preventing people from copying digital files is only the first skirmish; just wait until the DEA wants to prevent chemical printers from making certain drugs, or the FBI wants to prevent 3D printers from making guns.

I’m not here to debate the merits of any of these policies, but instead to point out that people will debate them. Elected officials will be expected to understand security implications, both good and bad, and will make laws based on that understanding. And if they aren’t able to understand security engineering, or even accept that there is such a thing, the result will be ineffective and harmful policies.

So what do we do? We need to establish security engineering as a valid profession in the minds of the public and policy makers. This is less about certifications and (heaven forbid) licensing, and more about perception—and cultivating a security mindset. Amateurs produce amateur security, which costs more in dollars, time, liberty, and dignity while giving us less—or even no—security. We need everyone to know that.

We also need to engage with real-world security problems, and apply our expertise to the variety of technical and socio-technical systems that affect broader society. Everything involves computers, and almost everything involves the Internet. More and more, computer security *is* security.

Finally, and perhaps most importantly, we need to learn how to talk about security engineering to a non-technical audience. We need to convince policy makers to follow a logical approach instead of an emotional one—an approach that includes threat modeling, failure analysis, searching for unintended consequences, and everything else in an engineer’s approach to design. Powerful lobbying forces are attempting to force security policies on society, largely for non-security reasons, and sometimes in secret. We need to stand up for security.

A shorter version of this essay appeared in the September/October 2012 issue of “IEEE Security & Privacy.”

Harris debate:
http://www.samharris.org//item/…
https://www.schneier.com/blog/archives/2012/05/…
http://www.schneier.com/essay-397.html

Security Engineering books:
http://www.amazon.com/…
http://www.amazon.com/Software-Security-Building-In/…

Paul Slovic book:
http://www.amazon.com/…

Bill Burns research:
http://create.usc.edu/research/54570.pdf

Daniel Kahneman book:
http://www.amazon.com/…

Rick Wash research:
http://www.rickwash.com/papers/…

My essay on the psychology of security:
http://www.schneier.com/essay-155.html

Daniel Gardner book:
http://www.amazon.com/…

Internet kill switch:
http://www.schneier.com/essay-321.html

“The War on General Purpose Computing”:
http://boingboing.net/2012/01/10/lockdown.html
http://www.schneier.com/crypto-gram-0110.html#3
http://www.schneier.com/news-012.html

Jonathan Zittrain’s book:
http://www.amazon.com/…

Printing physical objects:
http://www.bbc.co.uk/news/technology-17760085
http://www.forbes.com/sites/markgibbs/2012/07/28/…
http://haveblue.org/?p=1041

The security mindset:
http://www.schneier.com/essay-210.html


Security at the 9/11 WTC Memorial

There’s a lot:

Advance tickets are required to enter this public, outdoor memorial. To book them, you’re obliged to provide your home address, email address, and phone number, and the full names of everyone in your party. It is “strongly recommended” that you print your tickets at home, which is where you must leave explosives, large bags, hand soap, glass bottles, rope, and bubbles. Also banned are “personal wheeled vehicles,” including but not limited to bicycles, skateboards, and scooters, and anything else deemed inappropriate. Anyone age 13 or older must carry photo ID, to be displayed “when required and/or requested.”

Once at the memorial you must go through a metal detector and your belongings must be X-rayed. Officers will inspect your ticket—that invaluable document you nearly left on your printer—at least five times. One will draw a blue line on it; 40 yards (and around a dozen security cameras) later, another officer will shout at you if your ticket and its blue line are not visible.

I’m one of the people commenting on whether this all makes sense.

I especially appreciated the last paragraph:

The Sept. 11 memorial’s designers hoped the plaza would be “a living part” of the city—integrated into its fabric and usable “on a daily basis.” I thought that sounded nice, so I asked Schneier one last question. Let’s say we dismantled all the security and let the Sept. 11 memorial be a memorial like any other: a place where citizens and travelers could visit spontaneously, on their own contemplative terms, day or night, subject only to capacity limits until the site is complete. What single measure would most guarantee their safety? I was thinking about cameras and a high-tech control center, “flower pot”-style vehicle barriers, maybe even snipers poised on nearby roofs. Schneier’s answer? Seat belts. On the drive to New York, or in your taxi downtown, buckle up, he warned. It’s dangerous out there.

http://www.slate.com/articles/life/culturebox/2012/…


News

This is an analysis of Apple’s disk encryption program, FileVault 2, that first appeared in the Lion operating system. Short summary: they couldn’t break it. (Presumably the version in Mountain Lion isn’t any different.)
http://www.lightbluetouchpaper.org/2012/08/06/…
In the short story “A Wayside Comedy,” published in 1888 in “Under the Deodars,” Rudyard Kipling wrote: “You must remember, though you will not understand, that all laws weaken in a small and hidden community where there is no public opinion. When a man is absolutely alone in a Station he runs a certain risk of falling into evil ways. This risk is multiplied by every addition to the population up to twelve—the Jury number. After that, fear and consequent restraint begin, and human action becomes less grotesquely jerky.” Interesting commentary on how reputational pressure scales. If I had found this quote last year, I would have included it in my book.
http://www.gutenberg.org/ebooks/2828

$200 for a fake security system. “Moving red laser beams scare away potential intruders. Laser beams move along floor and wall 180 degrees. Easy to install, 110v comes on automatically w/timer.” Watch the video. This is not an alarm, and it doesn’t do anything other than the laser light show. But, as the product advertisement says, “perception can be an excellent deterrent to crime.” Although this only works if the product isn’t very successful—or widely known.
http://www.amazon.com/…
http://www.youtube.com/watch?v=GnH95uzQPOo

This is an extraordinary (and gut-wrenching) first-person account of what it’s like to staff an Israeli security checkpoint. It shows how power corrupts: how it’s impossible to make humane decisions in such a circumstance.
http://www.bostonreview.net/BR37.4/…
A new technology uses the radiation given off by wi-fi devices to sense the positions of people through a one-foot-thick brick wall.
http://m.phys.org/news/…
Kaspersky is looking for help decrypting the Gauss payload.
http://www.infosecurity-magazine.com/view/27590/…
http://www.esecurityplanet.com/malware/…
https://www.securelist.com/en//208193781/…
http://www.theregister.co.uk/2012/08/14/…
http://www.pcmag.com/article2/0,2817,2408459,00.asp

Video filter that detects a pulse.
http://people.csail.mit.edu/mrub/vidmag/
How long before someone claims he can use this technology to detect nervous people in airports?

Finally, someone takes a look at the $1 trillion number government officials are quoting as the cost of cybercrime. While it’s a good figure to scare people, it doesn’t have much of a basis in reality.
https://www.propublica.org/article/…
Older research debunking cybercrime surveys:
http://research.microsoft.com/apps/pubs/…

Fear and how it scales.
https://www.schneier.com/blog/archives/2012/08/…

A surprisingly sensible list of Internet safety talking points for schools.
https://www.schneier.com/blog/archives/2012/08/…

Fear and imagination—an interesting anecdote from World War II.
http://redteamjournal.com/2012/08/…
A reader sent me this photo of a shared lock. It’s at the gate of a large ranch outside of Victoria, Texas. Multiple padlocks secure the device, but when a single padlock is removed, the center pin can be fully lifted and the gate can be opened. The point is to allow multiple entities (oil and gas, hunting parties, ranch supervisors, etc.) access without the issues of key distribution that would arise if it were just a single lock. On the other hand, the gate is only as secure as the weakest padlock.
http://www.flickr.com/photos/86078043@N08/7880120310/
A less elegant way to do the same thing:
http://www.flickr.com/photos/8791591@N03/2133541070/
A slightly different implementation of the same idea: removal of any one lock allows the locking bar to retract from the pole and the gate to open.
http://www.mountain-bike-diaries.com/image-files/…
And an interesting comment from someone who deals with this in his work:
https://www.schneier.com/blog/archives/2012/08/…
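
The ranch-gate arrangement above is, in effect, an OR of padlocks. Here is a minimal sketch of that point in Python; the party names and strength scores are invented for illustration, and this models no real lock.

    # Hypothetical model of the shared ranch gate described above: each
    # party padlocks the same center pin, removing any one padlock opens
    # the gate, and an attacker only needs to defeat the easiest padlock.

    padlocks = {            # illustrative strength scores, not real data
        "oil_and_gas": 9,
        "hunting_party": 3,
        "ranch_manager": 8,
    }

    def can_open(gate, keys_held):
        """The gate opens if the visitor can remove any single padlock."""
        return any(name in keys_held for name in gate)

    def effective_strength(gate):
        """The whole gate is only as strong as its weakest padlock."""
        return min(gate.values())

    print(can_open(padlocks, {"hunting_party"}))   # True: one key suffices
    print(effective_strength(padlocks))            # 3: weakest-link security

The convenience comes from the any() (no key-distribution problem); the weakness comes from the min(): an attacker simply picks the easiest lock.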

Research paper on the psychological effects of terrorism.
http://onlinelibrary.wiley.com/doi/10.1111/…
Yet another biometric: eye twitch patterns:
http://www.sciencedaily.com/releases/2012/08/…
Probably harder to fool than iris scanners.

In this fascinating piece of research, the question is asked: can we surreptitiously collect secret information from the brains of people using brain-computer interface devices?
https://www.usenix.org/conference/usenixsecurity12/…
http://www.wired.com/threatlevel/2012/08/…
http://hardware.slashdot.org/story/12/08/17/1328241/…
http://www.forbes.com/sites/andygreenberg/2012/08/…
Here’s an article on the pros and cons of this sort of research.
http://www.cognitiveliberty.org/issues/…

Database of 12 million Apple UDIDs leaked. It wasn’t the FBI or Apple—it was a company called BlueToad.
https://www.schneier.com/blog/archives/2012/09/…

Truly bizarre story of someone who seems to have figured out how to successfully cheat at marathons. The evidence of his cheating is overwhelming, but no one knows how he does it.
http://www.newyorker.com/reporting/2012/08/06/…
For anyone who finds the article interesting, the blog discussion is worth reading.
https://www.schneier.com/blog/archives/2012/09/…

Larry Constantine disputes David Sanger’s book about Stuxnet.
http://spectrum.ieee.org/podcast/computing/…
Comment from Larry Constantine on the blog post.
https://www.schneier.com/blog/archives/2012/09/…
New attack against chip-and-PIN systems.
https://www.schneier.com/blog/archives/2012/09/…

A real movie-plot threat contest—the “Australia’s Security Nightmares: The National Security Short Story Competition” is part of Safeguarding Australia 2012.
https://www.schneier.com/blog/archives/2012/09/…

Nice essay on the futility of trying to prevent another 9/11.
http://www.foreignpolicy.com/articles/2012/09/10/…

And on a related topic, an essay and commentary on overhyping the threat of terrorism at the London Olympics.
http://walt.foreignpolicy.com/posts/2012/08/13/…
http://www.salon.com/2012/08/15/…

Steganographic information is embedded in World of Warcraft screen shots.
http://www.ownedcore.com/forums/world-of-warcraft/…
Estimating the probability of another 9/11. This statistical research says once per decade.
http://arxiv.org/abs/1209.0089
http://www.wired.com/wiredscience/2012/09/…
Good article on the hacker group UGNazi.
http://www.wired.com/gadgetlab/2012/09/…
Details of a man-in-the-middle bank fraud attack:
http://www.trusteer.com//…
Note that the attack relies on tricking the user, which isn’t very hard. This sort of attack will become more common as banks require two-factor authentication.
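
To make that point concrete, here is a hedged sketch in Python of this general class of fraud. It is a generic “trick the user into authorizing the attacker’s transaction” simulation, not the specific attack documented in the report above; all names and amounts are invented.

    # Illustrative simulation only: a one-time code doesn't help when the
    # victim is persuaded to authorize the attacker's transaction and
    # types the valid code in themselves.

    import secrets

    class Bank:
        def start_transfer(self, dest, amount):
            """Issue a one-time code (sent to the customer's phone) that
            authorizes exactly this transfer."""
            self.dest, self.amount = dest, amount
            self.otp = secrets.token_hex(3)
            return self.otp   # stands in for the SMS to the victim's phone

        def confirm(self, otp):
            if otp == self.otp:
                return f"transferred {self.amount} to {self.dest}"
            return "rejected"

    def victim_approves(cover_story, otp_on_phone):
        # Malware in the browser shows a plausible story, so the victim
        # willingly types the code from their phone.
        print("victim sees:", cover_story)
        return otp_on_phone

    bank = Bank()
    # The compromised session initiates a transfer to the attacker...
    otp = bank.start_transfer(dest="attacker-account", amount=5000)
    # ...and the victim supplies the genuine second factor for it.
    typed = victim_approves("A refund was sent to you in error; "
                            "please confirm returning it.", otp)
    print(bank.confirm(typed))   # the two-factor check passes

The second factor authenticates the transaction, but it cannot tell whether the person typing it understands what they are authorizing.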


Poll: Americans Like the TSA

Gallup has the results:

Despite recent negative press, a majority of Americans, 54%, think the U.S. Transportation Security Administration is doing either an excellent or a good job of handling security screening at airports. At the same time, 41% think TSA screening procedures are extremely or very effective at preventing acts of terrorism on U.S. airplanes, with most of the rest saying they are somewhat effective.

My first reaction was that people who don’t fly—and don’t interact with the TSA—are more likely to believe it is doing a good job. That’s not true.

Just over half of Americans report having flown at least once in the past year. These fliers have a slightly better opinion of the job TSA is doing than those who haven’t flown. Fifty-seven percent of those who have flown at least once and 57% of the smaller group who have flown at least three times have an excellent or good opinion of the TSA’s job performance. That compares with 52% of those who have not flown in the past year.

There is little difference in opinions about the effectiveness of TSA’s screening procedures by flying status; between 40% and 42% of non-fliers, as well as of those who have flown at least once and those who have flown at least three times, believe the procedures are at least very effective.

Also:

Younger Americans have significantly more positive opinions of the TSA than those who are older. These differences may partly reflect substantial differences in flying frequency, with 60% of 18- to 29-year-olds reporting having flown within the last year, compared with 33% of those 65 years and older.

Anyone want to try to explain these numbers?

http://www.gallup.com/poll/156491/…


Five “Neglects” in Risk Management

Good list, summarized on hlswatch.com:

1. Probability neglect—people sometimes don’t consider the probability of the occurrence of an outcome, but focus on the consequences only.

2. Consequence neglect—just like probability neglect, sometimes individuals neglect the magnitude of outcomes.

3. Statistical neglect—instead of subjectively assessing small probabilities and continuously updating them, people choose to use rules-of-thumb (if any heuristics), which can introduce systematic biases in their decisions.

4. Solution neglect—choosing an optimal solution is not possible when one fails to consider all of the solutions.

5. External risk neglect—in making decisions, individuals or groups often consider the cost/benefits of decisions only for themselves, without including externalities, sometimes leading to significant negative outcomes for others.

http://www.sra.org/docs_reg_risk_pres/…
http://www.hlswatch.com/2012/08/16/…
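
As a made-up numerical illustration of the first two items: ranking two hypothetical threats by consequence alone gives the opposite answer from ranking them by expected loss (probability times consequence). The probabilities and dollar figures in this Python sketch are invented for the example.

    # Toy numbers only: how probability neglect changes which risk
    # looks "bigger".

    threats = {
        # name: (annual probability, loss in dollars if it happens)
        "spectacular_attack": (0.0001, 1_000_000_000),
        "routine_fraud":      (0.9,    500_000),
    }

    def expected_loss(p, loss):
        return p * loss

    for name, (p, loss) in threats.items():
        print(f"{name:20s} p={p:<8} loss=${loss:>13,} "
              f"expected=${expected_loss(p, loss):,.0f}")

    # Judged by consequence alone (probability neglect), the spectacular
    # attack dominates; judged by expected annual loss, routine fraud
    # does ($450,000 versus $100,000 with these made-up numbers).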


Schneier News

I’m speaking at the Central European Conference on Information and Intelligent Systems in Zagreb on 19 Sep.
http://ceciis.foi.hr/

I’m speaking at RSA Europe in London on 10 Oct.
http://www.rsaconference.com/events/2012/europe/…

“Liars and Outliers” (along with two other books: Kip Hawley’s memoir of his time at the TSA and “Against Security,” by Harvey Molotch) has been reviewed in the “Wall Street Journal.”
http://online.wsj.com/article/…

I did a Q&A about “Liars and Outliers” on The Well:
http://bit.ly/schneier-well

Two of my books can be seen in the background in CBS’ new Sherlock Holmes drama, “Elementary.” A copy of “Schneier on Security” is prominently displayed on Sherlock Holmes’ bookshelf. You can see it in the first few minutes of the pilot episode. The show’s producers contacted me early on to ask permission to use my books, so it didn’t come as a surprise, but it’s still a bit of a thrill. Here’s a listing of all the books visible on the bookshelf.
http://www.goodreads.com/list/show/…


Is iPhone Security Really this Good?

Simson Garfinkel writes that the iPhone has such good security that the police can’t use it for forensics anymore:

Technologies the company has adopted protect Apple customers’ content so well that in many situations it’s impossible for law enforcement to perform forensic examinations of devices seized from criminals. Most significant is the increasing use of encryption, which is beginning to cause problems for law enforcement agencies when they encounter systems with encrypted drives.

“I can tell you from the Department of Justice perspective, if that drive is encrypted, you’re done,” Ovie Carroll, director of the cyber-crime lab at the Computer Crime and Intellectual Property Section in the Department of Justice, said during his keynote address at the DFRWS computer forensics conference in Washington, D.C., last Monday. “When conducting criminal investigations, if you pull the power on a drive that is whole-disk encrypted you have lost any chance of recovering that data.”

Yes, I believe that full-disk encryption—whether Apple’s FileVault or Microsoft’s BitLocker (I don’t know what the iOS system is called)—is good; but its security is only as good as the user is at choosing a good password.

The iPhone always supported a PIN lock, but the PIN wasn’t a deterrent to a serious attacker until the iPhone 3GS. Because those early phones didn’t use their hardware to perform encryption, a skilled investigator could hack into the phone, dump its flash memory, and directly access the phone’s address book, e-mail messages, and other information. But now, with Apple’s more sophisticated approach to encryption, investigators who want to examine data on a phone have to try every possible PIN. Examiners perform these so-called brute-force attacks with special software, because the iPhone can be programmed to wipe itself if the wrong PIN is provided more than 10 times in a row. This software must be run on the iPhone itself, limiting the guessing speed to 80 milliseconds per PIN. Trying all four-digit PINs therefore requires no more than 800 seconds, a little more than 13 minutes. However, if the user chooses a six-digit PIN, the maximum time required would be 22 hours; a nine-digit PIN would require 2.5 years, and a 10-digit PIN would take 25 years. That’s good enough for most corporate secrets—and probably good enough for most criminals as well.
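
The arithmetic in that last passage is easy to check. Here is a minimal sketch in Python using the 80-millisecond-per-guess figure quoted above; it is just the worst-case math, not a claim about Apple’s actual implementation.

    # Worst-case time to exhaust all numeric PINs of a given length at
    # the quoted rate of one guess every 80 milliseconds.

    GUESS_TIME_S = 0.080   # seconds per attempt, from the passage above

    def worst_case_seconds(digits):
        return (10 ** digits) * GUESS_TIME_S

    for digits in (4, 6, 9, 10):
        s = worst_case_seconds(digits)
        print(f"{digits:2d}-digit PIN: {s:>13,.0f} s "
              f"(~{s / 3600:,.1f} hours, ~{s / (3600 * 24 * 365):,.1f} years)")

    # 4 digits  ->         800 s  (about 13 minutes)
    # 6 digits  ->      80,000 s  (about 22 hours)
    # 9 digits  ->  80,000,000 s  (about 2.5 years)
    # 10 digits -> 800,000,000 s  (about 25 years)

On average an attacker needs about half of each of these times, which doesn’t change the conclusion.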

Leaving aside the user practice questions—my guess is that very few users, even those with something to hide, use a ten-digit PIN—could this possibly be true? In the introduction to “Applied Cryptography,” almost 20 years ago, I wrote: “There are two kinds of cryptography in this world: cryptography that will stop your kid sister from reading your files, and cryptography that will stop major governments from reading your files.”

Since then, I’ve learned two things: 1) there are a lot of gradients to kid sister cryptography, and 2) major government cryptography is very hard to get right. It’s not the cryptography; it’s everything around the cryptography. I said as much in the preface to “Secrets and Lies” in 2000:

Cryptography is a branch of mathematics. And like all mathematics, it involves numbers, equations, and logic. Security, palpable security that you or I might find useful in our lives, involves people: things people know, relationships between people, people and how they relate to machines. Digital security involves computers: complex, unstable, buggy computers.

Mathematics is perfect; reality is subjective. Mathematics is defined; computers are ornery. Mathematics is logical; people are erratic, capricious, and barely comprehensible.

If, in fact, we’ve finally achieved something resembling this level of security for our computers and handheld computing devices, this is something to celebrate.

But I’m skeptical.

http://www.technologyreview.com/news/428477/…
http://m.phys.org/news/…
http://apple.slashdot.org/story/12/08/13/165244/…
More analysis:
http://sit.sit.fraunhofer.de/studies/en/…
http://sit.sit.fraunhofer.de/studies/en/…

Elcomsoft can crack iPhones:
http://www.elcomsoft.com/eift.html
http://.crackpassword.com/2011/05/…


Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers “Liars and Outliers,” “Beyond Fear,” “Secrets and Lies,” and “Applied Cryptography,” and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.

Copyright (c) 2012 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.