Crypto-Gram

May 15, 2010

by Bruce Schneier
Chief Security Technology Officer, BT
schneier@schneier.com
http://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.

You can read this issue on the web at <http://www.schneier.com/crypto-gram-1005.html>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively comment section. An RSS feed is available.


In this issue:
      Worst-Case Thinking
      Why Aren’t There More Terrorist Attacks?
      9/11 Made Us Safer?
      News
      Fifth Annual Movie-Plot Threat Contest Semi-Finalists
      Young People, Privacy, and the Internet
      The Doghouse: Lock My PC
      “If You See Something, Say Something”
      Schneier News
      Preventing Terrorist Attacks in Crowded Areas
      Punishing Security Breaches

Worst-Case Thinking

At a security conference recently, the moderator asked the panel of distinguished cybersecurity leaders what their nightmare scenario was. The answers were the predictable array of large-scale attacks: against our communications infrastructure, against the power grid, against the financial system, in combination with a physical attack.

I didn’t get to give my answer until the afternoon, which was: “My nightmare scenario is that people keep talking about their nightmare scenarios.”

There’s a certain blindness that comes from worst-case thinking. An extension of the precautionary principle, it involves imagining the worst possible outcome and then acting as if it were a certainty. It substitutes imagination for thinking, speculation for risk analysis and fear for reason. It fosters powerlessness and vulnerability and magnifies social paralysis. And it makes us more vulnerable to the effects of terrorism.

Worst-case thinking means generally bad decision making for several reasons. First, it’s only half of the cost-benefit equation. Every decision has costs and benefits, risks and rewards. By speculating about what can possibly go wrong, and then acting as if that is likely to happen, worst-case thinking focuses only on the extreme but improbable risks and does a poor job at assessing outcomes.
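To make this concrete, here is a toy expected-value calculation (all numbers invented for illustration) showing how treating the worst case as a certainty distorts a cost-benefit decision:

    # Toy numbers only: how worst-case thinking distorts cost-benefit analysis.
    def expected_loss(probability, cost):
        """Expected loss of an outcome: probability times cost."""
        return probability * cost

    mitigation_cost = 1_000_000    # hypothetical annual cost of a countermeasure
    worst_case_cost = 100_000_000  # hypothetical damage if the nightmare occurs
    realistic_probability = 1e-6   # hypothetical yearly chance of that event

    # Risk analysis weighs expected loss against mitigation cost.
    print(expected_loss(realistic_probability, worst_case_cost))  # 100.0
    # An expected loss of $100 doesn't justify a $1,000,000 countermeasure.

    # Worst-case thinking acts as if the probability were 1.
    print(expected_loss(1.0, worst_case_cost))  # 100000000.0
    # Now any countermeasure looks justified, whatever it costs.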

Second, it’s based on flawed logic. It begs the question by assuming that a proponent of an action must prove that the nightmare scenario is impossible.

Third, it can be used to support any position or its opposite. If we build a nuclear power plant, it could melt down. If we don’t build it, we will run short of power and society will collapse into anarchy. If we allow flights near Iceland’s volcanic ash, planes will crash and people will die. If we don’t, organs won’t arrive in time for transplant operations and people will die. If we don’t invade Iraq, Saddam Hussein might use the nuclear weapons he might have. If we do, we might destabilize the Middle East, leading to widespread violence and death.

Of course, not all fears are equal. Those that we tend to exaggerate are more easily justified by worst-case thinking. So terrorism fears trump privacy fears, and almost everything else; technology is hard to understand and therefore scary; nuclear weapons are worse than conventional weapons; our children need to be protected at all costs; and annihilating the planet is bad. Basically, any fear that would make a good movie plot is amenable to worst-case thinking.

Fourth and finally, worst-case thinking validates ignorance. Instead of focusing on what we know, it focuses on what we don’t know—and what we can imagine.

Remember Defense Secretary Donald Rumsfeld’s quote? “Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know.” And this: “the absence of evidence is not evidence of absence.” Ignorance isn’t a cause for doubt; when you can fill that ignorance with imagination, it can be a call to action.

Even worse, it can lead to hasty and dangerous acts. You can’t wait for a smoking gun, so you act as if the gun is about to go off. Rather than making us safer, worst-case thinking has the potential to cause dangerous escalation.

The new undercurrent in this is that our society no longer has the ability to calculate probabilities. Risk assessment is devalued. Probabilistic thinking is repudiated in favor of “possibilistic thinking”: Since we can’t know what’s likely to go wrong, let’s speculate about what can possibly go wrong.

Worst-case thinking leads to bad decisions, bad systems design, and bad security. And we all have direct experience with its effects: airline security and the TSA, which we make fun of when we’re not appalled that they’re harassing 93-year-old women or keeping first graders off airplanes. You can’t be too careful!

Actually, you can. You can refuse to fly because of the possibility of plane crashes. You can lock your children in the house because of the possibility of child predators. You can eschew all contact with people because of the possibility of hurt. Stephen Hawking wants to avoid trying to communicate with aliens because they might be hostile; does he want to turn off all the planet’s television broadcasts because they’re radiating into space? It isn’t hard to parody worst-case thinking, and at its extreme it’s a psychological condition.

Frank Furedi, a sociology professor at the University of Kent, writes: “Worst-case thinking encourages society to adopt fear as one of the dominant principles around which the public, the government and institutions should organize their life. It institutionalizes insecurity and fosters a mood of confusion and powerlessness. Through popularizing the belief that worst cases are normal, it incites people to feel defenseless and vulnerable to a wide range of future threats.”

Even worse, it plays directly into the hands of terrorists, creating a population that is easily terrorized—even by failed terrorist attacks like the Christmas Day underwear bomber and the Times Square SUV bomber.

When someone is proposing a change, the onus should be on them to justify it over the status quo. But worst-case thinking is a way of looking at the world that exaggerates the rare and unusual and gives it far more credence than it deserves.

It isn’t really a principle; it’s a cheap trick to justify what you already believe. It lets lazy or biased people make what seem to be cogent arguments without understanding the whole issue. And when people don’t need to refute counterarguments, there’s no point in listening to them.

This essay was originally published on CNN.com, although they stripped out all the links.
http://www.cnn.com/2010/OPINION/05/12/…

Security conference:
http://www.ewi.info/dallas

Precautionary principle:
http://en.wikipedia.org/wiki/Precautionary_principle

Iceland volcano affects organ donations:
http://www.cbsnews.com/…

Areas where we tend to overestimate the threat:
http://www.schneier.com/essay-170.html

New particle accelerator may annihilate the planet:
http://news.cnet.com/8301-10784_3-9905448-7.html

Movie plot threats:
http://www.schneier.com/essay-087.html

Rumsfeld quote:
http://www.defenselink.mil/Transcripts/…

Possibilistic thinking:
http://www.press.uchicago.edu/Misc/Chicago/…

Making fun of the TSA:
http://www.theatlantic.com/politics/archive/2010/05/…

The TSA harasses a 93-year-old woman:
http://www.youtube.com/watch?v=wHxy5GattLY

The TSA keeps a first-grader off airplanes:
http://www.bostonherald.com/news/regional/view/…

Stephen Hawking on communicating with aliens:
http://www.msnbc.msn.com/id/36769422/

Frank Furedi on worst-case thinking:
http://www.frankfuredi.com/index.php/site/article/326/
http://www.frankfuredi.com/index.php/site/article/386/

How we are easily terrorized:
http://www.schneier.com/essay-124.html

Christmas Day underwear bomber:
http://www.schneier.com/essay-304.html

Times Square SUV bomber:
http://www.schneier.com/essay-315.html


Why Aren’t There More Terrorist Attacks?

As the details of the Times Square car bomb attempt emerge in the wake of Faisal Shahzad’s arrest Monday night, one thing has already been made clear: Terrorism is fairly easy. All you need is a gun or a bomb, and a crowded target. Guns are easy to buy. Bombs are easy to make. Crowded targets—not only in New York, but all over the country—are easy to come by. If you’re willing to die in the aftermath of your attack, you could launch a pretty effective terrorist attack with a few days of planning, maybe less.

But if it’s so easy, why aren’t there more terrorist attacks like the failed car bomb in New York’s Times Square? Or the terrorist shootings in Mumbai? Or the Moscow subway bombings? After the enormous horror and tragedy of 9/11, why have the past eight years been so safe in the U.S.?

There are actually several answers to this question. One, terrorist attacks are harder to pull off than popular imagination—and the movies—lead everyone to believe. Two, there are far fewer terrorists than the political rhetoric of the past eight years leads everyone to believe. And three, random minor terrorist attacks don’t serve Islamic terrorists’ interests right now.

Terrorism sounds easy, but the actual attack is the easiest part.

Putting together the people, the plot and the materials is hard. It’s hard to sneak terrorists into the U.S. It’s hard to grow your own inside the U.S. It’s hard to operate; the general population, even the Muslim population, is against you.

Movies and television make terrorist plots look easier than they are. It’s hard to hold conspiracies together. It’s easy to make a mistake. Even 9/11, which was planned before the climate of fear that event engendered, just barely succeeded. Today, it’s much harder to pull something like that off without slipping up and getting arrested.

But even more important than the difficulty of executing a terrorist attack, there aren’t a lot of terrorists out there. Al-Qaeda isn’t a well-organized global organization with movie-plot-villain capabilities; it’s a loose collection of people using the same name. Despite the post-9/11 rhetoric, there isn’t a terrorist cell in every major city. If you think about the major terrorist plots we’ve foiled in the U.S.—the JFK bombers, the Fort Dix plotters—they were mostly amateur terrorist wannabes with no connection to any sort of al-Qaeda central command, and mostly no ability to effectively carry out the attacks they planned.

The successful terrorist attacks—the Fort Hood shooter, the guy who flew his plane into the Austin IRS office, the anthrax mailer—were largely nut cases operating alone. Even the unsuccessful shoe bomber, and the equally unsuccessful Christmas Day underwear bomber, had minimal organized help—and that help originated outside the U.S.

Terrorism doesn’t occur without terrorists, and they are far rarer than popular opinion would have it.

Lastly, and perhaps most subtly, there’s not a lot of value in unspectacular terrorism anymore.

If you think about it, terrorism is essentially a PR stunt. The death of innocents and the destruction of property aren’t the goal of terrorism; they’re just the tactics used. And acts of terrorism are intended for two audiences: for the victims, who are supposed to be terrorized as a result, and for the allies and potential allies of the terrorists, who are supposed to give them more funding and generally support their efforts.

An act of terrorism that doesn’t instill terror in the target population is a failure, even if people die. And an act of terrorism that doesn’t impress the terrorists’ allies is not very effective, either.

Fortunately for us and unfortunately for the terrorists, 9/11 upped the stakes. It’s no longer enough to blow up something like the Oklahoma City Federal Building. Terrorists need to blow up airplanes or the Brooklyn Bridge or the Sears Tower or JFK airport—something big to impress the folks back home. Small no-name targets just don’t cut it anymore.

Note that this is very different than terrorism by an occupied population: the IRA in Northern Ireland, Iraqis in Iraq, Palestinians in Israel. Setting aside the actual politics, all of these terrorists believe they are repelling foreign invaders. That’s not the situation here in the U.S.

So, to sum up: If you’re just a loner wannabe who wants to go out with a bang, terrorism is easy. You’re more likely to get caught if you take a long time to plan or involve a bunch of people, but you might succeed. If you’re a representative of al-Qaeda trying to make a statement in the U.S., it’s much harder. You just don’t have the people, and you’re probably going to slip up and get caught.

This essay originally appeared on AOL News.
http://www.aolnews.com/opinion/article/…

Amateur terrorist wannabes:
http://www.schneier.com/essay-174.html

Instilling terror:
http://www.schneier.com/essay-124.html

A similar sentiment about the economic motivations of terrorists:
http://www.theatlantic.com/business/archive/2010/05/…


9/11 Made Us Safer?

There’s an essay on the Computerworld website that claims I implied, and believe, that 9/11 made us safer: “OK, so strictly-speaking, he doesn’t use those exact words, but the implication is certainly clear. In a discussion about why there aren’t more terrorist attacks, he argues that ‘minor’ terrorist plots like the Times Square car bomb are counter-productive for terrorist groups, because ‘9/11 upped the stakes.'”

This comes from the above essay that discusses why there have been so few terrorist attacks since 9/11. There’s the primary reason—there aren’t very many terrorists out there—and the secondary reason: terrorist attacks are harder to pull off than popular culture leads people to believe. What the quoted passage refers to is the tertiary reason: terrorist attacks have a secondary purpose of impressing supporters back home, and 9/11 has upped the stakes in what a flashy terrorist attack is supposed to look like.

From there to 9/11 making us safer is quite a leap, and not one that I expected anyone to make. Certainly a series of events, before, during, and after 9/11, contributed to an environment in which a particular group of terrorists found low-budget terrorist attacks less useful—and I suppose by extension we might be safer because of it. But you’d also have to factor in the risks associated with increased police powers, the NSA spying on all of us without warrants, and the increased disregard for the law we’ve seen out of the U.S. government since 9/11. And even so, that’s a far cry from the causal claim that 9/11 made us safer.

Not that any of this really matters. Compared to the real risks in the world, the risk of terrorism is so small that it’s not worth a lot of worry. As John Mueller pointed out, the risks of terrorism “are similar to the risks of using home appliances (200 deaths per year in the United States) or of commercial aviation (103 deaths per year).”
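Those figures are easy to turn into per-person odds. A quick sketch, assuming a U.S. population of roughly 300 million (the population figure is my assumption, not Mueller’s):

    # Per-person odds from the death counts quoted above.
    US_POPULATION = 300_000_000  # assumed, circa 2010

    for cause, deaths_per_year in [("home appliances", 200),
                                   ("commercial aviation", 103)]:
        odds = US_POPULATION / deaths_per_year
        print(f"{cause}: about 1 in {odds:,.0f} per year")

    # home appliances: about 1 in 1,500,000 per year
    # commercial aviation: about 1 in 2,912,621 per year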

http://s.computerworld.com/16053/…

John Mueller on the risks of terrorism:
https://www.schneier.com/blog/archives/2010/04/…

A response from Computerworld:
http://s.computerworld.com/16079/…


News

Last month I was in New York, and saw posters on the subways warning people about real guns painted to look like toys. Searching, I found pictures from the Baltimore police department and a 2006 article from New York. The guns are painted bright colors to look cool—not really to fool policemen—but I had no idea this was a thing.
http://publicintelligence.net/…
http://abcnews.go.com/US/story?id=2045782

CCTV cameras in Moscow have been accused of streaming prerecorded video instead of live images. What I can’t figure out is why: to me, it seems easier for the cameras to stream live video than prerecorded images. But it seems they were not connected at all.
http://www.theregister.co.uk/2010/01/15/…
http://rt.com/Top_News/2010-01-13/…

In 2006, writing about future threats to privacy, I described a life recorder: “A ‘life recorder’ you can wear on your lapel that constantly records is still a few generations off: 200 gigabytes/year for audio and 700 gigabytes/year for video. It’ll be sold as a security device, so that no one can attack you without being recorded.” I can’t find a quote right now, but in talks I would say that this kind of technology would first be used by groups of people with diminished rights: children, soldiers, prisoners, and the non-lucid elderly. Now it’s been proposed. Just one sentence on the security and privacy issues: “Indeed, privacy concerns need to be addressed so that stalkers and predators couldn’t compromise the device.” Indeed.
http://www.darkreading.com/blog/archives/2010/03/… or http://tinyurl.com/y29q5kx
http://www.schneier.com/essay-109.html
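
As a sanity check on the storage estimates quoted above, here is the arithmetic converting gigabytes per year back into continuous bit rates; the 2006 numbers correspond to roughly voice-quality audio and very low-bit-rate video:

    # Converting storage per year back into a continuous bit rate.
    SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000

    def gb_per_year_to_kbps(gb_per_year):
        bytes_per_second = gb_per_year * 1e9 / SECONDS_PER_YEAR
        return bytes_per_second * 8 / 1000  # kilobits per second

    print(round(gb_per_year_to_kbps(200)))  # ~51 kbps: voice-quality audio
    print(round(gb_per_year_to_kbps(700)))  # ~178 kbps: very low-rate video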

Lt. Gen. Alexander and the U.S. Cyber Command
https://www.schneier.com/blog/archives/2010/04/…

Research on the effectiveness of political assassinations.
https://www.schneier.com/blog/archives/2010/04/…

Remember SmartWater: liquid imbued with a uniquely identifiable DNA-style code? Well, Mont Blanc is selling a pen with uniquely identifiable ink.
https://www.schneier.com/blog/archives/2010/04/…

Security Fog: an odd burglary prevention tool.
https://www.schneier.com/blog/archives/2010/04/…

Just published by NIST: Special Publication (SP) 800-122, “Guide to Protecting the Confidentiality of Personally Identifiable Information (PII).” It’s 60 pages long; I haven’t read it.
http://csrc.nist.gov/publications/nistpubs/800-122/…

Booby-trapping a PDF file:
http://www.theregister.co.uk/2010/03/31/pdf_insecurity/
http://www.sophos.com/s/sophoslabs/?p=9413

A security cartoon.
http://www.gocomics.com/chanlowe/2010/04/13/

Another security cartoon.
http://images.ucomics.com/comics/wpswi/2010/…

Nasty scam, where the user is pressured into accepting a “pre-trial settlement” for ICPP copyright violations. The level of detail is impressive.
http://www.f-secure.com/weblog/archives/00001931.html

The New York Police Department removed all bicycles from President Obama’s route, based on the fear that they might contain pipe bombs.
https://www.schneier.com/blog/archives/2010/04/…

This blog entry about an attack against apache.org should serve as a model for open and transparent security self-reporting. I’m impressed.
https://s.apache.org/infra/entry/…
More news reports:
http://www.theregister.co.uk/2010/04/13/…
http://www.computerworld.com/s/article/9175459/…
http://www.itpro.co.uk/622363/…

Seat belt use and lessons for security awareness:
http://www.honeytech.com/blog/ticket-or-click-it/

Hiding your valuables in common household containers is an old trick. Here are some can safes you can buy. They’re relatively inexpensive, although it’s cheaper to make your own.
http://www.buyasafe.com/Can-safes-s/12.htm

The U.S. is developing a hypersonic cruise missile capable of striking anywhere on the planet within an hour. The article talks about the possibility of modifying Trident missiles—problematic because they would be indistinguishable from nuclear weapons—and using the Mach 5-capable X-51 hypersonic cruise missile. Interesting technology, but we really need to think through the political ramifications of this sort of thing better.
http://www.popularmechanics.com/technology/military/…
Report on the policy implications:
http://www.fas.org/sgp/crs/nuke/RL33067.pdf

Fun with secret questions. (Be sure to read the blog comments, too.)
https://www.schneier.com/blog/archives/2010/04/…

Homeopathic bomb: this is funny.
http://www.newsbiscuit.com/2010/04/20/…

A security analysis of India’s electronic voting machines. No surprise; they’re vulnerable to fraud.
http://indiaevm.org/

Good quote from Malcolm Gladwell on spies: “The proper function of spies is to remind those who rely on spies that the kinds of thing found out by spies can’t be trusted.” The article is about the British Operation Mincemeat in World War II.
http://www.newyorker.com/arts/critics/atlarge/2010/…

Nobody encrypts phone calls.
http://s.forbes.com/firewall/2010/04/30/…

WiFi cracking kits are being sold in China.
http://www.networkworld.com/news/2010/…

Cory Doctorow gets phished.
http://www.locusmag.com/Perspectives/2010/05/…

SnapScouts: a parody.
http://www.snapscouts.org/

Reflections of a former U-2 pilot.
http://www.nytimes.com/2010/05/07/opinion/…

Biometric wallet: cool idea, or dumb idea?
http://geekdoctor.blogspot.com/2010/05/…

There’s a new Windows attack. It’s only in the lab, but nothing detects it.
http://www.zdnet.com//hardware/…


Fifth Annual Movie-Plot Threat Contest Semi-Finalists

On April 1, I announced the Fifth Annual Movie Plot Threat Contest: “Your task, ye Weavers of Tales, is to create a fable or fairytale suitable for instilling the appropriate level of fear in children so they grow up appreciating all the lords do to protect them.”

Submissions are in, and here are the semifinalists.

1. Untitled story about polar bears, by Mike Ferguson.
https://www.schneier.com/blog/archives/2010/04/…

2. “The Gashlycrumb Terrors,” by Laura.
https://www.schneier.com/blog/archives/2010/04/…

3. Untitled Little Red Riding Hood parody, by Isti.
https://www.schneier.com/blog/archives/2010/04/…

4. “The Boy who Didn’t Cry Wolf,” by yt.
https://www.schneier.com/blog/archives/2010/04/…

5. Untitled story about exploding imps, by Mister JTA.
https://www.schneier.com/blog/archives/2010/04/…

Cast your vote by number; voting closes at the end of the month.

Vote here:
https://www.schneier.com/blog/archives/2010/05/…


Young People, Privacy, and the Internet

There’s a lot out there on this topic. Last week, two new papers were published.

1. “Youth, Privacy, and Reputation” is a literature review published by Harvard’s Berkman Center. It’s long, but an excellent summary of what’s out there on the topic.

2. “How Different Are Young Adults from Older Adults When it Comes to Information Privacy Attitudes & Policy?” from the University of California, Berkeley, describes the results of a broad survey on privacy attitudes.

They’re both worth reading for anyone interested in this topic.

Youth, Privacy, and Reputation:
http://cyber.law.harvard.edu/publications/2010/…

How Different Are Young Adults from Older Adults When it Comes to Information Privacy Attitudes & Policy?:
http://ssrn.com/abstract=1589864

danah boyd on the topic:
http://www.danah.org/papers/talks/2010/SXSW2010.html
http://www.danah.org/papers/

My essay on the topic:
https://www.schneier.com/blog/archives/2010/04/…

My talk: Security, Privacy, and the Generation Gap:
https://www.schneier.com/blog/archives/2010/04/…


The Doghouse: Lock My PC

Lock My PC 4 has a master password.

In blog comments, people are reporting that the master password doesn’t work. Near as I can tell, those are all recent downloads. So either they took out the feature, or changed the password.
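
For anyone wondering why a master password belongs in the doghouse, here is a hypothetical sketch (not Lock My PC’s actual code): a constant shipped in every copy unlocks every installation, no matter what password the user chose.

    # Hypothetical illustration only.
    MASTER_PASSWORD = "letmein"  # the same constant ships in every copy

    def unlock_with_backdoor(user_password, entered):
        # Anyone who learns the constant can unlock any installation.
        return entered == user_password or entered == MASTER_PASSWORD

    def unlock_without_backdoor(user_password, entered):
        # The only secret is the one the user chose. (A real product would
        # compare salted hashes rather than plaintext passwords.)
        return entered == user_password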

https://www.schneier.com/blog/archives/2010/04/…


“If You See Something, Say Something”

That slogan is owned by New York’s Metropolitan Transportation Authority (the MTA). “Since obtaining the trademark in 2007, the authority has granted permission to use the phrase in public awareness campaigns to 54 organizations in the United States and overseas, like Amtrak, the Chicago Transit Authority, the emergency management office at Stony Brook University and three states in Australia.”

Of course, you’re only supposed to say something if you see something you think is terrorism: “Some requests have been rejected, including one from a university that wanted to use it to address a series of dormitory burglaries.”

Not that it’s very effective: “The campaign urges people to call a counter-terrorism hot line, 1-888-NYC-SAFE. Police officials said 16,191 calls were received last year, down from 27,127 in 2008.”

That’s a lot of wasted manpower, dealing with all those calls.
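
A rough sketch of that cost, assuming 15 minutes of police time per call (the per-call figure is my assumption; only the call counts come from the article):

    # Call count from the article; time per call is assumed.
    calls_2009 = 16_191
    minutes_per_call = 15

    officer_hours = calls_2009 * minutes_per_call / 60
    print(f"{officer_hours:,.0f} officer-hours")  # about 4,048 hours
    print(f"about {officer_hours / 2000:.1f} officers, full-time, for a year")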

Of course, the vendors in Times Square who saw the smoking Nissan Pathfinder two weeks ago didn’t call that number.

And, as I’ve written previously, “if you ask amateurs to act as front-line security personnel, you shouldn’t be surprised when you get amateur security.” People don’t need to be reminded to call the police; the slogan is nothing more than an invitation to report people who are different.

http://www.nytimes.com/2010/05/11/nyregion/…

My essay on amateur security:
http://www.schneier.com/essay-195.html

Nice article illustrating how ineffective the campaign is.
http://www.nytimes.com/2008/01/07/nyregion/07see.html


Schneier News

I will be speaking at the 2010 World Congress on Information Technology on May 26 in Amsterdam.
http://www.wcit2010.com/

I’m participating in a debate, “The Cyber War Threat Has Been Grossly Exaggerated,” on June 8 in Washington, DC.
http://intelligencesquaredus.org/index.php/debates/…

I will be speaking at the CCD CoE Conference on Cyber Conflict on June 18 in Tallinn, Estonia.
http://www.ccdcoe.org/conference2010/

I won a CSO Compass Award:
http://www.csoonline.com/article/593063/…

Someone named me as one of the top 10 science and technology writers of all time. Flattering though it is, I don’t think I belong in the company of Einstein, Newton, Darwin, and Asimov.
http://www.pcauthority.com.au/News/…

Mike Mimoso interviewed me at the RSA Conference last month.
http://searchsecurity.techtarget.com/video/…
http://searchsecurity.techtarget.com/video/…


Preventing Terrorist Attacks in Crowded Areas

In the wake of the failed Times Square car bombing, it’s natural to ask how we can prevent this sort of thing from happening again. The answer is to stop focusing on the specifics of what actually happened and instead think about the threat in general.

Think about the security measures commonly proposed. Cameras won’t help. They don’t prevent terrorist attacks, and their forensic value after the fact is minimal. In the Times Square case, surely there’s enough other evidence—the car’s identification number, the auto body shop the stolen license plates came from, the name of the fertilizer store—to identify the guy. We will almost certainly not need the camera footage. The images released so far, like the images in so many other terrorist attacks, may make for exciting television, but their value to law enforcement officers is limited.

Checkpoints won’t help, either. You can’t check everybody and everything. There are too many people to check, and too many train stations, buses, theaters, department stores and other places where people congregate. Patrolling guards, bomb-sniffing dogs, chemical and biological weapons detectors: they all suffer from similar problems. In general, focusing on specific tactics or defending specific targets doesn’t make sense. They’re inflexible; possibly effective if you guess the plot correctly, but completely ineffective if you don’t. At best, the countermeasures just force the terrorists to make minor changes in tactics and targets.

It’s much smarter to spend our limited counterterrorism resources on measures that don’t focus on the specific. It’s more efficient to spend money on investigating and stopping terrorist attacks before they happen, and responding effectively to any that occur. This approach works because it’s flexible and adaptive; it’s effective regardless of what the bad guys are planning for next time.

After the Christmas Day airplane bombing attempt, I was asked how we can better protect our airplanes from terrorist attacks. I pointed out that the event was a security success—the plane landed safely, nobody was hurt, a terrorist was in custody—and that the next attack would probably have nothing to do with explosive underwear. After the Moscow subway bombing, I wrote that overly specific security countermeasures like subway cameras and sensors were a waste of money.

Now we have a failed car bombing in Times Square. We can’t protect against the next imagined movie-plot threat. Isn’t it time to recognize that the bad guys are flexible and adaptive, and that we need the same quality in our countermeasures?

This essay originally appeared on the New York Times “Room for Debate” blog. I know, it’s nothing I haven’t said before.
http://roomfordebate.blogs.nytimes.com/2010/05/03/…
http://www.schneier.com/essay-309.html
http://www.schneier.com/essay-304.html
http://www.schneier.com/essay-312.html

Steven Simon likes cameras, although his arguments are more movie-plot than real.
http://roomfordebate.blogs.nytimes.com/2010/05/03/…

Michael Black, Noah Shachtman, Michael Tarr, and Jeffrey Rosen all wrote about the limitations of security cameras.
http://roomfordebate.blogs.nytimes.com/2010/05/03/…
http://roomfordebate.blogs.nytimes.com/2010/05/03/…
http://roomfordebate.blogs.nytimes.com/2010/05/03/…
http://roomfordebate.blogs.nytimes.com/2010/05/03/…

Paul Ekman wants more people.
http://roomfordebate.blogs.nytimes.com/2010/05/03/…

Richard Clarke has a nice essay about how we shouldn’t panic.
http://roomfordebate.blogs.nytimes.com/2010/05/03/…


Punishing Security Breaches

(The editor of the Freakonomics blog asked me to write about this topic. The idea was that they would get several opinions, and publish them all. They spiked the story, but I had already written my piece. So here it is.)

In deciding what to do with Gray Powell, the Apple employee who accidentally left a secret prototype 4G iPhone in a California bar, Apple needs to figure out how much of the problem is due to an employee not following the rules, and how much of the problem is due to unclear, unrealistic, or just plain bad rules.

If Powell sneaked the phone out of the Apple building in a flagrant violation of the rules—maybe he wanted to show it to a friend—he should be disciplined, perhaps even fired. Some military installations have rules like that. If someone wants to take something classified out of a top secret military compound, he might have to secrete it on his person and deliberately sneak it past a guard who searches briefcases and purses. He might be committing a crime by doing so, by the way. Apple isn’t the military, of course, but if their corporate security policy is that strict, it may very well have rules like that. And the only way to ensure rules are followed is by enforcing them, and that means severe disciplinary action against those who bypass the rules.

Even if Powell had authorization to take the phone out of Apple’s labs—presumably someone has to test drive the new toys sooner or later—the corporate rules might have required him to pay attention to it at all times. We’ve all heard of military attachés who carry briefcases chained to their wrists. It’s an extreme example, but demonstrates how a security policy can allow for objects to move around town—or around the world—without getting lost. Apple almost certainly doesn’t have a policy as rigid as that, but its policy might explicitly prohibit Powell from taking that phone into a bar, putting it down on a counter, and participating in a beer tasting. Again, if Apple’s rules and Powell’s violation were both that clear, Apple should enforce them.

On the other hand, if Apple doesn’t have clear-cut rules, if Powell wasn’t prohibited from taking the phone out of his office, if engineers routinely ignore or bypass security rules and—as long as nothing bad happens—no one complains, then Apple needs to understand that the system is more to blame than the individual. Most corporate security policies have this sort of problem. Security is important, but it’s quickly jettisoned when there’s an important job to be done. A common example is passwords: people aren’t supposed to share them, unless it’s really important and they have to. Another example is guest accounts. And doors that are supposed to remain locked but rarely are. People routinely bypass security policies if they get in the way, and if no one complains, those policies are effectively meaningless.

Apple’s unfortunately public security breach has given the company an opportunity to examine its policies and figure out how much of the problem is Powell and how much of it is the system he’s a part of. Apple needs to fix its security problem, but only after it figures out where the problem is.

http://www.telegraph.co.uk/technology/apple/7611045/…


Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers “Schneier on Security,” “Beyond Fear,” “Secrets and Lies,” and “Applied Cryptography,” and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT BCSG, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.

Copyright (c) 2010 by Bruce Schneier.

Sidebar photo of Bruce Schneier by Joe MacInnis.