CRYPTO-GRAM

June 15, 2010
by Bruce Schneier
Chief Security Technology Officer, BT
A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.
For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.
You can read this issue on the web at <http://www.schneier.com/crypto-gram-1006.html>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively comment section. An RSS feed is available.
In this issue:
- Hiring Hackers
- Scenes from an Airport
- Fifth Annual Movie-Plot Threat Contest Winner
- Outsourcing to an Indian Jail
- Schneier News
- Terrorists Placing Fake Bombs in Public Places
- Reading Me
Hiring Hackers
Any essay on hiring hackers quickly gets bogged down in definitions. What is a hacker, and how is he different from a cracker? I have my own definitions, but I’d rather define the issue more specifically: Would you hire someone convicted of a computer crime to fill a position of trust in your computer network? Or, more generally, would you hire someone convicted of a crime for a job related to that crime?
The answer, of course, is “it depends.” It depends on the specifics of the crime. It depends on the ethics involved. It depends on the recidivism rate of the type of criminal. It depends a whole lot on the individual.
Would you hire a convicted pedophile to work at a day care center? Would you hire Bernie Madoff to manage your investment fund? The answer is almost certainly no to those two — but you might hire a convicted bank robber to consult on bank security. You might hire someone who was convicted of false advertising to write ad copy for your next marketing campaign. And you might hire someone who ran a chop shop to fix your car. It depends on the person and the crime.
It can get even murkier. Would you hire a CIA-trained assassin to be a bodyguard? Would you put a general who led a successful attack in charge of defense? What if they were both convicted of crimes in whatever country they were operating in? There are different legal and ethical issues, to be sure, but in both cases the people learned a certain set of skills regarding offense that could be transferable to defense.
Which brings us back to computers. Hacking is primarily a mindset: a way of thinking about security. Its primary focus is in attacking systems, but it’s invaluable to the defense of those systems as well. Because computer systems are so complex, defending them often requires people who can think like attackers.
Admittedly, there’s a difference between thinking like an attacker and acting like a criminal, and between researching vulnerabilities in fielded systems and exploiting those vulnerabilities for personal gain. But there is a huge variability in computer crime convictions, and — at least in the early days — many hacking convictions were unjust and unfair. And there’s also a difference between someone’s behavior as a teenager and his behavior later in life. Additionally, there might very well be a difference between someone’s behavior before and after a hacking conviction. It all depends on the person.
An employer’s goal should be to hire moral and ethical people with the skill set required to do the job. And while a hacking conviction is certainly a mark against a person, it isn’t always grounds for complete non-consideration.
“We don’t hire hackers” and “we don’t hire felons” are coarse generalizations, in the same way that “we only hire people with this or that security certification” is. They work — you’re less likely to hire the wrong person if you follow them — but they’re both coarse and flawed. Just as all potential employees with certifications aren’t automatically good hires, all potential employees with hacking convictions aren’t automatically bad hires. Sure, it’s easier to hire people based on things you can learn from checkboxes, but you won’t get the best employees that way. It’s far better to look at the individual, and put those check boxes into context. But we don’t always have time to do that.
Last winter, a Minneapolis attorney who works to get felons a fair shake after they’ve served their time told of a sign he saw: “Snow shovelers wanted. Felons need not apply.” It’s not good for society if felons who have served their time can’t even get jobs shoveling snow.
This essay previously appeared in Information Security as the first half of a point-counterpoint with Marcus Ranum. Marcus’s half is here.
Hiring people with security certifications:
Scenes from an Airport
Scene One: I’ve gotten to the front of the security line and handed the TSA officer my ID and ticket.
TSA Officer: (Looks at my ticket. Looks at my ID. Looks at me. Smiles.)
Me: (Smiles back.)
TSA Officer: (Looks at my ID. Looks at me. Smiles.)
Me: (Tips hat. Smiles back.)
TSA Officer: A beloved name from the blogosphere.
Me: And I always thought that I slipped through these lines anonymously.
TSA Officer: Don’t worry. No one will notice. This isn’t the sort of job that rewards competence, you know.
Me: Have a good day.
Scene Two (a few days later): I’ve gotten to the front of the security line at a different airport, and handed a different TSA officer my ID and ticket.
TSA Officer: (Looks everything over. Reads the name on my passport.) The Bruce Schneier?
Me: (Nods, managing not to say: “No no, just a Bruce Schneier; didn’t you hear I come in six-packs?”)
TSA Officer: The security expert?
TSA Officer: (Takes off his glove. Offers me his hand to shake.)
Me: (Shakes his hand.)
TSA Officer: I read your stuff all the time.
That’s twice in a row, after years of not being recognized by any TSA officer ever. This is starting to worry me.
News
The British High Court ruled that a software vendor’s EULA — which denied all liability for poor software — was not reasonable.
I wrote about software liabilities back in 2003.
Insect-based terrorism: sounds like fear mongering to me.
A recently declassified history of NSA computers through 1964.
Militarized marine mammals:
Interesting research in detecting browser history:
“Experimental Security Analysis of a Modern Automobile,” by a whole mess of authors:
This is an interesting piece of research evaluating different user interface designs by which applications disclose to users what sort of authority they need to install themselves. Given all the recent concerns about third-party access to user data on social networking sites (particularly Facebook), this is particularly timely research.
More interesting research: “What You See is What They Get: Protecting users from unwanted use of microphones, cameras, and other sensors,” by Jon Howell and Stuart Schechter. Apple seems to be taking some steps in this direction with the location sensor disclosure in the iPhone 4.0 operating system.
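The design principle behind that research can be caricatured in a few lines of code. This is a toy sketch of the general idea, not the authors’ system: applications never touch a sensor directly; access goes through a trusted proxy that lights a user-visible indicator whenever any application is reading the sensor, so what the user sees matches what the applications get.

```python
# Toy sketch (not from the paper): sensor access mediated by an
# OS-level proxy that drives a trusted on-screen indicator.

class SensorProxy:
    """Mediates application access to a sensor such as a microphone.

    The indicator is on exactly when at least one application holds
    access, so the user always sees when the sensor is in use.
    """

    def __init__(self, name):
        self.name = name
        self.indicator_on = False   # stands in for a trusted on-screen widget
        self._readers = set()

    def acquire(self, app):
        # Apps cannot bypass the proxy; granting access lights the indicator.
        self._readers.add(app)
        self.indicator_on = True

    def release(self, app):
        self._readers.discard(app)
        if not self._readers:
            self.indicator_on = False

    def read(self, app):
        if app not in self._readers:
            raise PermissionError(f"{app} has not acquired {self.name}")
        return b"sensor-data"       # placeholder sample


mic = SensorProxy("microphone")
mic.acquire("voip-app")
print(mic.indicator_on)   # True: the user can see the microphone is live
mic.release("voip-app")
print(mic.indicator_on)   # False: indicator goes dark once no app is reading
```

The point of the design is that the indicator is controlled by the trusted layer, not by the application, so a misbehaving app can’t record silently.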
LIGATT Security certainly hopes to scare people with this InfoSec television commercial.
If you see something, think twice about saying something:
An Android app for end-to-end encrypted cell phone calls:
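The core idea behind such apps can be sketched briefly. This is a minimal illustration, not the app’s actual protocol (real voice-encryption protocols such as ZRTP are considerably more involved): each caller generates an ephemeral Diffie-Hellman keypair, the two sides derive the same shared key, and a short authentication string (SAS) derived from that key is read aloud by both callers to detect a man-in-the-middle.

```python
# Sketch of Diffie-Hellman key agreement plus a short authentication
# string, the general pattern behind end-to-end encrypted calls.

import hashlib
import secrets

# The 2048-bit MODP group from RFC 3526 (group 14); generator g = 2.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD1"
    "29024E088A67CC74020BBEA63B139B22514A08798E3404DD"
    "EF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245"
    "E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3D"
    "C2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F"
    "83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3B"
    "E39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9"
    "DE2BCBF6955817183995497CEA956AE515D2261898FA0510"
    "15728E5A8AACAA68FFFFFFFFFFFFFFFF", 16)
G = 2

def keypair():
    """Generate an ephemeral private exponent and its public value."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_key(priv, their_pub):
    """Combine our private key with the other side's public value."""
    return pow(their_pub, priv, P)

def sas(key):
    # Hash the shared secret down to a 4-character string both callers
    # can compare over the (hard-to-spoof) voice channel itself.
    return hashlib.sha256(str(key).encode()).hexdigest()[:4]

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

k_alice = shared_key(alice_priv, bob_pub)
k_bob = shared_key(bob_priv, alice_pub)
assert k_alice == k_bob   # both sides derive the same key

# If the strings the two callers read aloud match, no one rewrote
# the key exchange in transit.
print("SAS:", sas(k_alice))
```

The clever part is the authentication step: because the SAS is spoken in the callers’ own voices, an attacker who substituted keys mid-exchange would have to fake both voices in real time to go undetected.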
Low-tech burglars to get lighter sentences in Louisiana. More precisely, burglars get increased sentences if they use Internet maps.
This is the kind of law that annoys me. Crimes are crimes, regardless of the ancillary technology used to plan them.
Who needs actual terrorists, when we can terrorize ourselves with poorly thought out emergency preparedness drills?
The pointlessness of voluntary security inspections:
Go read this article — “Setting impossible standards on intelligence” — on laying blame for the intelligence “failure” that allowed the Underwear Bomber to board an airplane on Christmas Day.
I’ve never been impressed with the “dots” that should have been connected regarding Abdulmutallab. On closer examination, they mostly evaporate. Nor do I consider Christmas Day a security failure. Plane lands safely, terrorist captured, no one hurt; what more do people want?
The OSS Simple Sabotage Field Manual from 1944.
Interesting article about the four stages of fear.
I’m in the middle of reading Dave Grossman’s book “On Killing: The Psychological Cost of Learning to Kill in War and Society.” He writes that “fight or flight” is actually “fight, flight, posture, or submit.”
How to spot a CIA officer, at least in the mid-1970s.
Bletchley Park archives to go online:
The Bletchley Park Museum really needs donations, if you’re so inclined.
Carly Fiorina wanted to scare Californians into voting for her.
Yes, terrorists kill — about as often as home appliances.
Canada is spending $1B on security for the G8/G20 Summit in June. The numbers are simply crazy; think about how much real security you could buy for that money.
DARPA research into a clean-slate network security redesign.
Earlier this week, the Ninth Workshop on Economics and Information Security (WEIS 2010) was held at Harvard. As always, it was a great workshop with some very interesting papers. Papers are on the conference website.
Ross Anderson liveblogged the event.
Botox as a terrorist threat:
Hi and Lois security cartoon.
This essay in The New York Times is a refreshingly cogent attempt at rational cost-benefit analysis:
And here’s another essay from BBC.com that is also nicely rational:
Behavioral profiling at airports:
Protecting cars with the Club:
Fifth Annual Movie-Plot Threat Contest Winner
On April 1, I announced the Fifth Annual Movie-Plot Threat Contest: “Your task, ye Weavers of Tales, is to create a fable or fairytale suitable for instilling the appropriate level of fear in children so they grow up appreciating all the lords do to protect them.”
On May 15, I announced the five semi-finalists. Voting continued through the end of the month, and the winner is:
The Gashlycrumb Terrors, by Laura
A is for anthrax, deadly and white.
B is for burglars who break in at night.
C is for cars that have minds of their own
and accelerate rapidly in a school zone.
D is for dynamite lit with a fuse.
E is for everything we have to lose.
F is for foreigners, different and strange.
G is for gangs and the crimes they arrange.
H is for hand lotion, more than three ounces;
let’s pray some brave agent soon sees it and pounces.
I is for implants (I’ll explain when you’re older).
J is for jokers who only grow bolder.
K is for kids who aren’t afraid
to play in the park or drink lemonade.
L is for lead in our toys and our food.
M is for Mom’s cavalier attitude.
N is for neighbors — you never can tell:
is that a book club or terrorist cell?
O is for ostrich, with head in the sand.
P is for plots to blow up Disneyland.
Q is for those who would question authorities.
R is for radical sects and minorities.
S is for satanists, who have been seen
to give children razor blades on Halloween.
T is for terrorists, by definition.
U is for uncensored acts of sedition.
V is for vigilance, our leaders’ tool
for keeping us safe, both at home and at school.
W is for warnings with colors and levels.
X is for xraying bags at all revels.
Y is for you! So don’t be a dope.
Z is for zero tolerance, our finest hope.
Laura, contact me with your address so I can send you your prize. Anyone interested in illustrating this, preferably in Edward Gorey’s style, should e-mail me first.
Outsourcing to an Indian Jail
This doesn’t seem like the best idea: “Authorities in the southern Indian state of Andhra Pradesh are planning to set up an outsourcing unit in a jail. The unit will employ 200 educated convicts who will handle back office operations like data entry, and process and transmit information.”
It’s not necessarily a bad idea, as long as misusable information isn’t being handled by the criminals. “The unit, which is expected to undertake back-office work for banks, will work round the clock with three shifts of 70 staff each.”
Okay, definitely a bad idea.
“Working in the unit will also be financially rewarding for the prisoners.” I’ll bet.
Schneier News
I’m speaking at the OECD Experts Workshop on Internet Intermediaries, on June 16 in Paris.
I’m delivering the keynote at the CCD CoE Conference on Cyber Conflict, on June 17 in Tallinn.
Terrorists Placing Fake Bombs in Public Places
Supposedly, the latest terrorist tactic is to place fake bombs — suspicious looking bags, backpacks, boxes, and coolers — in public places in an effort to paralyze the city and probe our defenses. The article doesn’t say whether or not this has actually ever happened, only that the FBI is warning of the tactic. “Citing an FBI informational document, ABC News reports a so called ‘battle of suspicious bags’ is being encouraged on a jihadist website.”
I have no doubt that this could happen, but I’m sure it’s not actual terrorists doing the planting. We’re so easy to terrorize that anyone can play; this is the equivalent of hacking in the real world. One solution is to continue to overreact, and spend even more money on these fake threats. The other is to refuse to be terrorized.
Reading Me
The number of different ways to read my essays, commentaries, and links has grown recently. Here’s the rundown:
You can read my writings daily on my blog.
These are reprinted on my Facebook page.
They are also reprinted on my LiveJournal feed.
You can follow them on Twitter.
You can subscribe to the RSS feed:
Or you can subscribe to the alternative RSS feed, if you prefer excerpts instead of full text:
Finally, you can read the same writing aggregated once a month and e-mailed directly to you: Crypto-Gram.
I think that about covers it for useful distribution formats right now.
Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.
Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers “Schneier on Security,” “Beyond Fear,” “Secrets and Lies,” and “Applied Cryptography,” and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT BCSG, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.
Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.
Copyright (c) 2010 by Bruce Schneier.