Essays in the Category “Theory of Security”
This essay appeared as a response to Edge's annual question, "what scientific term or concept ought to be more widely known?"
There's a concept from computer security known as a class break. It's a particular security vulnerability that breaks not just one system, but an entire class of systems. One example is a vulnerability in a particular operating system that allows an attacker to take remote control of every computer that runs that software. Another is a vulnerability in Internet-enabled digital video recorders and webcams that allows an attacker to recruit those devices into a massive botnet.
Distributed citizen groups and nimble hackers once had the edge. Now governments and corporations are catching up. Who will dominate in the decades ahead?
We're in the middle of an epic battle for power in cyberspace. On one side are the traditional, organized, institutional powers such as governments and large multinational corporations. On the other are the distributed and nimble: grassroots movements, dissident groups, hackers, and criminals. Initially, the Internet empowered the second side.
The latest Snowden document is the US intelligence 'black budget.' There's a lot of information in the few pages the Washington Post decided to publish, including an introduction by Director of National Intelligence James Clapper. In it, he drops a tantalizing hint: 'Also, we are investing in groundbreaking cryptanalytic capabilities to defeat adversarial cryptography and exploit internet traffic.'
Honestly, I'm skeptical. Whatever the NSA has up its top-secret sleeves, the mathematics of cryptography will still be the most secure part of any encryption system. I worry a lot more about poorly designed cryptographic products, software bugs, bad passwords, companies that collaborate with the NSA to leak all or part of the keys, and insecure computers and networks.
I jacked a visitor's badge from the Eisenhower Executive Office Building in Washington, DC, last month. The badges are electronic; they're enabled when you check in at building security. You're supposed to wear it on a chain around your neck at all times and drop it through a slot when you leave.
I kept the badge.
A core, not side, effect of technology is its ability to magnify power and multiply force—for both attackers and defenders. One side creates ceramic handguns, laser-guided missiles, and new identity-theft techniques, while the other side creates anti-missile defense systems, fingerprint databases, and automatic facial recognition systems.
The problem is that it's not balanced: Attackers generally benefit from new security technologies before defenders do. They have a first-mover advantage.
Doping in professional sports is back in the news, as the overwhelming evidence against Lance Armstrong led to his being stripped of his seven Tour de France titles and more. But instead of focusing on the issues of performance-enhancing drugs and whether professional athletes should be allowed to take them, I'd like to talk about the security and economic aspects of the issue.
Because drug testing is a security issue. Various sports federations around the world do their best to detect illegal doping, and players do their best to evade the tests.
A Debate between Sam Harris and Bruce Schneier
Return to Part 1
A profile that encompasses "anyone who could conceivably be Muslim" needs to include almost everyone. Anything less and you're missing known Muslim airplane terrorist wannabes.
SH: It includes a lot of people, but I wouldn't say almost everyone. In fact, I just flew out of San Jose this morning and witnessed a performance of security theater so masochistic and absurd that, given our ongoing discussion, it seemed too good to be true.
A Debate between Sam Harris and Bruce Schneier
Introduction by Sam Harris
I recently wrote two articles in defense of "profiling" in the context of airline security (1 & 2), arguing that the TSA should stop doing secondary screenings of people who stand no reasonable chance of being Muslim jihadists. I knew this proposal would be controversial, but I seriously underestimated how inflamed the response would be. Had I worked for a newspaper or a university, I could well have lost my job over it.
One thing that united many of my critics was their admiration for Bruce Schneier.
This essay was republished in Wired on February 24, 2014.
Security is a tradeoff, a balancing act between attacker and defender. Unfortunately, that balance is never static. Changes in technology affect both sides. Society uses new technologies to decrease what I call the scope of defection -- what attackers can get away with -- and attackers use new technologies to increase it.
I can put my cash card into an ATM anywhere in the world and take out a fistful of local currency, while the corresponding amount is debited from my bank account at home. I don't even think twice: regardless of the country, I trust that the system will work.
The whole world runs on trust. We trust that people on the street won't rob us, that the bank we deposited money in last month returns it this month, that the justice system punishes the guilty and exonerates the innocent.
At a security conference recently, the moderator asked the panel of distinguished cybersecurity leaders what their nightmare scenario was. The answers were the predictable array of large-scale attacks: against our communications infrastructure, against the power grid, against the financial system, in combination with a physical attack.
I didn't get to give my answer until the afternoon, which was: "My nightmare scenario is that people keep talking about their nightmare scenarios."
There's a certain blindness that comes from worst-case thinking. An extension of the precautionary principle, it involves imagining the worst possible outcome and then acting as if it were a certainty.
Security technologist and author Bruce Schneier looks at the age-old problem of insider threat
Rajendrasinh Makwana was a UNIX contractor for Fannie Mae. On October 24, he was fired. Before he left, he slipped a logic bomb into the organisation's network. The bomb would have "detonated" on January 31. It was programmed to disable access to the server on which it was running, block any network monitoring software, systematically and irretrievably erase everything, and then replicate itself on all 4,000 Fannie Mae servers.
There are several ways two people can divide a piece of cake in half. One way is to find someone impartial to do it for them. This works, but it requires another person. Another way is for one person to divide the piece, and the other person to complain (to the police, a judge, or his parents) if he doesn't think it's fair.
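The second variant above can be sketched as a tiny protocol: one party divides, the other either accepts or complains to an impartial arbiter, who upholds the split only if it is close enough to fair. A minimal sketch, with made-up values and a hypothetical fairness tolerance (none of these numbers are from the essay):

```python
# Hypothetical sketch of "one divides, the other complains" cake division.
# The 5% fairness tolerance and the specific cut are illustrative assumptions.

def divide(cake, fraction):
    """The divider cuts the cake into two pieces."""
    return cake * fraction, cake * (1 - fraction)

def accepts(piece, cake, tolerance=0.05):
    """The other party complains unless their piece is close enough to half."""
    return piece >= cake / 2 - cake * tolerance

def arbiter_upholds_split(piece_a, piece_b, tolerance=0.05):
    """The impartial arbiter lets the split stand only if the two
    pieces differ by no more than the tolerance."""
    total = piece_a + piece_b
    return abs(piece_a - piece_b) <= total * tolerance

cake = 1.0
piece_a, piece_b = divide(cake, 0.7)   # the divider cuts unfairly
if not accepts(piece_b, cake):
    # The complaint succeeds: the arbiter rejects the 70/30 split.
    print("Complaint upheld:", not arbiter_upholds_split(piece_a, piece_b))
```

The point of the design is where the trust sits: the divider has an incentive to cheat, the other party has an incentive to complain, and fairness is enforced only because both trust the arbiter.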
An employee of Whole Foods in Ann Arbor, Michigan, was fired in 2007 for apprehending a shoplifter. More specifically, he was fired for touching a customer, even though that customer had a backpack filled with stolen groceries and was running away with them.
I regularly see security decisions that, like the Whole Foods incident, seem to make absolutely no sense. However, in every case, the decisions actually make perfect sense once you understand the underlying incentives driving the decision.
Sports referees are supposed to be fair and impartial. They're not supposed to favor one team over another. And they're most certainly not supposed to have a financial interest in the outcome of a game.
Tim Donaghy, referee for the National Basketball Association, has been accused of both betting on basketball games and fixing games for the mob.
Schneier on Security is a personal website. Opinions expressed are not necessarily those of Resilient, an IBM Company.