January 15, 2011
by Bruce Schneier
Chief Security Technology Officer, BT
A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.
For back issues, or to subscribe, visit <http://www.schneier.com/crypto-gram.html>.
You can read this issue on the web at <http://www.schneier.com/crypto-gram-1101.html>. These same essays and news items appear in the "Schneier on Security" blog at <http://www.schneier.com/blog>, along with a lively comment section. An RSS feed is available.
In this issue:
- Security in 2020
- Recording the Police
- Stealing SIM Cards from Traffic Lights
- Schneier News
- Book Review: Cyber War

Security in 2020
There's really no such thing as security in the abstract. Security can only be defined in relation to something else. You're secure from something or against something. In the next 10 years, the traditional definition of IT security -- that it protects you from hackers, criminals, and other bad guys -- will undergo a radical shift. Instead of protecting you from the bad guys, it will increasingly protect businesses and their business models from you.
Ten years ago, the big conceptual change in IT security was *deperimeterization*. A wordlike grouping of 18 letters with both a prefix and a suffix, it has to be the ugliest word our industry invented. The concept, though -- the dissolution of the strict boundaries between the internal and external network -- was both real and important.
There's more deperimeterization today than there ever was. Customer and partner access, guest access, outsourced e-mail, VPNs; to the extent there is an organizational network boundary, it's so full of holes that it's sometimes easier to pretend it isn't there. The most important change, though, is conceptual. We used to think of a network as a fortress, with the good guys on the inside and the bad guys on the outside, and walls and gates and guards to ensure that only the good guys got inside. Modern networks are more like cities, dynamic and complex entities with many different boundaries within them. The access, authorization, and trust relationships are even more complicated.
Today, two other conceptual changes matter. The first is *consumerization*. Another ponderous invented word, it's the idea that consumers get the cool new gadgets first, and demand to do their work on them. Employees already have their laptops configured just the way they like them, and they don't want another one just for getting through the corporate VPN. They're already reading their mail on their BlackBerrys or iPads. They already have a home computer, and it's cooler than the standard issue IT department machine. Network administrators are increasingly losing control over clients.
This trend will only increase. Consumer devices will become trendier, cheaper, and more integrated; and younger people are already used to using their own stuff on their school networks. It's a recapitulation of the PC revolution. The centralized computer center concept was shaken by people buying PCs to run VisiCalc; now it's iPads and Android smart phones.
The second conceptual change comes from cloud computing: our increasing tendency to store our data elsewhere. Call it *decentralization*: our email, photos, books, music, and documents are stored somewhere, and accessible to us through our consumer devices. The younger you are, the more you expect to get your digital stuff on the closest screen available. This is an important trend, because it signals the end of the hardware and operating system battles we've all lived with. Windows vs. Mac doesn't matter when all you need is a web browser. Computers become temporary; user backup becomes irrelevant. It's all out there somewhere -- and users are increasingly losing control over their data.
During the next 10 years, three new conceptual changes will emerge, two of which we can already see the beginnings of. The first I'll call *deconcentration*. The general-purpose computer is dying and being replaced by special-purpose devices. Some of them, like the iPhone, seem general purpose but are strictly controlled by their providers. Others, like Internet-enabled game machines or digital cameras, are truly special purpose. In 10 years, most computers will be small, specialized, and ubiquitous.
Even on what are ostensibly general-purpose devices, we're seeing more special-purpose applications. Sure, you could use the iPhone's web browser to access the *New York Times* website, but it's much easier to use the NYT's special iPhone app. As computers become smaller and cheaper, this trend will only continue. It'll be easier to use special-purpose hardware and software. And companies, wanting more control over their users' experience, will push this trend.
The second is *decustomerization* -- now I get to invent the really ugly words -- the idea that we get more of our IT functionality without any business relationship. We're all part of this trend: every search engine gives away its services in exchange for the ability to advertise. It's not just Google and Bing; most webmail and social networking sites offer free basic service in exchange for advertising, possibly with premium services for money. Most websites, even useful ones that take the place of client software, are free; they are either run altruistically or to facilitate advertising.
Soon it will be hardware. In 1999, Internet startup FreePC tried to make money by giving away computers in exchange for the ability to monitor users' surfing and purchasing habits. The company failed, but computers have only gotten cheaper since then. It won't be long before giving away netbooks in exchange for advertising will be a viable business. Or giving away digital cameras. Already there are companies that give away long-distance minutes in exchange for advertising. Free cell phones aren't far off. Of course, not all IT hardware will be free. Some of the new cool hardware will cost too much to be free, and there will always be a need for concentrated computing power close to the user -- game systems are an obvious example -- but those will be the exception. Where the hardware costs too much to just give away, however, we'll see free or highly subsidized hardware in exchange for locked-in service; that's already the way cell phones are sold.
This is important because it destroys what's left of the normal business relationship between IT companies and their users. We're not Google's customers; we're Google's product that they sell to their customers. It's a three-way relationship: us, the IT service provider, and the advertiser or data buyer. And as these noncustomer IT relationships proliferate, we'll see more IT companies treating us as products. If I buy a Dell computer, then I'm obviously a Dell customer; but if I get a Dell computer for free in exchange for access to my life, it's much less obvious whom I'm entering a business relationship with. Facebook's continual ratcheting down of user privacy in order to satisfy its actual customers -- the advertisers -- and enhance its revenue is just a hint of what's to come.
The third conceptual change I've termed *depersonization*: computing that removes the user, either partially or entirely. Expect to see more software agents: programs that do things on your behalf, such as prioritize your email based on your observed preferences or send you personalized sales announcements based on your past behavior. The "people who liked this also liked" feature on many retail websites is just the beginning. A website that alerts you if a plane ticket to your favorite destination drops below a certain price is simplistic but useful, and some sites already offer this functionality. Ten years won't be enough time to solve the serious artificial intelligence problems required to fully realize intelligent agents, but the agents of that time will be both sophisticated and commonplace, and they'll need less direct input from you.
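The "people who liked this also liked" feature the paragraph mentions can be reduced to a very small co-occurrence computation. A minimal sketch, with made-up item names and a bare co-occurrence count (real recommenders normalize for item popularity and operate at vastly larger scale):

```python
from collections import Counter

# "People who liked this also liked": recommend the items that most often
# appear alongside a given item in other users' histories.
histories = [
    {"alpha", "bravo", "charlie"},
    {"alpha", "bravo"},
    {"alpha", "delta"},
    {"bravo", "charlie"},
]

def also_liked(item, histories, top=2):
    counts = Counter()
    for h in histories:
        if item in h:
            counts.update(h - {item})   # count every co-purchased item
    return [i for i, _ in counts.most_common(top)]

print(also_liked("alpha", histories))   # "bravo" co-occurs most often
```

Even this toy version needs no input from the user beyond their past behavior, which is the point of the depersonization trend: the agent acts on observed preferences, not explicit requests.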
Similarly, connecting objects to the Internet will soon be cheap enough to be viable. There's already considerable research into Internet-enabled medical devices, smart power grids that communicate with smart phones, and networked automobiles. Nike sneakers can already communicate with your iPhone. Your phone already tells the network where you are. Internet-enabled appliances are already in limited use, but soon they will be the norm. Businesses will acquire smart HVAC units, smart elevators, and smart inventory systems. And, as short-range communications -- like RFID and Bluetooth -- become cheaper, everything becomes smart.
The "Internet of things" won't need you to communicate. The smart appliances in your smart home will talk directly to the power company. Your smart car will talk to road sensors and, eventually, other cars. Your clothes will talk to your dry cleaner. Your phone will talk to vending machines; they already do in some countries. The ramifications of this are hard to imagine; it's likely to be weirder and less orderly than the contemporary press describes it. But certainly smart objects will be talking about you, and you probably won't have much control over what they're saying.
One old trend: deperimeterization. Two current trends: consumerization and decentralization. Three future trends: deconcentration, decustomerization, and depersonization. That's IT in 2020 -- it's not under your control, it's doing things without your knowledge and consent, and it's not necessarily acting in your best interests. And this is how things will be when they're working as they're intended to work; I haven't even started talking about the bad guys yet.
That's because IT security in 2020 will be less about protecting you from traditional bad guys, and more about protecting corporate business models from you. Deperimeterization assumes everyone is untrusted until proven otherwise. Consumerization requires networks to assume all user devices are untrustworthy until proven otherwise. Decentralization and deconcentration won't work if you're able to hack the devices to run unauthorized software or access unauthorized data. Decustomerization won't be viable unless you're unable to bypass the ads, or whatever the vendor uses to monetize you. And depersonization requires the autonomous devices to be, well, autonomous.
In 2020 -- 10 years from now -- Moore's Law predicts that computers will be 100 times more powerful. That'll change things in ways we can't know, but we do know that human nature never changes. Cory Doctorow rightly pointed out that all complex ecosystems have parasites. Society's traditional parasites are criminals, but a broader definition makes more sense here. As we users lose control of those systems and IT providers gain control for their own purposes, the definition of "parasite" will shift. Whether they're criminals trying to drain your bank account, movie watchers trying to bypass whatever copy protection studios are using to protect their profits, or Facebook users trying to use the service without giving up their privacy or being forced to watch ads, parasites will continue to try to take advantage of IT systems. They'll exist, just as they always have existed, and -- like today -- security is going to have a hard time keeping up with them.
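The 100x figure is just compounding: assuming the common rule-of-thumb doubling period of about 18 months (my assumption, not a claim from the essay), ten years of doubling works out to roughly two orders of magnitude:

```python
# Back-of-the-envelope Moore's Law projection: performance doubles
# roughly every 18 months, so over 10 years:
years = 10
doubling_period_years = 1.5
growth = 2 ** (years / doubling_period_years)
print(f"~{growth:.0f}x")  # ~102x
```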
Welcome to the future. Companies will use technical security measures, backed up by legal security measures, to protect their business models. And unless you're a model user, the parasite will be you.
This essay was originally written as a foreword to "Security 2020," by Doug Howard and Kevin Prince.
Fake Amazon receipt generators can be used to scam Amazon Marketplace merchants:
They're also useful if you want to defraud your employer on expense reimbursement forms.
The FBI has been accused of planting backdoors in OpenBSD.
I doubt this is true. One, it's a very risky thing to do. And two, there are more than enough exploitable security vulnerabilities in a piece of code that large. Finding and exploiting them is a much better strategy than planting them. But maybe someone at the FBI *is* that dumb.
Hiding PETN from full-body scanners:
Stephen Colbert on the issue:
I like the phrase "architecture of fear":
Interesting article on computational forensics.
Adam Shostack on TSA threat modeling:
In this interview, TSA Administrator John Pistole is more realistic than one normally hears from the agency, though he still ducks some of the hard questions.
I am reminded of my own 2007 interview with then-TSA Administrator Kip Hawley.

Interesting interview with Viviane Reding, the vice president of the EU Justice Commission and head of privacy regulation.
Proprietary encryption in car immobilizers cracked.
Cyberwar movie plot from an actual thriller writer. It could make a good movie.
PlugBot: "PlugBot is a hardware bot. It's a covert penetration testing device designed for use during physical penetration tests. PlugBot is a tiny computer that looks like a power adapter; this small size allows it to go physically undetected all the while powerful enough to scan, collect and deliver test results externally."
Garfield Christmas comic.
Is it suspicious to photograph someone who is suspiciously taking photographs?
This interview discusses five books about terrorism (none of which I've read, by the way).
The TSA is now inspecting thermoses.
Civil War message decoded.
The key was "Manchester Bluff".
Home routers that automatically run Tor.
Guard towers at Walmart.
It's easy and cheap to eavesdrop on GSM calls.
Sony used an ECDSA signature scheme to protect the PS3. Trouble is, it didn't pay sufficient attention to its random number generator: the "random" signing nonce wasn't random, and reusing a nonce lets anyone recover the private key from the signatures.
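The nonce-reuse failure is simple algebra: an ECDSA signature is s = k^-1(z + r*d) mod n, so two signatures sharing the nonce k give two linear equations in the two unknowns k and d. A minimal sketch on a standard textbook toy curve (the curve, keys, and message hashes here are illustrative examples, not Sony's actual parameters):

```python
# Toy ECDSA over y^2 = x^3 + 2x + 2 (mod 17), a classic textbook curve
# with generator G = (5, 1) of prime order n = 19. Demonstrates the
# PS3-style failure: signing two messages with the SAME nonce k leaks
# the private key d.
p, a = 17, 2
G = (5, 1)
n = 19  # order of G

def inv(x, m):
    return pow(x, -1, m)  # modular inverse (Python 3.8+)

def ec_add(P, Q):
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None  # point at infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * inv(2 * y1, p) % p
    else:
        lam = (y2 - y1) * inv(x2 - x1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, P):  # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def sign(z, d, k):
    r = ec_mul(k, G)[0] % n
    s = inv(k, n) * (z + r * d) % n
    return r, s

d = 7          # private key
k = 11         # "random" nonce -- reused, which is the bug
z1, z2 = 5, 9  # hashes of two different messages (mod n)
r, s1 = sign(z1, d, k)
_, s2 = sign(z2, d, k)

# The attacker sees only (r, s1), (r, s2), z1, z2. Same r betrays the
# reused nonce; solving the two signature equations recovers k, then d:
k_rec = (z1 - z2) * inv(s1 - s2, n) % n
d_rec = (s1 * k_rec - z1) * inv(r, n) % n
print(d_rec == d)  # True
```

With the private key recovered, anyone can sign arbitrary code as if they were Sony, which is exactly what happened to the PS3.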
"SMS of death": messages you can send to crash other people's phones.
Be sure to read the response from one of the researchers.
The talk is online:
Good essay on the social dynamics of terror, separating "terror" from "terrorism."
James Fallows on political shootings.
"Homeland Security Hasn't Made Us Safer": This will be nothing new to Crypto-Gram readers, but it's nice to read other people saying it too.
Attacking high-frequency trading networks.
The security threat of forged law-enforcement credentials.
Interesting reading, mostly for the probable effects of a terrorist-sized nuclear bomb.
A loaded gun slips past the TSA. I'm not really worried about mistakes like this. Sure, a gun slips through occasionally, and a knife slips through even more often. (I'm sure the TSA doesn't catch 100% of all bombs in tests, either.) But these items are caught by the TSA often enough, and when the TSA does catch someone, they're going to call the police and totally ruin his day. A terrorist can't build a plot around succeeding. It's things like liquids that are the real problem. Because there are no consequences to trying -- the bottle of water just gets thrown into the trash -- a terrorist can repeatedly try until he succeeds in slipping it through.
I asked then-TSA Administrator Kip Hawley about this in 2007. He didn't have a good answer.

Recording the Police
I've written a lot on the "War on Photography," where normal people are harassed as potential terrorists for taking pictures of things in public. The article below is different; it's about recording the police, and how that often is illegal.
This is all important. Being able to record the police is one of the best ways to ensure that the police are held accountable for their actions. Privacy has to be viewed in the context of relative power. For example, the government has a lot more power than the people. So privacy for the government increases their power and increases the power imbalance between government and the people; it decreases liberty. Forced openness in government -- open government laws, Freedom of Information Act filings, the recording of police officers and other government officials, WikiLeaks -- reduces the power imbalance between government and the people, and increases liberty.
Privacy for the people increases their power. It also increases liberty, because it reduces the power imbalance between government and the people. Forced openness in the people -- NSA monitoring of everyone's phone calls and e-mails, the DOJ monitoring everyone's credit card transactions, surveillance cameras -- decreases liberty.
I think we need a law that explicitly makes it legal for people to record government officials when they are interacting with them in their official capacity. And this is doubly true for police officers and other law enforcement officials.
Anthony Graber, the Maryland motorcyclist in the article, had all the wiretapping charges cleared.
FBI monitoring credit card transactions:
My "War on Photography" essay:

Stealing SIM Cards from Traffic Lights
Johannesburg installed hundreds of networked traffic lights on its streets. The lights use a cellular modem and a SIM card to communicate.
Those lights introduced a security risk I'll bet no one gave a moment's thought to: that criminals might steal the SIM cards from the traffic lights and use them to make free phone calls. But that's exactly what happened.
Aside from the theft of phone service, repairing the vandalized traffic lights costs far more than the stolen SIM cards are worth.
I wrote about this general issue before:
"These crimes are particularly expensive to society because the replacement cost is much higher than the thief's profit. A manhole is worth $5-$10 as scrap, but it costs $500 to replace, including labor. A thief may take $20 worth of copper from a construction site, but do $10,000 in damage in the process. And the increased threat means more money being spent on security to protect those commodities in the first place.
"Security can be viewed as a tax on the honest, and these thefts demonstrate that our taxes are going up. And unlike many taxes, we don't benefit from their collection. The cost to society of retrofitting manhole covers with locks, or replacing them with less re-salable alternatives, is high; but there is no benefit other than reducing theft."

Schneier News
Last week, I spoke at an airport security conference hosted by EPIC: "The Stripping of Freedom: A Careful Scan of TSA Security Procedures." Here's the video of my half-hour talk.

Book Review: Cyber War
"Cyber War: The Next Threat to National Security and What to do About It" by Richard Clarke and Robert Knake, HarperCollins, 2010.
"Cyber War" is a fast and enjoyable read. This means you could give the book to your non-techy friends, and they'd understand most of it, enjoy all of it, and learn a lot from it. Unfortunately, while there's a lot of smart discussion and good information in the book, there's also a lot of fear-mongering and hyperbole. Since there's no easy way to tell someone what parts of the book to pay attention to and what parts to take with a grain of salt, I can't recommend it for that purpose. This is a pity, because parts of the book really need to be widely read and discussed.
The fear-mongering and hyperbole is mostly in the beginning. There, the authors describe the cyberwar of novels. Hackers disable air traffic control, delete money from bank accounts, cause widespread blackouts, release chlorine gas from chemical plants, and -- this is my favorite -- remotely cause your printer to catch on fire. It's exciting and scary stuff, but not terribly realistic. Even their discussions of previous "cyber wars" -- Estonia, Georgia, attacks against the U.S. and South Korea on July 4, 2009 -- are full of hyperbole. A lot of what they write is unproven speculation, but they don't say that.
Better is the historical discussion of the formation of the U.S. Cyber Command, but there are important omissions. There's nothing about the cyberwar fear-stoking that accompanied this: by the NSA's General Keith Alexander -- who became the first head of the command -- by Mike McConnell, the NSA's former director and now Senior Vice President at military contractor Booz Allen Hamilton, and by others. By hyping the threat, the former has amassed a lot of power, and the latter a lot of money. Cyberwar is the new cash cow of the military-industrial complex, and any political discussion of cyberwar should include this as well.
Also interesting is the discussion of the asymmetric nature of the threat. A country like the United States, which is heavily dependent on the Internet and information technology, is much more vulnerable to cyber-attacks than a less-developed country like North Korea. This means that a country like North Korea would benefit from a cyberwar exchange: they'd inflict far more damage than they'd incur. This also means that, in this hypothetical cyberwar, there would be pressure on the U.S. to move the war to another theater: air and ground, for example. Definitely worth thinking about.
Most important is the section on treaties. Clarke and Knake have a lot of experience with nuclear treaties, and have done considerable thinking about how to apply that experience to cyberspace. The parallel isn't perfect, but there's a lot to learn about what worked and what didn't, and -- more importantly -- *how* things worked and didn't. The authors discuss treaties banning cyberwar entirely (unlikely), banning attacks against civilians, limiting what is allowed in peacetime, stipulating no first use of cyber weapons, and so on. They discuss cyberwar inspections, and how these treaties might be enforced. Since cyberwar would be likely to result in a new worldwide arms race, one with a more precarious trigger than the nuclear arms race, this part should be read and discussed far and wide. Sadly, it gets lost in the rest of the book. And, since the book lacks an index, it can be hard to find any particular section after you're done reading it.
In the last chapter, the authors lay out their agenda for the future, which I largely agree with.
1. We need to start talking publicly about cyber war. This is certainly true. The threat of cyberwar is going to consume the sorts of resources we shoveled into the nuclear threat half a century ago, and a realistic discussion of the threats, risks, countermeasures, and policy choices is essential. We need more universities offering degrees in cyber security, because we need more expertise for the entire gamut of threats.
2. We need to better defend our military networks, the high-level ISPs, and our national power grid. Clarke and Knake call this the "Defensive Triad." The authors and I disagree strongly on how this should be done, but there is no doubt that it should be done. The two parts of that triad currently in commercial hands are simply too central to our nation, and too vulnerable, to be left insecure. And their value is far greater to the nation than it is to the corporations that own them, which means the market will not naturally secure them. I agree with the authors that regulation is necessary.
3. We need to reduce cybercrime. Even without the cyber warriors bit, we need to do that. Cybercrime is bad, and it's continuing to get worse. Yes, it's hard. But it's important.
4. We need international cyberwar treaties. I couldn't agree more about this. We do. We need to start thinking about them, talking about them, and negotiating them now, before the cyberwar arms race takes off. There are all kinds of issues with cyberwar treaties, and the book talks about a lot of them. However full of loopholes they might be, their existence will do more good than harm.
5. We need more research on secure network designs. Again, even without the cyberwar bit, this is essential. We need more research in cybersecurity, a lot more.
6. We need decisions about cyberwar -- what weapons to build, what offensive actions to take, who to target -- to be made as far up the command structure as possible. Clarke and Knake want the president to personally approve all of this, and I agree. Because of its nature, it can be easy to launch a small-scale cyber attack, and it can be easy for a small-scale attack to get out of hand and turn into a large-scale attack. We need the president to make the decisions, not some low-level military officer ensconced in a computer-filled bunker late one night.
This is great stuff, and a fine starting place for a national policy discussion on cybersecurity, whether it be against a military, espionage, or criminal threat. Unfortunately, for readers to get there, they have to wade through the rest of the book. And unless their bullshit detectors are already well-calibrated on this topic, I don't want them reading all the hyperbole and fear-mongering that comes before, no matter how readable the book.
Note: I read "Cyber War" in April, when it first came out. I wanted to write a review then, but found that while my Kindle is great for reading, it's terrible for flipping back and forth looking for bits and pieces to write about in a review. So I let the review languish. Finally, I borrowed a paper copy from my local library.
Cyber War: The Next Threat to National Security and What to do About It:
Some other reviews:
See also the reviews on the Amazon page:
Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.
Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers "Schneier on Security," "Beyond Fear," "Secrets and Lies," and "Applied Cryptography," and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT BCSG, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.
Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.
Copyright (c) 2011 by Bruce Schneier.