Schneier on Security
A blog covering security and security technology.
April 2009 Archives
Interesting article from The New York Times.
Because so many aspects of the American effort to develop cyberweapons and define their proper use remain classified, many of those officials declined to speak on the record. The White House declined several requests for interviews or to say whether Mr. Obama as a matter of policy supports or opposes the use of American cyberweapons.
I've written about cyberwar here.
From The Daily WTF:
Johnny was what you might call a "gym rat." In incredible shape from almost-daily gym visits, a tight Lycra tank top, iPod strapped to his sizable bicep, underneath which was a large black tribal tattoo. He scanned his finger on his way out, but the turnstile wouldn't budge.
Me on biometrics.
I wrote about electronic voting machines back in 2004.
Lots of high-tech gear, but that's not what makes schools safe:
Some of the noticeable security measures remain, but experts say the country is exploring a new way to protect kids from in-school violence: administrators now want to foster school communities that essentially can protect themselves with or without the high-tech gear.
Of course, there never was an epidemic of school shootings -- it just seemed that way in the media. And kids are much safer in schools than outside of them.
I've previously written about the piece of counterterrorism silliness known as the no-fly list:
Imagine a list of suspected terrorists so dangerous that we can't ever let them fly, yet so innocent that we can't arrest them -- even under the draconian provisions of the Patriot Act.
Turns out these people are so dangerous that they can't be allowed to fly over United States territory, even on a flight from Paris to Mexico.
What makes the whole incident even more interesting is that Air France had only sent its passenger manifest to the Mexicans, but now it is clear that Mexico shares this information with the United States.
This apparently non-ironic video warns that people might impersonate census workers in an effort to rob you. But while you shouldn't trust the ID of a stranger, you should trust that same stranger to give you a phone number where you can verify that ID. This, of course, makes no sense.
Preventing impersonation is hard.
Fuselier is one of the people Cullen spotlights in his retelling in order to clear up the historical record. Some of the confusion generated by Columbine was inevitable: Harris and Klebold started out wearing trench coats, for instance, but at some point removed them, giving the illusion that they were four people rather than two. The homemade pipe bombs they were tossing in all directions—down stairwells, onto the roof—only seemed to further the impression that there were more of them. And then there were the SWAT teams: students trapped inside the building would hear their rifle fire, assume it was the killers and report it to the media by cellphone, complicating the cops' efforts to keep them safe. "This was the first major hostage standoff of the cellphone age," Cullen notes. The police "had never seen anything like it."
Do you know what your data did last night? Almost none of the more than 27 million people who took the RealAge quiz realized that their personal health data was being used by drug companies to develop targeted e-mail marketing campaigns.
There's a basic consumer protection principle at work here, and it's the concept of "unfair and deceptive" trade practices. Basically, a company shouldn't be able to say one thing and do another: sell used goods as new, lie on ingredients lists, advertise prices that aren't generally available, claim features that don't exist, and so on.
Cloud computing is another technology where users entrust their data to service providers. Salesforce.com, Gmail, and Google Docs are examples; your data isn't on your computer -- it's out in the "cloud" somewhere -- and you access it from your web browser. Cloud computing has significant benefits for customers and huge profit potential for providers. It's one of the fastest growing IT market segments -- 69% of Americans now use some sort of cloud computing service -- but the business is rife with shady, if not outright deceptive, advertising.
Take Google, for example. Last month, the Electronic Privacy Information Center (I'm on its board of directors) filed a complaint with the Federal Trade Commission concerning Google's cloud computing services. On its website, Google repeatedly assures customers that their data is secure and private, while published vulnerabilities demonstrate that it is not. Google's not foolish, though; its Terms of Service explicitly disavow any warranty or any liability for harm that might result from Google's negligence, recklessness, malevolent intent, or even purposeful disregard of existing legal obligations to protect the privacy and security of user data. EPIC claims that's deceptive.
Facebook isn't much better. Its plainly written (and not legally binding) Statement of Principles contains an admirable set of goals, but its denser and more legalistic Statement of Rights and Responsibilities undermines a lot of it. One research group that studies these documents called it "democracy theater": Facebook wants the appearance of involving users in governance, without the messiness of actually having to do so. Deceptive.
These issues are not identical. RealAge is hiding what it does with your data. Google is trying to both assure you that your data is safe and duck any responsibility when it's not. Facebook wants to market a democracy but run a dictatorship. But they all involve trying to deceive the customer.
Cloud computing services like Google Docs, and social networking sites like RealAge and Facebook, bring with them significant privacy and security risks over and above traditional computing models. Unlike data on my own computer, which I can protect to whatever level I believe prudent, I have no control over any of these sites, nor any real knowledge of how these companies protect my privacy and security. I have to trust them.
This may be fine -- the advantages might very well outweigh the risks -- but users often can't weigh the trade-offs because these companies are going out of their way to hide the risks.
Of course, companies don't want people to make informed decisions about where to leave their personal data. RealAge wouldn't get 27 million members if its webpage clearly stated "you are signing up to receive e-mails containing advertising from pharmaceutical companies," and Google Docs wouldn't get five million users if its webpage said "We'll take some steps to protect your privacy, but you can't blame us if something goes wrong."
And of course, trust isn't black and white. If, for example, Amazon tried to use customer credit card info to buy itself office supplies, we'd all agree that that was wrong. If it used customer names to solicit new business from their friends, most of us would consider this wrong. When it uses buying history to try to sell customers new books, many of us appreciate the targeted marketing. Similarly, no one expects Google's security to be perfect. But if it didn't fix known vulnerabilities, most of us would consider that a problem.
This is why understanding is so important. For markets to work, consumers need to be able to make informed buying decisions. They need to understand both the costs and benefits of the products and services they buy. Allowing sellers to manipulate the market by outright lying, or even by hiding vital information, about their products breaks capitalism -- and that's why the government has to step in to ensure markets work smoothly.
Last month, Mary K. Engle, Acting Deputy Director of the FTC's Bureau of Consumer Protection, said: "a company's marketing materials must be consistent with the nature of the product being offered. It's not enough to disclose the information only in the fine print of a lengthy online user agreement." She was speaking about Digital Rights Management and, specifically, an incident where Sony used a music copy protection scheme without disclosing that it secretly installed software on customers' computers. DRM is different from cloud computing or even online surveys and quizzes, but the principle is the same.
Engle again: "if your advertising giveth and your EULA [license agreement] taketh away, don't be surprised if the FTC comes calling." That's the right response from government.
A version of this article originally appeared on The Wall Street Journal.
Not what you think; it's about forensics of the Squid web/proxy cache.
Note the squid stamp, though.
The RSA Conference organizers asked me to write a restaurant review column for their show daily -- distributed only electronically. I called my column "The Dining Cryptographer." Here are links to them. I reviewed two restaurants each day: one walking distance from Moscone Center, and one a taxi ride away.
The Crown Prosecution Service said there was insufficient evidence to press charges or hold them any longer.
Back during the debate for HR 1, I was amazed at how easily conservatives were willing to accept and repeat lies about spending in the stimulus package, even after those provisions had been debunked as fabrications. The $30 million for the salt marsh mouse is a perfect example, and Kagro X documented well over a dozen congressmen repeating the lie.
The Twitter InTheStimulus site appears to have been taken down.
There are several things going on here. First is confirmation bias, which is the tendency of people to believe things that reinforce their prior beliefs. But the second is the limited bandwidth of Twitter -- 140-character messages -- which makes it very difficult to authenticate anything. Twitter is an ideal medium to inject fake facts into society for precisely this reason.
EDITED TO ADD (5/14): False Twitter rumors about Swine Flu.
The problem is more widespread than you might think:
First lofted into orbit in the 1970s, the FLTSATCOM bird was at the time a major advance in military communications. Its 23 channels were used by every branch of the U.S. armed forces and the White House for encrypted data and voice, typically from portable ground units that could be quickly unpacked and put to use on the battlefield.
Conficker's April Fool's joke -- the huge, menacing build-up and then nothing -- is a good case study on how we think about risks, one whose lessons are applicable far outside computer security. Generally, our brains aren't very good at probability and risk analysis. We tend to use cognitive shortcuts instead of thoughtful analysis. This worked fine for the simple risks we encountered for most of our species's existence, but it's less effective against the complex risks society forces us to face today.
We tend to judge the probability of something happening on how easily we can bring examples to mind. It's why people tend to buy earthquake insurance after an earthquake, when the risk is lowest. It's why those of us who have been the victims of a crime tend to fear crime more than those who haven't. And it's why we fear a repeat of 9/11 more than other types of terrorism.
We fear being murdered, kidnapped, raped and assaulted by strangers, when friends and relatives are far more likely to do those things to us. We worry about plane crashes instead of car crashes, which are far more common. We tend to exaggerate spectacular, strange, and rare events, and downplay more ordinary, familiar, and common ones.
We also respond more to stories than to data. If I show you statistics on crime in New York, you'll probably shrug and continue your vacation planning. But if a close friend gets mugged there, you're more likely to cancel your trip.
And specific stories are more convincing than general ones. That is why we buy more insurance against plane accidents than against travel accidents, or accidents in general. Or why, when surveyed, we are willing to pay more for air travel insurance covering "terrorist acts" than "all possible causes". That is why, in experiments, people judge specific scenarios more likely than more general ones, even if the general ones include the specific.
Conficker's 1 April deadline was precisely the sort of event humans tend to overreact to. It's a specific threat, which convinces us that it's credible. It's a specific date, which focuses our fear. Our natural tendency to exaggerate makes it more spectacular, which further increases our fear. Its repetition by the media makes it even easier to bring to mind. As the story becomes more vivid, it becomes more convincing.
The New York Times called it an "unthinkable disaster", the television news show 60 Minutes said it could "disrupt the entire internet" and we at the Guardian warned that it might be a "deadly threat". Naysayers were few, and drowned out.
The first of April passed without incident, but Conficker is no less dangerous today. About 2.2m computers worldwide are still infected with Conficker.A and B, and about 1.3m more are infected with the nastier Conficker.C. It's true that on 1 April Conficker.C tried a new trick to update itself, but its authors could have updated the worm using another mechanism any day. In fact, they updated it on 8 April, and can do so again.
And Conficker is just one of many, many dangerous worms being run by criminal organisations. It came with a date and got a lot of press -- that 1 April date was more hype than reality -- but it's not particularly special. In short, there are many criminal organisations on the internet using worms and other forms of malware to infect computers. They then use those computers to send spam, commit fraud, and infect more computers. The risks are real and serious. Luckily, keeping your anti-virus software up-to-date and not clicking on strange attachments can keep you pretty secure. Conficker spreads through a Windows vulnerability that was patched in October. You do have automatic update turned on, right?
But people being people, it takes a specific story for us to protect ourselves.
This essay previously appeared in The Guardian.
Encrypting your USB drive is smart. Writing the encryption key on a piece of paper and attaching it to the USB drive is not.
Sometimes the basic tricks work best:
Police say a man posing as a waiter collected $186 in cash from diners at two restaurants in New Jersey and walked out with the money in his pocket.
Certainly he'll be caught if he keeps it up, but it's a good trick if used sparingly.
Does anyone have any other opinions?
Posting an excerpt would give it away.
General Dynamics Information Technology put out an ad last month on behalf of the Homeland Security Department seeking someone who could "think like the bad guy." Applicants, it said, must understand hackers' tools and tactics and be able to analyze Internet traffic and identify vulnerabilities in the federal systems.
Not a particularly subtle hack, but clever nonetheless.
EDITED TO ADD (4/20): Details of the hack.
EDITED TO ADD (4/29): More details.
Daniel Gardner's The Science of Fear was published last July, but I've only just gotten around to reading it. That was a big mistake. It's a fantastic look at how humans deal with fear: exactly the kind of thing I have been reading and writing about for the past couple of years. It's the book I wanted to write, and it's a great read.
Gardner writes about how the brain processes fear and risk, how it assesses probability and likelihood, and how it makes decisions under uncertainty. The book talks about all the interesting psychological studies -- cognitive psychology, evolutionary psychology, behavioral economics, experimental philosophy -- that illuminate how we think and act regarding fear. The book also talks about how fear is used to influence people, by marketers, by politicians, by the media. And lastly, the book talks about different areas where fear plays a part: health, crime, terrorism.
There have been a lot of books published recently that apply these new paradigms of human psychology to different domains -- to randomness, to traffic, to rationality, to art, to religion, and so on -- but after you read a few you start seeing the same dozen psychology experiments over and over again. Even I did it, when I wrote about the psychology of security. But Gardner's book is different: he goes further, explains more, demonstrates his point with the more obscure experiments that most authors don't bother seeking out. His writing style is both easy to read and informative, a nice mix of data and anecdote. The flow of the book makes sense. And his analysis is spot-on.
My only problem with the book is that Gardner doesn't use standard names for the various brain heuristics he talks about. Yes, his names are more intuitive and evocative, but they're wrong. If you have already read other books in the field, this is annoying because you have to constantly translate into standard terminology. And if you haven't read anything else in the field, this is a real problem because you'll be needlessly confused when you read about these things in other books and articles.
So here's a handy conversion chart. Print it out and tape it to the inside front cover. Print another copy out and use it as a bookmark.
That's it. That's the only thing I didn't like about the book. Otherwise, it's perfect. It's the book I wish I had written. Only I don't think I would have done as good a job as Gardner did. The Science of Fear should be required reading for...well, for everyone.
Here's a link from Powell's, if you're boycotting Amazon.
I like this one.
EDITED TO ADD (4/16): On further analysis, this seems more reasonable than I first thought.
From Foreign Policy:
8. If you are still having trouble working the Chinese or the Russian governments into your story, why not throw in some geopolitical kerfuffle that involves a country located in between? Not only would it implicate both governments, it would also make cyberspace seem relevant to geopolitics. I suggest you settle on Kyrgyzstan, as it would also help to make a connection to the US military bases; there is no better story than having Russian and Chinese hackers oust the US from Kyrgyzstan via cyber-attacks. Bonus points for mentioning Azerbaijan and the importance of cyberwarfare to the politics of the Caspian oil; in the worst case, Kazakhstan would do as well. Never mention any connectivity statistics for the countries you are writing about: you don't want readers to start doubting that someone might be interested in launching a cyberwar on countries that couldn't care less about the Internet.
Tweenbots are human-dependent robots that navigate the city with the help of pedestrians they encounter. Rolling at a constant speed, in a straight line, Tweenbots have a destination displayed on a flag, and rely on people they meet to read this flag and to aim them in the right direction to reach their goal.
Here's a tip: when walking around in public with secret government documents, put them in an envelope.
A huge MI5 and police counterterrorist operation against al-Qaeda suspects had to be brought forward at short notice last night after Scotland Yard's counter-terrorism chief accidentally revealed a briefing document.
Now the debate begins about whether he was just stupid, or very very stupid:
Opposition MPs criticised Mr Quick, with the Liberal Democrats describing him as "accident prone" and the Conservatives condemning his "very alarming" lapse of judgement.
It wasn't just a piece of paper. It was a secret piece of paper. (Here's the best blow-up of the picture.) And surely these people have procedures for transporting classified material. That's what the mistake was: not following proper procedure.
Yesterday I talked to at least a dozen reporters about this breathless Wall Street Journal story:
Cyberspies have penetrated the U.S. electrical grid and left behind software programs that could be used to disrupt the system, according to current and former national-security officials.
Read the whole story; there aren't really any facts in it. I don't know what's going on; maybe it's just budget season and someone is jockeying for a bigger slice.
Honestly, I am much more worried about random errors and undirected worms in the computers running our infrastructure than I am about the Chinese military. I am much more worried about criminal hackers than I am about government hackers. I wrote about the risks to our infrastructure here, and about Chinese hacking here.
And I wrote about last year's reports of international hacking of our SCADA control systems here.
The team of researchers, which includes graduate students David Choffnes (electrical engineering and computer science) and Dean Malmgren (chemical and biological engineering), and postdoctoral fellow Jordi Duch (chemical and biological engineering), studied connection patterns in the BitTorrent file-sharing network -- one of the largest and most popular P2P systems today. They found that over the course of weeks, groups of users formed communities where each member consistently connected with other community members more than with users outside the community.
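The pattern the researchers found -- users connecting with members of their own community more often than with outsiders -- is easy to quantify once you have connection logs and a community assignment. Here's a toy sketch of that measurement; the peer lists and community labels are invented for illustration and have nothing to do with the Northwestern team's actual data or methods.

```python
# Toy measurement of community structure in a P2P network:
# what fraction of each user's connections stay inside
# their own community? (Data invented for illustration.)
connections = {
    "a": {"b", "c", "x"},
    "b": {"a", "c"},
    "c": {"a", "b", "y"},
    "x": {"a", "y", "z"},
    "y": {"c", "x", "z"},
    "z": {"x", "y"},
}
community = {"a": 0, "b": 0, "c": 0, "x": 1, "y": 1, "z": 1}

def mixing_ratio(connections, community):
    """Average, over all users, of the fraction of a user's
    connections that go to members of the same community."""
    fractions = []
    for user, peers in connections.items():
        if not peers:
            continue
        inside = sum(1 for p in peers if community[p] == community[user])
        fractions.append(inside / len(peers))
    return sum(fractions) / len(fractions)

print(mixing_ratio(connections, community))  # well above 0.5: communities are real
```

A ratio near 0.5 would mean connections ignore community boundaries; the higher it climbs, the more the network has organized itself into cliques of the kind the study describes.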
I found this great paragraph in this article on the future of privacy in the UK:
One of the few home secretaries who dominated his department rather than be cowed by it was Lord Whitelaw in the 1980s. He boasted how after any security lapse, the police would come to beg for new and draconian powers. He laughed and sent them packing, saying only a bunch of softies would erode British liberty to give themselves an easier job. He said they laughed in return and remarked that "it was worth a try".
I'm going to tell you exactly how someone can trick you into thinking they're your friend. Now, before you send me hate mail for revealing this deep, dark secret, let me assure you that the scammers, crooks, predators, stalkers and identity thieves are already aware of this trick. It works only because the public is not aware of it. If you're scamming someone, here's what you'd do:
Like what? Lend me $500. When are you going out of town? Etc.
The author has no evidence that anyone has actually done this, but certainly someone will do this sometime in the future.
We have seen attacks by people hijacking existing social networking accounts:
Rutberg was the victim of a new, targeted version of a very old scam -- the "Nigerian," or "419," ploy. The first reports of such scams emerged back in November, part of a new trend in the computer underground -- rather than sending out millions of spam messages in the hopes of trapping a tiny fraction of recipients, Web criminals are getting much more personal in their attacks, using social networking sites and other databases to make their story lines much more believable.
The NSA had an incinerator in their old Arlington Hall facility that was designed to reduce top secret crypto materials and such to ash. Someone discovered that it wasn't in fact working. Contract disposal trucks had been disposing of this not-quite-sanitized rubbish, and officers tracked down a huge pile in a field in Ft. Myer.
Nice rundown of the statistics.
The single greatest killer of Americans is the so-called "lifestyle disease." Somewhere between half a million and a million of us get a short ride in a long hearse every year because of smoking, lousy diets, parking our bodies in front of the TV instead of operating them, and downing yet another six pack and / or tequila popper.
At least, according to U.S. law:
This is a very broad definition, and one that involves the intention of the weapon's creator as well as the details of the weapon itself.
In an e-mail, John Mueller commented:
As I understand it, not only is a grenade a weapon of mass destruction, but so is a maliciously-designed child's rocket even if it doesn't have a warhead. On the other hand, although a missile-propelled firecracker would be considered a weapon of mass destruction if its designers had wanted to think of it as a weapon, it would not be so considered if it had previously been designed for use as a weapon and then redesigned for pyrotechnic use or if it was surplus and had been sold, loaned, or given to you (under certain circumstances) by the Secretary of the Army.
Amusing, to be sure, but there's something important going on. The U.S. government has passed specific laws about "weapons of mass destruction," because they're particularly scary and damaging. But by generalizing the definition of WMDs, those who write the laws greatly broaden their applicability. And I have to wonder how many of those who vote in favor of the laws realize how general they really are, or -- if they do know -- vote for them anyway because they can't be seen to be "soft" on WMDs.
It reminds me of those provisions of the USA PATRIOT Act -- and other laws -- that created police powers to be used for "terrorism and other crimes."
EDITED TO ADD (4/14): Prosecutions based on this unreasonable definition.
Computer scientists Arvind Narayanan and Dr Vitaly Shmatikov, from the University of Texas at Austin, developed the algorithm which turned the anonymous data back into names and addresses.
In "De-anonymizing social networks," Narayanan and Shmatikov take an anonymous graph of the social relationships established through Twitter and find that they can actually identify many Twitter accounts based on an entirely different data source—in this case, Flickr.
Here's the paper.
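The core trick -- re-identifying nodes in an "anonymized" graph by matching their structure against a second, labeled graph -- can be sketched crudely. This is a deliberately simplified illustration on invented data, not the Narayanan-Shmatikov algorithm itself, which starts from a handful of seed matches and propagates identifications iteratively across graphs with millions of nodes.

```python
# Toy illustration of structural re-identification. All names,
# edges, and the scoring rule are invented; the real algorithm
# in the paper is far more robust than this sketch.

# Auxiliary graph with known identities (think: public Flickr contacts).
aux = {
    "alice": {"bob", "carol", "dave"},
    "bob":   {"alice", "dave"},
    "carol": {"alice"},
    "dave":  {"alice", "bob"},
}

# "Anonymized" target graph (think: scrubbed Twitter follower data)
# whose structure overlaps heavily with the auxiliary graph.
anon = {
    "u1": {"u2", "u3", "u4"},
    "u2": {"u1", "u4"},
    "u3": {"u1"},
    "u4": {"u1", "u2"},
}

# Assume a few seed nodes have already been matched by other means.
seeds = {"u2": "bob", "u4": "dave"}

def best_match(node, anon, aux, seeds):
    """Guess an anonymous node's identity by checking which
    already-identified seeds appear among its neighbors."""
    neighbor_ids = {seeds[n] for n in anon[node] if n in seeds}
    scores = {
        name: len(neighbor_ids & contacts)
        for name, contacts in aux.items()
        if name not in seeds.values()
    }
    return max(scores, key=scores.get)

print(best_match("u1", anon, aux, seeds))  # prints "alice"
```

Each new identification becomes another seed, so the de-anonymization snowballs -- which is why releasing a "scrubbed" social graph while a correlated public graph exists is so dangerous.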
By looking in the stomachs of three sperm whales stranded in the Bay of Biscay, Cherel recovered hundreds of beaks from 19 separate species -- 17 squids including the giant squid, the seven-arm octopus (the largest in the world) and the bizarre vampire squid. Together, these species represent a decent spread of the full diversity of deep-sea cephalopods.
On the threats of insiders, from Federal News Radio.
Before his arrest, Tom Berge stole lead roof tiles from several buildings in south-east England, including the Honeywood Museum in Carshalton, the Croydon parish church, and the Sutton high school for girls. He then sold those tiles to scrap metal dealers.
As a security expert, I find this story interesting for two reasons. First, amongst increasingly ridiculous attempts to ban, or at least censor, Google Earth, lest it help the terrorists, here is an actual crime that relied on the service: Berge needed Google Earth for reconnaissance.
But more interesting is the discrepancy between the value of the lead tiles to the original owner and to the thief. The Sutton school had to spend £10,000 to buy new lead tiles; the Croydon Church had to repair extensive water damage after the theft. But Berge only received £700 a ton from London scrap metal dealers.
This isn't an isolated story; the same dynamic is in play with other commodities as well.
There is an epidemic of copper wiring thefts worldwide; copper is being stolen out of telephone and power stations—and off poles in the streets—and thieves have killed themselves because they didn't understand the dangers of high voltage. Homeowners are returning from holiday to find the copper pipes stolen from their houses. In 2001, scrap copper was worth 70 cents per pound. In April 2008, it was worth $4.
Gasoline siphoning became more common as pump prices rose. And used restaurant grease, formerly either given away or sold for pennies to farmers, is being stolen from restaurant parking lots and turned into biofuels. Newspapers and other recyclables are stolen from curbs, and trees are stolen and resold as Christmas trees.
Iron fences have been stolen from buildings and houses, manhole covers have been stolen from the middle of streets, and aluminum guard rails have been stolen from roadways. Steel is being stolen for scrap, too. In 2004 in Ukraine, thieves stole an entire steel bridge.
These crimes are particularly expensive to society because the replacement cost is much higher than the thief's profit. A manhole cover is worth $5–$10 as scrap, but it costs $500 to replace, including labor. A thief may take $20 worth of copper from a construction site, but do $10,000 in damage in the process. And even if the thieves don't get to the copper or steel, the increased threat means more money being spent on security to protect those commodities in the first place.
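The asymmetry is worth making concrete. A back-of-the-envelope calculation, using the figures cited above, shows society paying a large multiple of what the thief pockets:

```python
# Back-of-the-envelope: society's loss vs. the thief's gain,
# using the figures cited above.
thefts = {
    # item: (thief's scrap proceeds, cost to society)
    "manhole cover": (10, 500),
    "copper from a construction site": (20, 10_000),
}

for item, (gain, cost) in thefts.items():
    print(f"{item}: thief gains ${gain}, society pays ${cost} "
          f"({cost / gain:.0f}x the thief's profit)")
```

A 50x to 500x multiplier means even a modest rise in scrap prices can impose enormous costs on everyone else.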
Security can be viewed as a tax on the honest, and these thefts demonstrate that our taxes are going up. And unlike many taxes, we don't benefit from their collection. The cost to society of retrofitting manhole covers with locks, or replacing them with less resalable alternatives, is high; but there is no benefit other than reducing theft.
These crimes are a harbinger of the future: evolutionary pressure on our society, if you will. Criminals are often referred to as social parasites; they leech off society but provide no useful benefit. But they are an early warning system of societal changes. Unfettered by laws or moral restrictions, they can be the first to respond to changes that the rest of society will be slower to pick up on. In fact, currently there's a reprieve. Scrap metal prices are all down from last year's—copper is currently $1.62 per pound, and lead is half what Berge got—and thefts are down along with them.
We've designed much of our infrastructure around the assumptions that commodities are cheap and theft is rare. We don't protect transmission lines, manhole covers, iron fences, or lead flashing on roofs. But if commodity prices really are headed for new higher stable points, society will eventually react and find alternatives for these items—or find ways to protect them. Criminals were the first to point this out, and will continue to exploit the system until it restabilizes.
A version of this essay originally appeared in The Guardian.
A story about a very expensive series of false positives. The German police spent years and millions of dollars tracking a mysterious killer whose DNA had been found at the scenes of six murders. Finally they realized they were tracking a worker at the factory that assembled the prepackaged swabs used for DNA testing.
This story could be used as justification for a massive DNA database. After all, if that factory worker had his or her DNA in the database, the police would have quickly realized what the problem was.
U.S. government cybersecurity is an insecure mess, and fixing it is going to take considerable attention and resources. Trying to make sense of this, President Barack Obama ordered a 60-day review of government cybersecurity initiatives. Meanwhile, the U.S. House Subcommittee on Emerging Threats, Cybersecurity, Science and Technology is holding hearings on the same topic.
One of the areas of contention is who should be in charge. The FBI, DHS and DoD -- specifically, the NSA -- all have interests here. Earlier this month, Rod Beckstrom resigned from his position as director of the DHS's National Cybersecurity Center, warning of a power grab by the NSA.
Putting national cybersecurity in the hands of the NSA is an incredibly bad idea. An entire parade of people, ranging from former FBI director Louis Freeh to Microsoft's Trustworthy Computing Vice President and former Justice Department computer crime chief Scott Charney, have told Congress the same thing at this month's hearings.
Cybersecurity isn't a military problem, or even a government problem -- it's a universal problem. All networks, military, government, civilian and commercial, use the same computers, the same networking hardware, the same Internet protocols and the same software packages. We all are the targets of the same attack tools and tactics. It's not even that government targets are somehow more important; these days, most of our nation's critical IT infrastructure is in commercial hands. Government-sponsored Chinese hackers go after both military and civilian targets.
Some have said that the NSA should be in charge because it has specialized knowledge. Earlier this month, Director of National Intelligence Admiral Dennis Blair made this point, saying "There are some wizards out there at Ft. Meade who can do stuff." That's probably not true, but if it is, we'd better get them out of Ft. Meade as soon as possible -- they're doing the nation little good where they are now.
Not that government cybersecurity failings require any specialized wizardry to fix. GAO reports indicate that government problems include insufficient access controls, a lack of encryption where necessary, poor network management, failure to install patches, inadequate audit procedures, and incomplete or ineffective information security programs. These aren't super-secret NSA-level security issues; these are the same managerial problems that every corporate CIO wrestles with.
We've all got the same problems, so solutions must be shared. If the government has any clever ideas to solve its cybersecurity problems, certainly a lot of us could benefit from those solutions. If it has an idea for improving network security, it should tell everyone. The best thing the government can do for cybersecurity world-wide is to use its buying power to improve the security of the IT products everyone uses. If it imposes significant security requirements on its IT vendors, those vendors will modify their products to meet those requirements. And those same products, now with improved security, will become available to all of us as the new standard.
Moreover, the NSA's dual mission of providing security and conducting surveillance means it has an inherent conflict of interest in cybersecurity. Inside the NSA, this is called the "equities issue." During the Cold War, it was easy; the NSA used its expertise to protect American military information and communications, and eavesdropped on Soviet information and communications. But what happens when both the good guys the NSA wants to protect, and the bad guys the NSA wants to eavesdrop on, use the same systems? They all use Microsoft Windows, Oracle databases, Internet email, and Skype. When the NSA finds a vulnerability in one of those systems, does it alert the manufacturer and fix it -- making both the good guys and the bad guys more secure? Or does it keep quiet about the vulnerability and not tell anyone -- making it easier to spy on the bad guys but also keeping the good guys insecure? Programs like the NSA's warrantless wiretapping program have created additional vulnerabilities in our domestic telephone networks.
Testifying before Congress earlier this month, former DHS National Cyber Security division head Amit Yoran said "the intelligence community has always and will always prioritize its own collection efforts over the defensive and protection mission of our government's and nation's digital systems."
Maybe the NSA could convince us that it's putting cybersecurity first, but its culture of secrecy will mean that any decisions it makes will be suspect. Under current law, extended by the Bush administration's extravagant invocation of the "state secrets" privilege when charged with statutory and constitutional violations, the NSA's activities are not subject to any meaningful public oversight. And the NSA's tradition of military secrecy makes it harder for it to coordinate with other government IT departments, most of which don't have clearances, let alone coordinate with local law enforcement or the commercial sector.
We need transparent and accountable government processes, using commercial security products. We need government cybersecurity programs that improve security for everyone. The NSA certainly has an advisory and a coordination role in national cybersecurity, and perhaps a more supervisory role in DoD cybersecurity -- both offensive and defensive -- but it should not be in charge.
A version of this essay appeared on The Wall Street Journal website.
The loss of two MOBA works to theft has drawn media attention and enhanced the museum's stature. In 1996, the painting Eileen, by R. Angelo Le, vanished from MOBA. Eileen was acquired from the trash by Wilson, and features a rip in the canvas where someone slashed it with a knife even before the museum acquired it, "adding an additional element of drama to an already powerful work," according to MOBA.
Be sure to notice the camera.
Let's face it, the War on Terror is a tired brand. There just isn't enough action out there to scare people. If this keeps up, people will forget to be scared. And then both the terrorists and the terror-industrial complex lose. We can't have that.
We're going to help revive the fear. There's plenty to be scared about, if only people would just think about it in the right way. In this Fourth Movie-Plot Threat Contest, the object is to find an existing event somewhere in the industrialized world -- Third World events are just too easy -- and provide a conspiracy theory to explain how the terrorists were really responsible.
The goal here is to be outlandish but plausible, ridiculous but possible, and -- if it were only true -- terrifying. (An example from The Onion: Fowl Qaeda.) Entries should be formatted as a news story, and are limited to 150 words (I'm going to check this time) because fear needs to be instilled in a population with short attention spans. Submit your entry, by the end of the month, in comments.
EDITED TO ADD: The contest has ended; the winner is here.
Powered by Movable Type. Photo at top by Per Ervland.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.