Hacking Team, Computer Vulnerabilities, and the NSA

When the National Security Agency (NSA)—or any government agency—discovers a vulnerability in a popular computer system, should it disclose it or not? The debate exists because vulnerabilities have both offensive and defensive uses. Offensively, vulnerabilities can be exploited to penetrate others’ computers and networks, either for espionage or destructive purposes. Defensively, publicly revealing security flaws can be used to make our own systems less vulnerable to those same attacks. The two options are mutually exclusive: either we can help to secure both our own networks and the systems we might want to attack, or we can keep both networks vulnerable. Many, myself included, have long argued that defense is more important than offense, and that we should patch almost every vulnerability we find. Even the President’s Review Group on Intelligence and Communications Technologies recommended in 2013 that “U.S. policy should generally move to ensure that Zero Days are quickly blocked, so that the underlying vulnerabilities are patched on U.S. Government and other networks.”

Both the NSA and the White House have talked about a secret “vulnerability equities process” they go through when they find a security flaw. Both maintain that the process is heavily weighted in favor of disclosing vulnerabilities to the vendors and having them patched.

An undated document—declassified last week with heavy redactions after a year-long Freedom of Information Act lawsuit—shines some light on the process but still leaves many questions unanswered. An important question is: which vulnerabilities go through the equities process, and which don’t?

A real-world example of the ambiguity surrounding the equities process emerged from the recent hacking of the cyberweapons arms manufacturer Hacking Team. The corporation sells Internet attack and espionage software to countries around the world, including many reprehensible governments, which use it to eavesdrop on their citizens, sometimes as a prelude to arrest and torture. The company’s tools have also been used against U.S. journalists.

In July, unidentified hackers penetrated Hacking Team’s corporate network and stole almost everything of value, including corporate documents, e-mails, and source code. The hackers proceeded to post it all online.

The NSA was most likely able to penetrate Hacking Team’s network and steal the same data; the agency probably did it years ago. It would have learned the same things about Hacking Team’s software that we did in July: how it worked, what vulnerabilities it exploited, and which countries were using its cyber weapons. Armed with that knowledge, the NSA could have quietly neutralized many of the company’s products. The United States could have alerted software vendors about the zero-day exploits and had them patched. It could have told the antivirus companies how to detect and remove Hacking Team’s malware. It could have done a lot. Assuming that the NSA did infiltrate Hacking Team’s network, the fact that the United States chose not to reveal the vulnerabilities it uncovered is both revealing and interesting, and the decision provides a window into the vulnerability equities process.

The first question to ask is: why? There are three possible reasons. One, the software was also being used by the United States, and the government did not want to lose its benefits. Two, the NSA was able to eavesdrop on other entities using Hacking Team’s software, and it wanted to continue benefiting from the intelligence. And three, the agency did not want to expose its own hacking capabilities by demonstrating that it had compromised Hacking Team’s network. In reality, the decision may have been due to a combination of the three.

How was this decision made? More explicitly, did any vulnerabilities that Hacking Team exploited, and the NSA was aware of, go through the vulnerability equities process? It is unclear. The NSA plays fast and loose when deciding which security flaws go through the procedure. The process document states that it applies to vulnerabilities that are “newly discovered and not publicly known.” Does that refer only to vulnerabilities discovered by the NSA, or does the process also apply to zero-day vulnerabilities that the NSA discovers others are using? If vulnerabilities used in others’ cyber weapons are excluded, it is very difficult to talk about the process as it is currently formulated.

The U.S. government should close the vulnerabilities that foreign governments are using to attack people and networks. If taking action is as easy as plugging security vulnerabilities in products and making everyone in the world more secure, that should be standard procedure. The fact that the NSA—we assume—chose not to suggests that the United States has its priorities wrong.

Undoubtedly, there would be blowback from closing vulnerabilities utilized in others’ cyber weapons. Several companies sell information about vulnerabilities to different countries, and if they found that those security gaps were regularly closed soon after they started trying to sell them, they would quickly suspect espionage and take more defensive precautions. The new wariness of sellers and decrease in available security flaws would also raise the price of vulnerabilities worldwide. The United States is one of the biggest buyers, meaning that we benefit from greater availability and lower prices.

If we assume the NSA has penetrated these companies’ networks, we should also assume that the intelligence agencies of countries like Russia and China have done the same. Are those countries using Hacking Team’s vulnerabilities in their cyber weapons? We are all embroiled in a cyber arms race—finding, buying, stockpiling, using, and exposing vulnerabilities—and our actions will affect the actions of all the other players.

It seems foolish that we would not take every opportunity to neutralize the cyberweapons of those countries that would attack the United States or use them against their own people for totalitarian gain. Is it truly possible that when the NSA intercepts and reverse-engineers a cyberweapon used by one of our enemies—whether a Hacking Team customer or a country like China—we don’t close the vulnerabilities that that weapon uses? Does the NSA use knowledge of the weapon to defend the U.S. government networks whose security it maintains, at the expense of everyone else in the country and the world? That seems incredibly dangerous.

In my book Data and Goliath, I suggested breaking apart the NSA’s offensive and defensive components, in part to resolve the agency’s internal conflict between attack and defense. One part would be focused on foreign espionage, and another on cyberdefense. This Hacking Team discussion demonstrates that even separating the agency would not be enough. The espionage-focused organization that penetrates and analyzes the products of cyberweapons arms manufacturers would regularly learn about vulnerabilities used to attack systems and networks worldwide. Thus, that section of the agency would still have to transfer that knowledge to the defense-focused organization. That is not going to happen as long as the United States prioritizes surveillance over security and attack over defense. The norms governing actions in cyberspace need to be changed, a task far more difficult than any reform of the NSA.

This essay previously appeared in the Georgetown Journal of International Affairs.

EDITED TO ADD: Hacker News thread.

Posted on September 15, 2015 at 6:38 AM • 43 Comments


Evan September 15, 2015 7:15 AM

Actually, the interesting possibility is that the NSA penetrated Hacking Team’s systems to acquire new assets (vulnerabilities) for its mission. Even if they didn’t use the products themselves, access to their data could enhance the NSA’s own tools.

One would also assume the NSA would be sure to harden their own systems against these vulnerabilities, but the people in the NSA devoted to improving security and those devoted to compromising it don’t seem to talk to each other much.

Ramriot September 15, 2015 7:53 AM

What I find fascinating, Bruce, is that amongst the supposition and projection of this post you did not pause to consider the opposite possibility, even though, within the frame of the post, it has equal likelihood.

What if the NSA etc. has NOT penetrated this Italian company’s servers, for which there could be a multitude of reasons? Someone did, a someone who treasures openness over privacy, so that now we know a little better the scope of the vulnerability exploitation business.

Either way, it is immaterial to the subject of the post; at best it is filler, and at worst it distracts from the main thrust.

The NSA’s dual role is to protect the US and provide the means to attack non-US targets. Unfortunately, most computer vulnerabilities cover both camps equally, so we would like to be assured their policy errs on the protection side, but we are just not sure we can trust them that it does.

Benhm3 September 15, 2015 8:23 AM

I may be amplifying Ramriot’s remark: What if the NSA never did penetrate the Hacking Team’s site/server(s)?

I assume Bruce has nearly asymptotic certainty in his argument that they did, and I feel underdressed basing my position on the crappier side of human behavior so heavily amplified in large organizations (laziness, stupidity, greed, CYA, inertia.) But what if the NSA never got there? What if they missed this one, and ever so many others?

We’re ascribing to them nearly god-like powers of observation, collection, and integration but what if they’re actually quite bad at the 3rd item? Instead of chrome-plated, sleek, streamlined killbots, they’re actually more like a bunch of Gru’s minions?

I grant that depending on my humorous comparison is less than a fig leaf’s coverage in a hurricane, that the most inefficient integration of modestly decent data collection will eventually ruin the civil rights of a high percentage of humans. But still, assuming these guys are both smart enough to be restrained in their self-promotion AND star-spangled awesome at their jobs would lead me to believe that drones would be automagically eliminating threats with a far higher success rate.

Yes, humor was employed in this post, but it doesn’t diminish the basic question: What if these guys are bigger putzes than we think?

IanLB September 15, 2015 9:01 AM

It seems foolish that we would not take every opportunity to neutralize the cyberweapons of those countries that would attack the United States or use them against their own people for totalitarian gain.

I don’t know, Bruce. I’m far from certain about my own opinion on the matter, but I could also say that it would seem foolish for the US to hamper its ability to perform offensive cyber operations by systematically publishing known vulnerabilities, in a context where attackers have such a huge advantage over defenders – a situation that could remain true for a long time.

Furthermore, it’s not enough to publish vulnerabilities, we need to patch them, and that’s by far the most difficult part. Even well known, very publicized bugs end up taking years to be properly mitigated across the industry, and it won’t be any faster for more obscure ones.

I’m not sure it would be wise for the NSA to abandon (or at least seriously restrict) activities they are currently the best at, just for potentially marginal defensive benefits that may not help the situation of American organisations and would, by definition, have to be shared with the rest of the world (thus making the NSA finance the defense of everyone else).

CaptDerWould September 15, 2015 9:39 AM

For those focusing on whether the NSA did or did not penetrate Hacking Team’s systems, I think you are missing the larger point. The offensive capabilities of any government institution will win out against any defensive program. Snowden warned about this, and it was one of the reasons he decided to become a whistle-blower. (http://www.pbs.org/wgbh/nova/next/military/snowden-transcript/)

The U.S. government and companies hold the largest data troves in the world. As the largest target, we should be actively shoring up our defenses, not ignoring them in favor of a shiny new weapon that we might be able to use someday.

r September 15, 2015 10:49 AM

@peter: NSA stands for ‘no straight answers’.

unrelated: i don’t think hacking Hacking Team’s infrastructure would be necessary, given the NSA’s intentionally top-down deployments. those exploits could easily be obtained in the wild, either from the originator or from the users of Hacking Team’s software themselves. PLUS – who’s to say that companies such as Hacking Team aren’t classified as law-enforcement and protected IP, and thus respected boundaries? i’m not sure the exploits are where all the riches are; the customer lists give active intelligence targets.

948164 September 15, 2015 10:55 AM

@Bruce: It’s the National Security Agency, not Administration. A typo? Or maybe you were poking fun at their enormous authoritarian power.

Lev September 15, 2015 11:43 AM


When the OPM outsourced their system administration to someone physically located in China, I doubt there was any technological measure that would have prevented the hack.

K.S. September 15, 2015 12:27 PM

I think securing any networked system against state actors is an impossible task. There are two ways to effectively hide something: a) take it completely offline, like Russia with mechanical typewriters, or b) drown it in lots of ‘noise’ data, like the NSA with mass collection.

I think that, as an information security practitioner, what state actors do is largely irrelevant. There is no threat model that would be adequate for such a task, and there won’t be any usability left if we try.

Bob S. September 15, 2015 12:35 PM

A few months after the very first commercial telephone service was created in the USA, police were pounding on the corporate door demanding and getting covert access. That was when police literally stationed themselves in the basement of the telecom office or target building using alligator clips to physically hack telephone lines. It’s been that way ever since, all over the world. Now the virtual alligator clips are on the WWW fiber optic backbone.

I don’t see how changing the NSA organization chart would change the way they operate, at all.

What is different between now and then is the quantity and quality of the data collected. Then, only the most important targets could be monitored, due to physical limitations and cost. Not any more. Everyone in the world is a target. Literally. We are targets for all manner of purposes, including political vendettas, corporate espionage, and the furthering of war profiteering.

Legal restrictions didn’t stop government agents then, or now.

I strongly object to current military-police leaders who refer to good American citizens as “adversaries” merely because they protest or criticize brazen lawless intrusions and mass collection of electronic data. If they view American citizens as the enemy, what does that make them in return?

The worldwide mass-surveillance paradigm was revealed and clarified by the Snowden revelations. So was the response of world governments, which is: “stay the course.”

So far the response of the majority of people is to comply, ignore or twiddle. Maybe that’s our future.

albert September 15, 2015 1:51 PM

“…hacking team aren’t classified as law-enforcement and protected IP and thus respected boundaries…”
No. NSA doesn’t give a RSA about IP or LE, or respecting boundaries. Only if Hacking Team was a CIA asset, might they have to avoid it, because CIA has more mojo. Speaking of which:
What if certain sections of the IC actually execute hacks on US websites, to maintain the nation’s hate/fear of the bogeyman du jour? Do we know anything about HT, or what their products actually do? If I would agree with anything The Enemy says, it would be that ‘hacking’ HT (and its ilk) is OK.
. .. . .. o

tyr September 15, 2015 2:36 PM

I think what we have seen quite clearly is a failure of leadership, or what is more properly named a lack of the qualities of leaders. What we have in these bureaucracies is management. That might work if you are singly directed towards making money, but it doesn’t work worth a crap if you are not supposed to be singly focused. Humans are very bad at dealing with complexity and at seeing the long-term consequences of policies. The IC is horrible at it from an institutional viewpoint: not only does the left hand not know what the right hand is doing, it is forced to stay unaware by internal policies. Applying external pressures to a mess like this is just teaching pigs to fly. It irritates the pigs, it won’t make them fly, and it wastes the time and energy of the teachers.

We do need an active government agency that does computer defense in partnership with commercial actors, but the NSA is the worst possible choice for that job. The job of any needed offense should be transferred to military departments, who should maintain their own IC as part of the job. One thing is clear: the IC complex has been a total failure for more than 100 years, because they have done far more harm than good by any objective measure.

Here’s a news flash for them. The idea is to have fewer enemies when you do something, not to make more with every misguided, clumsy attempt to manufacture an arcane

We also need to get law enforcement out of the IC and back to doing police work. That means throwing the next bunch of them who manufacture a crime with entrapment into their own jail, as an example to the rest. It’s an example of “rule of law” and would send a clear message.

It is easy to lead: all you have to do is set a good example, reward competence, punish failure, and have a goal that you communicate to your organization. You are not managing anything if you’re leading.

Jacob September 15, 2015 3:19 PM

This “vulnerability equities process” could be a perfect assignment for the “Chief Risk Officer,” Anne Neuberger, who was recruited last year by Adm. Rogers for the newly created position. Ms. Neuberger, a 39-year-old from an ultra-orthodox household, previously served at the Pentagon under Gates and on the core team of the US Cyber Command as a project facilitator.


In an interview a couple of days ago, she indicated that her duty at the agency is to weigh operational, technical, and strategic risks vis-à-vis cost-benefit analysis.

Go for it!

r September 15, 2015 4:05 PM

@albert, yeah, the ‘avoidance’ of ht was an ignorant statement on my part.
but i really do believe that targeting ht solely for exploits is a little silly; the real gold would be in their contacts/clients/victims, not to mention the level of plausibility provided by the appropriation and redeployment of their software, or a strategic sed s/unsigned/signed/g of its source.

Allen September 15, 2015 4:07 PM

The larger question is whether covert programs, especially of the illicit variety, are a net gain or a net loss. The biggest obstacle to answering that question seems to be that, by their very nature, we don’t know the benefits of clandestine programs. As the IC is fond of saying, their failures make headlines, but their successes will never be known. (That also happens to be very convenient justification, because it boils down to this: “We’re doing a good job because we say we are, and you just have to trust us.”)

But is that true? Are successes hidden? If they are, we may not be able to point to them, but we can certainly put limits on them by simply observing reality. We can say unequivocally what the IC has not accomplished, and so any degree of success must necessarily be less than that.

For example, intelligence has not led to a bastion of democracy in the Middle East, or anywhere else that it did not arise organically. Indeed, most efforts toward that goal have had the opposite result. There is over a half-century’s worth of data to illustrate that point, with zero successes, and failures that are staggering in both quantity and scale. Meanwhile, the IC failed to predict, let alone bring about, the collapse of the USSR, the secession of the Soviet states, and the (somewhat) democratic governments that followed.

The Intelligence Community has not led to the downfall of radical Islam, despite an intense effort over the past 15 years. It has, however, fueled anger over covert drone strikes. It has also directly contributed to an unnecessary war, which created the fertile, if putrid, soil in which ISIL has bloomed. In justifying its own existence, it has magnified a relatively small threat into one that will engulf the world in flames. It has either focused on the grandiose intent of radical organizations at the expense of losing sight of their anemic capability, or else it has deliberately allowed the public to do the same. (This is par for the course, though, as there is a long history of overstated threats presented by bit players from South America to Southeast Asia, whether from communism, or terrorism, or drugs.)

The Intelligence Community neither warned us of, nor protected us against, either Pearl Harbor or 9/11. The fact that larger attacks have not occurred is likely a testament to our deterrent capabilities rather than our omniscience. The lone exception may arguably be the Cuban Missile Crisis, but that was not necessarily anything more than posturing, and it merely required aerial observation to confirm the testimony of Oleg Penkovsky, a walk-in source. Indeed, walk-ins are still among the most important intelligence assets, much the way our own defectors have been our greatest liabilities.

The Intelligence Community has not given us a secure network infrastructure. On the contrary, they have made extraordinary efforts to compromise any security that does or may exist, for their own interests, which seem to be aligned with the public’s interest more in theory than in reality. Indeed, they have made the public less secure against real and direct harm that occurs every day online in exchange for the ability to monitor that network for potential threats. It’s the wrong tradeoff. The internet isn’t a pawn, it’s a queen, and it should be protected except in existential circumstances — assuring total victory, or escaping defeat.

The problem with illicit covert operations, and the reason it is so hard to find success, and so easy to find failure, is not because success is secret. If success is so marginal that its very existence can be kept secret, then it’s a shallow victory indeed.

The problem is that most illicit covert operations are in direct conflict with our values. We do not value secret trials, secret evidence, and covert justice. We do not value punishment without due process, let alone execution. Criminal proceedings project more power and garner more respect than targeted strikes ever will. We do not believe that government should be allowed to operate in any significant way without at least the knowledge of its electorate, if not the approval.

Once we accept that illicit covert activities do more harm than good — and denying that is willful disregard for reality at this point — then the answer to this question becomes apparent: Should we use vulnerabilities for defense? Of course we should. If our networks and infrastructure were secure, instead of porous, international businesses would be flocking to us instead of fleeing, as they are today.

And unlike weapons, we cannot stockpile vulnerabilities for, say, retaliatory strikes. They have very short lifespans. They’re discovered, they’re either used or disclosed, and then they’re fixed. If you sit on it, someone else will find it. Use it or lose it.

So let’s strengthen our infrastructure instead of trying to cripple someone else’s, because it’s one and the same. Unless you’re reading this on JWICS (hi there), we’re all on the same internet. If we had spent the past decade securing that network instead of weakening it, we might not be as susceptible to “cyber threats” as we are today. And that day is coming anyway, because business can’t allow the current environment to continue, and we cannot continue to exist without commerce. Networks will be fully encrypted, executable software will be validated, and, eventually, mass surveillance will be impossible. We will be back to the point where physical security is the weakest link. Hopefully by then we’ll be ready to start leading by example again.

Allen September 15, 2015 4:25 PM

“Furthermore, it’s not enough to publish vulnerabilities, we need to patch them, and that’s by far the most difficult part. Even well known, very publicized bugs end up taking years to be properly mitigated across the industry, and it won’t be any faster for more obscure ones.”

It’s exactly the opposite, actually. Discovering vulnerabilities is far more labor intensive than patching them. Most bugs can be patched within minutes once they’ve been identified. And improving the distribution time of those patches is always possible; it’s not an insurmountable obstacle.

For hardware, it’s more difficult, and more expensive, but not impossible. Vehicle recalls happen all the time.

Speaking of vehicles, should we keep, say, a high rate of brake failure a secret because we know Al Zarqawi has those brakes on his car? That’s the dilemma. If you can call it that. Every time we knowingly leave something broken, we’re asking for problems, because it’s just as likely to bite us in the ass, if it hasn’t already.

albert September 15, 2015 5:11 PM

“…The job of any needed offense should be transferred to military departments who should maintain their own IC as part of the job….”
Yes. The US military is very good at what they do. Their leadership seems intelligent, and they are capable of deep thinking on military matters. Unfortunately, their psychotic civilian bosses always succeed in making them look incompetent.
Yeah, we’ll see what Anne Neuberger can do. She’s got an MBA and a master’s in “International Relations.” It doesn’t seem like they need another bureaucrat, does it? Organizationally, she’ll just be another member of Alexander’s Ragtime Band.
Well…I guess I’m done wading through it for today. Start with fresh boots tomorrow.
. .. . .. o

Rob Ot September 15, 2015 6:04 PM

@Bruce Schneier:

In my book Data and Goliath, I suggested breaking apart the NSA’s offensive and defensive components, in part to resolve the agency’s internal conflict between attack and defense. One part would be focused on foreign espionage, and another on cyberdefense.

Actually, they shouldn’t be completely broken apart. They need to stay under the same umbrella, because otherwise their conflicting interests will lead to both consuming more resources while in effect working against each other.

A conceptual internal conflict is much easier to manage than a real one. As long as they do not work too hard at either attack or defense (or, more correctly, choose carefully where they put their efforts in each), the internal conflict remains merely conceptual.

But a house divided against itself cannot stand (Mark 3:25), and in the current situation some NSA might still be preferable to a crippled NSA.

TascoBlossom September 15, 2015 8:07 PM

From #6:

“In a May article in The Atlantic,[BS] Bruce Schneier asked a cogent first-principles question: Are vulnerabilities in software dense or sparse? If they are sparse, then every one you find and fix meaningfully lowers the number of avenues of attack that are extant. If they are dense, then finding and fixing one more is essentially irrelevant to security and a waste of the resources spent finding it. Six-take-away-one is a 15% improvement. Six-thousand-take-away-one has no detectable value.”

I can tell you that just a couple of years ago, it was thought vulnerabilities were sparse. Go back and read essays, articles, blogs, etc. Invariably, when an attack was publicized, the reaction was “they should have done such and such,” as if, had they not made that one little-bitty mistake, it would not have happened. Users were all stupid, and if they just used open source, or got rid of Windows and went all Linux, or got rid of certain routers, or just used this or that encryption and made sure passwords were longer… on and on. It was as if there were really just a few problems, and if you could only avoid those, then attacks could not happen.

IanLB September 15, 2015 9:36 PM


It’s exactly the opposite, actually. Discovering vulnerabilities is far more labor intensive than patching them. Most bugs can be patched within minutes once they’ve been identified. And improving the distribution time of those patches is always possible; it’s not an insurmountable obstacle.

Patching a single machine is simpler, sure. Patching large environments, companies, entire industries, and countries is what the people on the defensive side of the equation are actually asked to do. You say “most bugs can be patched within minutes,” but patches need to be tested, distributed, and sometimes applied on live systems. That’s not trivial. And in the end, that’s much more difficult and expensive (in terms of manpower and time) than the vulnerability/exploitation part.

This is especially significant since “success” here is basically 100% mitigation of the vulnerability (of all vulnerabilities, in fact). A hacker, in theory, only needs to find one flaw and exploit it. The defender needs to correct and mitigate every single one of them. If he can’t, he’ll be called incompetent.

In terms of resources consumed, hacking is cheap compared to defense. That’s why so many states and criminals have been able to jump into the game, with ever-growing success, while pretty much the entire IT industry has been struggling to defend, and still is. And much more money is poured into the latter.

Sahlberg September 15, 2015 9:44 PM

If they truly disclose the majority of the vulnerabilities they find, it should be fairly easy to publish a list of them.

So, where is the list of vulnerabilities they have reported and had vendors fix?
CVE numbers please.

Allen September 16, 2015 12:03 AM


Ok, granted, from a total manpower perspective, it takes relatively few people working on something for a relatively short amount of time to exceed, say, one man-year. It can cost more man hours to patch something than it does to exploit it. I concede that.

Nonetheless, it has to be done. It’s upkeep. It’s overhead. Whether the vulnerabilities are discovered by hackers, by software authors, or by the NSA, they must be patched at some point in order to have a secure system. Discarding one potential source of vulnerabilities doesn’t make the maintenance costs go away.

Yes, it’s cheap to hack, and yes, there’s the “x must be right all the time, y need only be right once,” mantra that’s applied to various scenarios, whether it’s hacking, or law enforcement, or what have you. But we don’t need to be bulletproof, actually. The perfect is the enemy of the good. We just need to be good enough. And right now, we’re not that. But with more focus on building secure infrastructure, we could be.

Brandioch Conner September 16, 2015 12:45 AM


But we don’t need to be bulletproof, actually. The perfect is the enemy of the good. We just need to be good enough.

I agree. And I wonder why we keep fixating on the EASIEST option?

Fixing vulnerabilities is not going to kill electronic espionage (saying “cyber” is stupid). It might make the NSA’s job more difficult. But difficult is not the same as impossible.

And fixing vulnerabilities will also make the criminals’ jobs more difficult.

ianf September 16, 2015 1:02 AM

What a world we live in. A state actor, an agency tasked with—among other things—protecting its own citizenry by thwarting cyber attacks against groups and individuals, instead interprets its mission as a license to hijack those methods for use in “preventive” current & future spying on said citizens. And the elected representatives nominally responsible for overseeing such activities collude in their charges’ thus-perverted “security narratives.” The spying produces ever-growing “aircraft hangars full of stacks of data”[*] that are impossible to sift through for any particular “needle.” The agency in question considers it MISSION ACCOMPLISHED, then gen. Comey gets a raise & another stripe-tattoo up his ass.

[*] actual quote from “American Terrorist,” the story of the NSA & GCHQ’s failure to detect the plain-text terror-planning activities of one Daood Gilani, a.k.a. David Coleman Headley (written, produced & directed by Thomas Jennings for WGBH Boston/Frontline, 2014–2015).

65535 September 16, 2015 3:07 AM

“Many, myself included, have long argued that defense is more important than offense, and that we should patch almost every vulnerability we find.” –Bruce S.

I agree. Not patching gives certain agencies powers of unbridled spying, in circular-firing-squad fashion [who will spy on the spies?]. This circular firing squad is a loose cannon on the deck and will be a point of contention among agencies and Five Eyes allies.

Rampant spying leads to rampant intellectual property theft and gives those in high political positions incentives to manipulate their opponents. This could end very badly.

The “collect it all” behavior, including zero-day exploits, also creates a one-way mirror for those in government to gaze at the people they deem “enemies” and crush them. The little guy cannot see past the one-way mirror. Worse, the little guy has to pay for it.

Further, the stockpiling of zero-day exploits by the NSA and the US Navy is counterproductive and invites an arms race that we may not be able to win.

For example, it may be easier for China [PRC] to build back doors into hardware that Americans buy. These backdoors could be more potent than the NSA believes and more destructive [in operational effectiveness]. Machine/BIOS/chip rootkits work better than software-implanted ones. China may have a huge advantage in that respect. It may take years to reverse that balance.

Lastly, starting an arms race also starts an “easy money” style of rootkit industrial complex that will grow to huge proportions and end up consuming productive resources, only to fight another war. This is destructive and wasteful. Many innocent people will be sucked up in the carnage.

“I suggested breaking apart the NSA’s offensive and defensive components, in part to resolve the agency’s internal conflict between attack and defense. One part would be focused on foreign espionage, and another on cyberdefense…” – Bruce S.

I am beginning to see this as the most realistic solution. But with any solution there has to be oversight by the people, not just secret courts issuing secret rulings. Real oversight!

Misplaced Priorities Loyalties and Incompetence in Govt September 16, 2015 3:46 AM

“This vulnerability equities process can be a perfect assignment for the ‘Chief Risk Officer’ – Anne Neuberger, who was recruited last year by Adm. Rogers for the newly created position. Ms. Neuberger, a 39-year-old from an ultra-orthodox house, served before that at the Pentagon under Gates and in the core team of the US Cyber Command as a project facilitator…”

From Snowden, it is a proven fact that the NSA gives vast amounts of raw American-citizen communications to Israel and the other Five Eyes, because it’s illegal for the NSA to process them itself.
What are her rulings in this area? Did she ever study the US Constitution?

Her appointment is worrying, as the ultra-orthodox owe their first allegiance to religion, with America’s security a distant third!
I hope the NSA is giving polygraphs consistently to everyone without high-level protection. Are there spies within the NSA or CIA? Certainly, but we NEVER hear about it!

It’s remarkable that every top-secret clearance investigation was stolen. Why wasn’t this OPM treasure chest of data identified as a primary security risk by the NSA? Unforgivable!
Instead, hard-hitting NPR talks about what a great mother she is.

I can 100% predict that the health-care records of every American will be stolen. What is the NSA doing to prevent this completely predictable vast crime from occurring? Health care providers also collect Drivers License and Social Security numbers. So much for HIPAA protection. Thanks NSA for being a good mother!

Smirk September 16, 2015 5:17 AM

What I missed was a fourth possibility: that the NSA finds vulnerabilities (in Hacking Team or otherwise) and patches them without disclosure, or leaves them unpatched wherever someone could detect that a patch had been made.

So defending their own/most important systems while keeping holes open or turning holes into honeypots.

It doesn’t seem too far-fetched, does it?

albert September 16, 2015 12:22 PM

@Misplaced Priorities Loyalties and Incompetence in Govt,

It seems obvious that knowledge of the Constitution isn’t a priority for NSA employees. Ya don’t want yer worker bees gettin’ any uppity ideas…like that traitor, Snowden.

Neuberger is a bureaucrat with no apparent background in cyber-anything. It’s ‘security’ theater, window dressing, BS. Maybe Israel needed some warm and fuzzy from US. Unless she’s Mossad, she’ll have zero understanding of anything she sees at the NSA. AFAIK, NSA and Mossad are sleeping in the same bed, so what’s the fuss? It’s non-news.

You know that NPR is a card-carrying member of the MSM, right?

. .. . .. oh

r September 16, 2015 1:09 PM

I don’t think they patch vulnerabilities; there wouldn’t be an extra version of Flash per intelligence agency.
The same goes for the Internet Explorer vulnerability.
At most there are private IDS/NIDS signatures, with very few or no examples outside their own walls, because a) people do binary-comparison reverse engineering of patches, and b) signatures for these existing on government networks would alert others to the exposure or existence of such an exploit.
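The patch-diffing point in a) is worth spelling out: once a binary is silently patched, anyone holding the pre- and post-patch versions can compare them to locate the fixed code and work backward to the vulnerability. Here is a toy Python sketch of the byte-level comparison this starts from (the byte strings are made up for illustration; real tools such as BinDiff compare at the function and control-flow-graph level, not raw bytes):

```python
def changed_offsets(old: bytes, new: bytes) -> list[int]:
    """Return the byte offsets where two same-length binaries differ.

    Toy patch diffing: comparing a pre-patch binary against the
    patched one points straight at the modified code, which is
    where an attacker starts reconstructing the vulnerability.
    """
    return [i for i, (a, b) in enumerate(zip(old, new)) if a != b]

# Hypothetical bytes standing in for a function prologue before and
# after a silent patch; only the stack-adjustment byte has changed.
pre_patch = b"\x55\x89\xe5\x83\xec\x10"
post_patch = b"\x55\x89\xe5\x83\xec\x20"
print(changed_offsets(pre_patch, post_patch))  # → [5]
```

This is why quietly patching only your own systems, as Smirk suggests above, still leaks information: the diff itself advertises what was fixed.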

Whose job is it to protect the State Department, Senate, Congress, president, and the DoD from their own stupidity? Hillary was running an off-site email server; I really don’t see them practicing ANY real security outside of maybe in-transit streams over the non-public military network.

rgaff September 16, 2015 7:05 PM

@ Smirk

So you’re suggesting the NSA has their own “secure” fork of every software/hardware they ever use anywhere in our government?

Smirk September 17, 2015 9:15 AM


No, but I do think they decide, with each vulnerability they find, how many of their own assets are vulnerable; and when they patch, they wait a little before informing the rest so they can have a couple of weeks more fun.

Jaymie Vischer September 17, 2015 10:34 AM

As well as what has already been mentioned, the relation between public budgets and private businesses, which has become increasingly commonplace in the security sector, has a lot to answer for these conflicts of interest. There are individuals in prominent positions within TLAs who control extremely generous budgets and have developed mutually beneficial agreements with the private sector (consultancies, service providers, manufacturers, etc.). These individuals and their contacts have made a lot of money as a result of the weaponization of the internet. It is very much in their interest to keep the market “healthy” (by which I mean turbulent).

rgaff September 17, 2015 4:35 PM

@ Smirk

So you’re saying they purposefully and knowingly leave THEMSELVES open to attack for “a while” (maybe even a year or two sometimes)… Sounds really smart to me… (sarcasm)

fajita September 18, 2015 12:29 AM

How about certified ethical bloggers?

I’d be careful not to make fun of other people’s acronyms. Especially around Bruce…

Sasparilla September 23, 2015 1:44 PM

Excellent article Bruce.

Quite frankly, I think the major world governments have (intentionally or not) settled into embracing the “let things be vulnerable to us” choice, so that our govt (and theirs) can see whatever we (& they) want, whenever we (& they) want; but of course the citizenry doesn’t benefit from this. It divides the world’s population not along the lines of states but along the lines of the rulers/monitors and the monitored. As time goes on, this environment continues to be strengthened and cemented into position.

It’s absolutely stunning the lack of Democracies in the world that have stood up to prevent wholesale data monitoring of their and other’s general populations.

Preventing the aggressive closing of zero-days (and implanting new ones in partnership with major vendors) is essential for this environment to continue.

Dirk Praet September 8, 2016 3:54 AM

@ antipsychiatry

How can I bring these bastards to justice????

Perhaps you can start by taking your anxiety medicine.

@ Moderator

Please add me to the list of people who are seriously fed up with the mind-numbing levels of trolling here.
