How to Stop an Insider from Stealing All Your Secrets

This article from Communications of the ACM outlines some of the security measures the NSA could, and should, have had in place to stop someone like Snowden. Mostly obvious stuff, although I’m not sure it would have been effective against such a skilled and tenacious leaker. What’s missing is the one thing that would have worked: have fewer secrets.

Posted on May 16, 2014 at 12:34 PM • 36 Comments

Comments

Daniel May 16, 2014 1:07 PM

“What’s missing is the one thing that would have worked: have fewer secrets.”

Don’t be a party pooper Bruce. No one likes a party pooper. Secrets are FUN!

Anura May 16, 2014 1:19 PM

The problem with having fewer secrets is that the more the voting public knows, the more informed their decisions can be. Keeping the public in the dark about as much as possible is the best way to keep power.

QM May 16, 2014 2:49 PM

The points in the article are laudable, but seem just so much “I told you so” in the end.

The reason the NSA, and many other critical operators, don’t do this well is that it requires extra layers of administration. Besides the normal sysadmins, you need secure, compartmentalized sysadmins, none of whom are authorized for, knowledgeable about, or even aware of most of the rest of the security measures.

And every layer and compartment adds an order of magnitude to the time and effort required just to get things done. Imagine having a top-secret file system die — it’s probably 10 times harder to recover from that than it was to set it up in the first place. Hardware failures in a large organization happen daily. And software and people get it wrong too. Did your drive fail? OK, put it in this secure carrier, you can have it back in 6-8 weeks. Why so long? It has to travel to 4 other sites and be poked at by 12 different people, most of whom don’t know each other, or what the others do. So cutting corners will be the norm.

At first blush one might think the NSA is little more than script kiddies with no oversight, who dream large. But they have the same human foibles as the rest of us, just with higher consequences for getting things wrong, and they should be held to higher standards (which so far hasn’t happened).

David May 16, 2014 3:03 PM

Interesting that there’s also no mention of SELinux (developed, amusingly, by the NSA). My understanding is that mandatory access control (along with signed everything) is significantly intended to (help) address this kind of thing.
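
A minimal sketch of the mandatory access control idea (a toy Bell-LaPadula-style check in Python, not SELinux’s actual policy language; the levels, compartments, and labels are illustrative assumptions):

```python
# "No read up" plus need-to-know: a subject may read an object only if its
# clearance level dominates the object's classification AND it holds every
# compartment the object is tagged with. Labels here are made up.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def can_read(clearance, compartments, label, obj_compartments):
    return (LEVELS[clearance] >= LEVELS[label]
            and obj_compartments <= compartments)

# A sysadmin cleared SECRET with no compartments cannot read a
# TOP SECRET // SI document, regardless of discretionary permissions.
print(can_read("SECRET", set(), "TOP SECRET", {"SI"}))        # False
print(can_read("TOP SECRET", {"SI"}, "TOP SECRET", {"SI"}))   # True
```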

Andrew Wallace May 16, 2014 3:38 PM

It is not how to stop an insider stealing your secrets.

It is how to stop your employees falling victim to people targeting them. People think of an insider as a rogue employee knowingly leaking secrets.

What is a lot more common is employees leaking secrets without knowing. Often on a Friday night after work, employees can be followed to a bar.

While the employee is intoxicated, a USB stick can be tampered with or swapped.

Or an employee takes someone home for a one-night stand and laptops, USB sticks, and other devices are left vulnerable.

This is what is thought to have happened in the lead up to Stuxnet.

Nick P May 16, 2014 3:57 PM

I think the key is that they’re just not being disciplined. There are technologies and approaches that greatly reduce their risks. There are also ways to effectively take the admin out of a trusted status, where you shift trust to people who build the black boxes & have no idea how they’ll be used. There are a few ways to do much system administration while having no access to the data on the system. I also like that the article mentions Orange Book’s high security requirements might have dealt with the issue. Seeing as the NSA was the prime evaluator during the Orange Book era, this just goes to show that they understand what works and simply don’t apply it.
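
One rough sketch of that idea is keeping data encrypted with keys the admin never holds. This uses the Python cryptography library’s Fernet interface; the key-custodian split is an assumption for illustration, not a description of how any agency actually does it:

```python
from cryptography.fernet import Fernet

# Key generated and held by a separate key-custodian role;
# the storage admin only ever handles ciphertext.
key = Fernet.generate_key()
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"compartmented report text")

# The admin can back up, migrate, or restore `ciphertext` freely,
# but cannot recover the plaintext without the custodian's key.
assert cipher.decrypt(ciphertext) == b"compartmented report text"
```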

NSA can have much more security than they do right now without a huge amount of disruption of operations. They even helped invent various methods and standards for doing it. I’ve given suggestions here and here. Yet, they still choose to use risky tech, not apply well-known methods of reducing risks, and put an unnecessarily high number of people in positions of trust. If anything, they’re just incompetent.

And worse, they’re one of the few organizations with the resources and legal secrecy protections that could create a secure administration capability. And they do less than the commercial sector. What losers…

Nick P May 16, 2014 3:59 PM

@ Andrew Wallace

That’s an especially good point in this situation because it’s exactly what Snowden did. The problem wasn’t just that he was malicious. The problem was that he was malicious and could easily make unwitting accomplices out of his coworkers. Had this not been the case, he might have gotten away with fewer documents or been caught in the process.

Spooky May 16, 2014 4:11 PM

Point well-taken, Bruce, though your comment today is definitely a bit on the cheeky side. Fewer secrets? You cannot operate an intelligence directorate without retaining compartmented information, i.e. secrets. Operating on a global scale necessitates keeping an enormous amount of compartmented information in systems that are tightly coupled, efficiently providing operational data to leaders and analysts so they can make effective use of intelligence products (and any windows of opportunity afforded by them). How would it benefit us, exactly, to store less–and consequently, know less–about our adversaries? Or alternatively, erect such bureaucratic hurdles that our gathered intel is less accessible, less timely, and ultimately less useful? The need for operational efficiency in these matters should be weighed very, very carefully. We have an intelligence apparatus that works quite well (arguably, too well) and if we deliberately break it, we get to keep both pieces. At the very least, we risk being at a competitive disadvantage with our neighbors–at worst, we’ll be flying totally and completely blind, with no basis for understanding ground truth. And the consequences of THAT are far less predictable than you might imagine…

Bob S. May 16, 2014 6:39 PM

Gen. Alexander had a slightly different twist on upgrading security…late last summer he fired 90% of their sysadmins, sort of a mega-decimation.

(In Roman times only one in ten disgraced troops lost their head, the general reversed that ratio.)

I would imagine that IF all the steps in the referenced article were taken, it would slow down the NSA quite a bit.

Goldry Bluszco May 17, 2014 2:26 AM

@Spooky

Actually, your point about “compartmentalized information” is precisely the point Bruce was making – which is related to the “need to know” point Bob Toxen was making.

It’s all about the effectiveness of said “information”. If everything conceivably possible is a “secret”, which seems to be in vogue these days, then nothing ends up secret, because it must also be accessed, and the process of accessing it is fraught with “many a slip twixt cup and lip”.

Whereas, if there is a strict “need to classify” applied to the “information” so that only what must be classified gets classified, then there is much greater inhibition against allowing “sidechannel” access to it just to permit more timely use of it.

Or as someone said once – I forget who – “If everything tastes like chicken, what does chicken itself taste like?”

Gweihir May 17, 2014 5:35 AM

There are other inaccuracies and mistakes as well. The whole thing is more than a bit naive. I would go so far as to call it pretty incompetent and unaware of reality. This list is a re-hash of all the simple stuff that helps (maybe sometimes) against incompetent attackers. Snowden does not fall into that class.

  • They claim “wanding” would have found a USB stick. They seem to be unaware of how little metal, all of it non-magnetic, some USB sticks contain.
  • “Rings of Security” does not even apply to the question at hand.
  • “Islands of Security”, also called a “Zone Concept”, does not apply. Snowden had legitimate access.
  • “Physical Security” does not apply; AFAIK, Snowden did not physically access servers.
  • “Prevent Unauthorized Copying” sounds nice in theory, but is typically unworkable in practice. Sure, if nobody is ever allowed to take any equipment in or out, that would have helped, but it would also have killed productivity. And as the NSA needed contractors because of a staff shortage, this does not apply to the Snowden case either.
  • “Two-Factor Authentication” does not deal with this issue at all, but with the question of access rights according to privilege level. They claim it would have been “trivial” to ensure Snowden did not access documents above his clearance. They are wrong. The sysadmin must be trusted: as soon as a system the admin administers has access, the admin has access.
  • “Two Person Authentication”, yes, sounds nice in theory, but I know of instances where it has been subverted in practice, up to a full remote backdoor being placed. And that was directly under the eyes of the second person, who did not notice anything; it was not even malicious, just done to make work easier.
  • “Log Events and Monitor” is another naive paragraph. Sure, anomalous-access and behavior detection works to a degree, but only if it does not disrupt normal business. At the NSA, I expect that analysts do extensive searches all the time and in unpredictable patterns, so somebody doing this at low intensity will never be noticed (a toy version of such monitoring is sketched after this list).
  • “No Internet Access or Homework Whatsoever”: that one works, but only if the resources are there to compensate for the significant loss in productivity. They likely were _not_ there; see above under “Prevent Unauthorized Copying”.
  • “Prevent Removable Media from Leaving the Building”: They claim this is simple. It is not; it is basically impossible.
  • “Creatively Use Encryption” This one is complete BS. It makes everything a lot more complex and thereby vulnerable. It prevents people from being able to work.
  • “Plan for Break-in to Minimize Damage”: This is one of the few items of any value on the list. “Have fewer secrets” should be under this heading.
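
To make the monitoring point concrete, here is a deliberately naive sketch of the kind of volume-based anomaly check being discussed; the log format and threshold are invented for illustration, and as argued above, an insider who stays below the baseline never trips it:

```python
from collections import Counter

# Hypothetical audit trail: (user, document_id) pairs.
access_log = [
    ("analyst1", "doc-001"), ("analyst1", "doc-002"),
    ("analyst2", "doc-001"),
] + [("sysadmin7", f"doc-{i:03d}") for i in range(500)]  # bulk pull

def flag_heavy_readers(log, threshold=100):
    """Flag users whose document pulls exceed a fixed threshold.
    A low-and-slow exfiltrator simply stays below it."""
    counts = Counter(user for user, _ in log)
    return [user for user, n in counts.items() if n > threshold]

print(flag_heavy_readers(access_log))  # ['sysadmin7']
```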

uh, Mike May 17, 2014 7:44 AM

Say you have a control, like a virus scanner. Somebody has to watch it, even while there are no incidents. Especially while there are no incidents.

That’s expensive, and when done right, there’s nothing to show for it. Lack of trouble draws precious little attention.

Paul May 17, 2014 9:13 AM

Wanding would pick up a USB drive? I get wanded all the time at airports and they fail to pick up the six-inch surgical steel plate in my forearm (which, by the way, would make a better blade than a box-cutter; so much for the drama of airport security). A USB to microSD adapter is $1.99. A 64GB microSD card contains less metal than the fillings in your teeth.

Now bear in mind that all those clever techniques the NSA uses to gain access to systems work in reverse. Imagine a device that sits in a router and uses ultra-wideband over the power line as a second network channel. Now bribe a Cisco employee to install that in a router headed for the Bluffdale facility. Or in several.

Really, the NSA has missed a trick with their emphasis on spying and collect-it-all. They would be far more useful devoting their considerable resources to closing vulnerabilities rather than exploiting them.

Wael May 17, 2014 2:11 PM

Title of thread: How to Stop an Insider from Stealing All Your Secrets
Recommended solution:

What’s missing is the one thing that would have worked: have fewer secrets.

Having fewer secrets will not stop an insider from stealing all your secrets; it’ll reduce the secret-stealing attack surface for two main reasons:
1) Fewer secrets to steal (but “all secrets” is still independent of the number of secrets; the exceptions are zero secrets and infinitely many secrets)
2) The fewer secrets there are, the more manageable they are; I guess that’s what you mean.

Nick P May 17, 2014 4:48 PM

@ Wael

Perhaps researchers need to look into applying the TCB concept to secrecy. In computers, the TCB must be minimal, well-designed, and well-managed to keep safe. In secrecy, leaks are prevented by reducing secrets, developing technical ways to protect them, and having policies/personnel manage them properly. So, not the same but very similar. Could be worth some cross-disciplinary R&D.

Clive Robinson May 17, 2014 5:35 PM

@Nick P, Wael,

How about looking at how TEMPEST proof crypto equipment is designed?

We have discussed the process here several times before and it takes very little to see how to apply the electronic process to human workflow processes.

Wael May 17, 2014 6:37 PM

@ Nick P,

Perhaps researchers need to look into applying the TCB concept to secrecy…

I’ll need to think about that a bit. I won’t forget 😉

Wael May 17, 2014 6:44 PM

@Clive Robinson, @Nick P,

How about looking at how TEMPEST proof…

Well, I can see shielding as one category that can be applied. Another area could be compartmentalization. Back to Nick P’s TCB thing… Principle of least privilege is one common area… Separation / segregation of roles / duties. We’ll have to go back to “principles” to port “TCB” or “TEMPEST” type designs to humans. The same principle applies to both domains; how the principle is implemented will vary from machine to human.

65535 May 18, 2014 1:24 AM

I did a quick search of this blog and found a lot of comments regarding half-hearted security measures at the NSA/GCHQ and “plausible dependability”. I sense many forms of security and logging are built into the systems but not used – for a reason. Bad actors could be using it to increase funding by digging up dirt on government officials in key positions and so on.

Here is one comment:

Bogwitch • August 27, 2013 7:13 AM

“Having worked on (UK) Intelligence systems, I can attest to the fact that auditing is NOT considered and often is not desired. Plausible dependability.” -David

https://www.schneier.com/blog/archives/2013/08/detaining_david.html#c1647862

That is groupthink. It starts at the top. The NSA and other TLOs routinely engage in plausible dependability and parallel construction.

The NSA must have its budget slashed in half, and then it should be broken into manageable, accountable, and auditable departments. With the NSA’s current budget, power, and size that is not possible. No more “least truthful” lies to Congress!

moz May 18, 2014 3:21 AM

@65535;

I think you are looking for “plausible deniability” not “plausible dependability”. A bit weird given that you seem to have cut and paste the quote. I guess your spelling checker is out for revenge.

Wael May 18, 2014 4:14 AM

@ Nick P,

Perhaps researchers need to look into applying the TCB concept to secrecy. In computers, the TCB must be minimal, well-designed, and well-managed to keep safe. In secrecy, leaks are prevented by reducing secrets, developing technical ways to protect them, and having policies/personnel manage them properly. So, not the same but very similar. Could be worth some cross-disciplinary R&D.

In addition, the TCB typically defines trust boundaries; in the human world this is achieved by various groups with a protocol of engagement, a “boundary” and an “interface”. Also, the TCB needs to be trusted in and of itself. Humans are not always predictable and, unlike software, one cannot perform “formal testing” on them; they cannot be trusted — trust me. As Spock once said: “No one can guarantee the actions of another” – I get to correct the first time I used this quote, now that I have found the reference. TCB concepts came from principles, and the principles came from experience. One way to look at it is principles as “strategy” and a manifestation of a principle as a “tactic”. Just like in chess, one example of a “strategy” is to control an open file or a half-open file with a rook. An example of a tactic would be a “pin”. The difference is that in strategy, we know it’s “good” to control an open file with a rook in the long term. There are no calculations required – just put the rook there; experience has shown it to be a good thing to do. Tactics, on the other hand, are comparatively short term and use precise calculations to forecast the exact set of outcomes. So, for your idea, you explicitly proposed to go from principles to TCB implementations. I believe it’s also implied that researchers should go from TCB to their governing principles then applying these principles to protecting secrets. Fascinating area of research which will need several disciplines as you noted. I think “shrinks” would be an instrumental part of the team…

Wael May 18, 2014 4:52 AM

I was still thinking about this:

What’s missing is the one thing that would have worked: have fewer secrets.

Part of me says that sounds rational. The irrational part of me says:
Oh, yeah? Suppose you have 1000 secrets and an insider steals 500 of them. Then not all the secrets were stolen. Suppose you have 1000 secrets, and an insider stole one of them. That’s still not all. Now suppose you reduced your secrets to only one secret, and an insider stole it. Then all your secrets got stolen, because you reduced your secrets. Sounds like a counter-example to me… Then the rational part comes back and says: there must be a correlation between the number of secrets and the strength of the controls that protect them, effectively producing a proportionality constant in the linear approximation that multiplies the difficulty of protecting a secret by the number of secrets. Thus, stealing one secret from a well protected set of {one secret} is more than a 1000 times more difficult than stealing one secret from a set of a {1000 secrets}. Then the irrational part of me says the probability of… Oh, never mind.

Wael May 18, 2014 5:02 AM

Correction:
Thus, stealing one secret from a well protected set of {one secret} is more than a 1000 times more difficult than stealing one secret from a set of a {1000 secrets}.
Should be:

Thus, stealing one secret from a well protected set of {one secret} is more than 1000 times more difficult than stealing 1000 secrets from a set of well protected {1000 secrets}.

Too early in the morning…
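
One way to make that correction precise, as a rough linear model (assuming a fixed total protection effort E spread evenly over the n secrets):

```latex
\[
  d(n) \approx \frac{E}{n}
  \quad\Longrightarrow\quad
  d(1) \approx n \, d(n),
\]
```

where d(n) is the difficulty of stealing one secret from a set of n; for n = 1000 this gives the “roughly 1000 times more difficult” figure, up to whatever the linear approximation hides.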

Clive Robinson May 18, 2014 5:48 AM

@Wael,

The way to look at the problem is,

1) Secrets are stored in a repository,
2) Access to the repository is proportional to the number of secrets.
3) The greater the access the less secure the repository is.

That is, the less access there is, the easier it is to check and control each access. Beyond a certain number of accesses in a given time period, it becomes harder to check and control each one.

We have touched on this in the past when talking about terrorists in airport check-in queues with limited numbers of Rapiscan body scanners, which take three or four times longer than metal detectors to check an individual. Eventually the pressure of numbers waiting to be checked stops any kind of random behaviour by the checkers, and a terrorist with a little forethought can significantly stack the odds of going through a metal detector rather than a Rapiscan.

There is a bunch of mathematics behind this, which comes from, amongst other things, queueing theory and game theory.
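
A toy Monte Carlo of that checkpoint effect (the arrival rates, diversion target, and scanner capacity are invented; the point is only that queue pressure collapses random secondary screening, so a patient adversary mostly gets the fast lane):

```python
import random

def scanner_fraction(arrivals_per_min, scanner_per_min=2,
                     target=0.3, travellers=10_000):
    """Fraction of travellers actually sent to the slow body scanner when
    checkers stop diverting people once its queue has saturated."""
    sent, backlog = 0, 0.0
    for _ in range(travellers):
        if random.random() < target and backlog < scanner_per_min:
            sent += 1
            backlog += 1.0
        # The scanner drains at a fixed rate between arrivals.
        backlog = max(0.0, backlog - scanner_per_min / arrivals_per_min)
    return sent / travellers

print(scanner_fraction(arrivals_per_min=5))   # light load: near the 30% target
print(scanner_fraction(arrivals_per_min=50))  # heavy load: far below it
```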

Nick P May 18, 2014 11:38 AM

@ Wael

re picking trustworthy people for high-secrecy organizations

Absolutely correct in that, like the TCB, the people must be trusted [to some degree]. The problem is this: “they cannot be trusted – trust me.” I know people who are more predictable than the apps on my computer. I’ve put trust in others at great risk to myself many times and only regret a few choices. Even those haven’t caused me harm (so far). It’s more a case of I’d rather not have taken on the exposure. There are also many organizations full of people doing all sorts of things that could be damaging and with few to no leaks in their history. Their people, in practice, were trustworthy [enough].

That’s the thing about people and trust. Even Bruce acknowledges in his book that all of us trust many strangers and it often works out in practice. For a high-secrecy organization, the trick is to pick people who act on their principles and by nature are very loyal to a group. I’ve known many such people who would never betray their group. If they gossip, at worst they do it within the group, so there’s still some containment there. These qualities aren’t easy to assess: it takes time and you really need to get to know the person.

The best organizations are typically homogeneous, in that this makes their members easier to understand. The environment is desirable, those in charge are effective managers/leaders, compensation is good, and an “insiders are elite” mentality is fostered. In other words, it’s a good place to work. 😉 Such organizations will have scouts in areas with high potential for the right candidates. The scouts blend in, get to know people, identify potential talent, and filter out risky types. They might even do little tests on potential candidates to assess trustworthiness. Those who make it are only gradually accepted into the “inside” and given privileges.

A good example of this kind of thing is depicted in The Good Shepherd in how the early CIA chose its people. The culture, at least. They were missing an element or two of my requirements, which led to a number of failures. Plus, they were targeted by some of the best infiltrators. That’s always the beginning of problems. 😉 Cults like Scientology are also exemplary in producing the culture I’m describing. Steve Jobs used such cult-type techniques and internal controls at Apple to achieve an excellent track record of secrecy/loyalty.

Not to mention all the scheming at Goldman Sachs, Exxon Mobil, etc. So many companies with dirt or plans worth huge money. Vastly fewer leaks or failures than one would expect. The implication is they handle the personnel aspect well enough to keep defectors a minimal risk. Not sure of the methods in each company, but picking and paying well, with an NDA + internal controls, is the minimum.

“So, for your idea, you explicitly proposed to go from principles to TCB implementations. I believe it’s also implied that researchers should go from TCB to their governing principles then applying these principles to protecting secrets. ”

Yes although we can’t stretch the metaphor too much. 😉

“Fascinating area of research which will need several disciplines as you noted. I think “shrinks” would be an instrumental part of the team…”

Definitely. Even Bruce shifted to psychology and economic research to help understand security better. So, this field will definitely benefit from both.

@ Wael and Clive

effect of number of secrets

Don’t you guys forget what MLS taught us. Clive is close with his repository idea. The number of secrets any person has access to can be restricted, monitored, etc. This takes tool support (thin clients, CMW OS’s, etc). There should also be policies and procedures to support this. I’m not talking about the kind people ignore, either. Like in military and intelligence, the culture has to be one where people have responsibility to manage secrets correctly & the policies/tools are designed to make this easier.

This leads me to another consideration: secrets as a privilege and a burden. Anyone joining an organization with plenty of secrecy is accepting a burden. They will have to put in more effort to do every single thing. So, they should definitely be more motivated to accomplish their goals (including secrecy) than the average worker. As I think on it, I also think that telling them about the burden upfront might help in fostering the rewarding “elite” feeling, especially if they successfully perform their duties. Additionally, the policies and tools can be seen as psychologically beneficial in that they greatly reduce that burden by making certain decisions for you. If this concept is embedded into employee indoctrination, they might be less likely to bypass tools or policies while feeling more satisfied managing confidential data than if they had no access.

Just a thought…

Wael May 18, 2014 12:57 PM

@ Nick P,

those in charge are effective managers/leaders, compensation is good, and an “insiders are elite” mentality is fostered. In other words, it’s a good place to work. 😉 Such organizations will have scouts in areas with high potential for the right candidates. The scouts blend in, get to know people, identify potential…

“Insiders are elite” may work for people with trivial minds. It won’t work for Richard Feynman.

Nick P May 18, 2014 2:02 PM

@ Wael

I knew you’d jump on that haha. Sounds like he’s got some emotional baggage about the issue. He might have been shoved into lockers and such by elitist kids. 😉 Remember that I’m talking about an elite attitude with a strong focus on responsibility and results. He’s talking about one focused on “honors,” which I agree have no inherent value. Yet, so many people (even smart ones) compete for honors that I disagree with Feynman that they’re meaningless. Advanced degrees, video game “achievements,” having your name at the top of the salesperson list, being the CEO, being invited to meet the President, etc. There are entire fields of study and industry that try to motivate people with honors. Bright as Feynman is, he’s wrong about honors except for their effect on him and those like him who don’t care for them.

In his case, he appreciates solving a problem and seeing other people use his work. The latter is an honor for him. So, he does like at least one, and probably for a similar psychological reason that others like being called “Doctor.” In any case, the use of a form of elitism in my scheme is certainly controversial and worthy of more investigation. It’s just that every organization that successfully kept many secrets for any length of time had elitist tendencies which were stronger than most. Maybe there’s a counterexample I’m overlooking where a democratic, very diverse, and distrusting group worked effectively while keeping their activities & plans secret. I’ve just never heard of it.

So, I go with what worked before [which includes some elitism] and ask “how can this be done? And done better? And done consistently? And done longer?”

Unrelated note: Thanks for the Feynman vid as it links to a bunch of others on individual topics. I didn’t know these were on YouTube. I primarily got info on him from books or web sites. So, it’s nice to have some more vids seeing him say stuff himself. 🙂

Semirelated note:

I’m also investigating how to create the effect in more diverse organizations where people don’t understand each other as well. Open-source development cultures shed light on the unity and communications aspects. Not secrecy, obviously. The closest thing I’ve seen in a proprietary company is Opera Software. They aim for many goals and accomplish them pretty well. Hell, they’re the only alternative browser company from the ’90s still around. (Impressive, right?) They accomplish it while having an extremely diverse team. I tried (and failed) to find the interesting article I read on it a while back. I do remember that they went out of their way to embrace other nationalities & their unique perspectives in various activities. They even did a lunch event each week themed to the nationality or background of a worker. So, while my schemes used multiple nationalities that were competitive to achieve integrity, they have a way of unifying multiple nationalities to achieve effectiveness in many areas. Worth thinking on.

Wael May 18, 2014 2:28 PM

@ Nick P,

Remember that I’m talking about an elite attitude with a strong focus on responsibility and results

Yes! Controversial subject. The point is that regardless of the reasons, he left a group and spilled some of its secrets.

Thanks for the Feynman vid…

I’ll share more vids related to some previous thread engagements we had a year or so ago during our Castles_v_Prisons fun discussion. Just need to get some work done, so probably in a few hours…

Wael May 18, 2014 3:03 PM

@ Clive Robinson,

The way to look at the problem is,
1) Secrets are stored in a repository,
2) Access to the repository is proportional to the number of secrets.
3) The greater the access the less secure the repository is.

We are in agreement there, with one exception: #2. Access to the repository is proportional to the number of secrets only if the secrets are categorized and require different access controls. So access to the repository is proportional to the number of groups of secrets within the repository. It’s proportional to the number of partitions that require different access rights, and not the number of secrets, per se. Had @ Bruce Schneier dropped the “all” from the thread title, we probably wouldn’t be having this discussion. And I say “probably” because if we reduce the number of “groups of secrets”, then we are arguably in violation of another principle: separation of privilege, since it means we’ll lump two or more groups of secrets that have different classifications under one privilege, rights, and access-control mechanism. I would think the proportionality factor in #2 is as @Gweihir put it (out of context):

access rights according to privilege level

If you restate #2 as: Access to the repository is proportional to the number of access rights according to privilege level, then we are OK.

By the way, do “you people”, across the pond, use the hash sign (or pound sign) to denote “number” as in #2? 🙂

Craig McQueen May 18, 2014 8:37 PM

Spooky: “How would it benefit us, exactly, to store less–and consequently, know less–about our adversaries?”

Perhaps because we make our own adversaries by distrusting and spying on everyone. Maybe facilitating an atmosphere of trust is a step towards having a less adversarial world. Or am I a naïve idealist?

Wael May 19, 2014 3:18 AM

@Nick P,
A few hours have passed, so here are a few informative links…

Richard Feynman:
Physics lectures:
http://www.youtube.com/playlist?list=PLLzGzdSNup63lMYeOpU9Hax6MBsTjdDas

Short talks:
http://www.youtube.com/playlist?list=PLF3336DF170907056

Quantum Electrodynamics:
http://www.youtube.com/playlist?list=PL01619985657950A3

Gilbert Strang, Linear Algebra:
This guy happens to be the advisor of someone I talked about in the past.
http://www.youtube.com/playlist?list=PL49CF3715CB9EF31D

Short talks on Quantum computers:
http://www.youtube.com/playlist?list=PLtnZnrhPnXZoGUj9qqQFMlpU8ficmajUU

And after this quote from @Clive Robinson… (June 12, 2012)

Back in the 1930s Kurt Gödel came up with an unfortunate truth which, along with Turing’s later work, shows that what we regard as “trust” (i.e. that a system that has behaved, and currently does behave, in a particular way will carry on doing so in the future) is a mere illusion.

I still cannot find what Gödel said to corroborate the above quote…
Kurt Gödel: The world’s most incredible mind. A man of whom Einstein, toward the end of his life, said that his “own work no longer meant much, that he came to the Institute merely… to have the privilege of walking home with Gödel”. [Wikipedia]
http://youtu.be/i2KP1vWkQ6Y
http://garygeck.com

Anon May 19, 2014 5:13 AM

Regarding “have fewer secrets”:

Because the NSA thought they could keep their data collection abilities secret, they didn’t bother putting in appropriate checks and oversight. So they were acting with gross disregard for the law (certainly non-US law, and arguably US law as well). This outraged Snowden, so he leaked it.

If the NSA had publicly disclosed their data collection abilities, they would have been forced to put in appropriate checks and oversight, and they would have been forced to act in accordance with the law (or at least US law). If they had done that, Snowden may not have leaked anything.

Nick P May 19, 2014 1:07 PM

@ Wael

Thanks for the links. Will go through them. Currently watching the vid of that (censored) Gödel guy whose work ruined a lucrative software-proving industry before it gave me all the tools I needed. 😉

paul May 19, 2014 1:29 PM

I think it’s important to note that Snowden didn’t actually work for the NSA, and that most of the NSA’s systems are not currently run by the NSA. Thanks to the contractorization of government infrastructure, much of the work in such setups is done by people at multiple removes from the NSA’s security mission, with primary legal allegiance to their shareholders, budgets and/or paychecks. Sure, the systems can be specified to operate securely, and rules could be laid down to prevent insiders from carrying information home unencrypted, but that would cost time and money and require full training of everyone in the NSA/contractor/subcontractor chain. And the enormous expansion of the surveillance state during the past 15-20 years may not have allowed for that kind of careful, deliberate design.
