Protecting Against Leakers

Ever since Edward Snowden walked out of a National Security Agency facility in May with electronic copies of thousands of classified documents, the finger-pointing has concentrated on the government’s security failures. Yet the debacle illustrates the challenge of trusting people in any organization.

The problem is easy to describe. Organizations require trusted people, but they can’t be sure those people are, in fact, trustworthy. These individuals are essential, and they are also in a position to betray the organization.

So how does an organization protect itself?

Securing trusted people requires three basic mechanisms (as I describe in my book Beyond Fear). The first is compartmentalization. Trust doesn’t have to be all or nothing; it makes sense to give relevant workers only the access, capabilities and information they need to accomplish their assigned tasks. In the military, even if they have the requisite clearance, people are only told what they “need to know.” The same policy occurs naturally in companies.

This isn’t simply a matter of always granting more senior employees a higher degree of trust. For example, only authorized armored-car delivery people can unlock automated teller machines and put money inside; even the bank president can’t do so. Think of an employee as operating within a sphere of trust—a set of assets and functions he or she has access to. Organizations act in their best interest by making that sphere as small as possible.

The idea is that if someone turns out to be untrustworthy, he or she can only do so much damage. This is where the NSA failed with Snowden. As a system administrator, he needed access to many of the agency’s computer systems—and he needed access to everything on those machines. This allowed him to make copies of documents he didn’t need to see.
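The sphere-of-trust idea is essentially least-privilege access control: grant each role an explicit allow-list, not a rank-ordered hierarchy. A minimal sketch, with all role and action names hypothetical:

```python
# Toy least-privilege check: each worker's "sphere of trust" is an
# explicit allow-list of actions, not a seniority-based hierarchy.
# All role and action names here are hypothetical, for illustration only.
SPHERES = {
    "sysadmin": {"patch_os", "restart_services"},   # keeps machines running...
    "analyst":  {"read_reports"},                   # ...but can't read reports
    "courier":  {"load_atm_cash"},                  # even the bank president lacks this
}

def allowed(role: str, action: str) -> bool:
    """An action is permitted only if it lies inside the role's sphere."""
    return action in SPHERES.get(role, set())

assert allowed("courier", "load_atm_cash")
assert not allowed("sysadmin", "read_reports")  # need-to-know excludes content
```

The point of the sketch is that shrinking each set directly bounds the damage any one untrustworthy person can do.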

The second mechanism for securing trust is defense in depth: Make sure a single person can’t compromise an entire system. NSA director Gen. Keith Alexander has said he is doing this inside the agency by instituting what is called two-person control: There will always be two people performing system-administration tasks on highly classified computers.

Defense in depth reduces the ability of a single person to betray the organization. If this system had been in place and Snowden’s superior had been notified every time he downloaded a file, Snowden would have been caught well before his flight to Hong Kong.
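Two-person control can be sketched as a simple approval gate: a privileged operation runs only after a second, different administrator countersigns it. The class and operation names below are hypothetical, not the NSA’s actual mechanism:

```python
# Sketch of two-person control: a privileged operation executes only
# after a *different* admin approves it. Names are hypothetical.
class TwoPersonControl:
    def __init__(self):
        self.pending = {}  # op_id -> admin who requested it

    def request(self, op_id: str, admin: str) -> None:
        """First admin proposes a privileged operation."""
        self.pending[op_id] = admin

    def approve(self, op_id: str, admin: str) -> bool:
        """Second admin countersigns; self-approval is refused."""
        requester = self.pending.get(op_id)
        if requester is None or requester == admin:
            return False            # one person can never act alone
        del self.pending[op_id]     # consumed: a third run needs a new request
        return True                 # operation may now execute (and be logged)

tpc = TwoPersonControl()
tpc.request("bulk-download", "admin_a")
assert not tpc.approve("bulk-download", "admin_a")   # same person: blocked
assert tpc.approve("bulk-download", "admin_b")       # second admin: allowed
```

Any real deployment would also log both identities, which is exactly the notification-of-a-superior property the essay describes.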

The final mechanism is to try to ensure that trusted people are, in fact, trustworthy. The NSA does this through its clearance process, which at high levels includes lie-detector tests (even though they don’t work) and background investigations. Many organizations perform reference and credit checks and drug tests when they hire new employees. Companies may refuse to hire people with criminal records or noncitizens; they might hire only those with a particular certification or membership in certain professional organizations. Some of these measures aren’t very effective—it’s pretty clear that personality profiling doesn’t tell you anything useful, for example—but the general idea is to verify, certify and test individuals to increase the chance they can be trusted.

These measures are expensive. It costs the U.S. government about $4,000 to qualify someone for top-secret clearance. Even in a corporation, background checks and screenings are expensive and add considerable time to the hiring process. Giving employees access to only the information they need can hamper them in an agile organization in which needs constantly change. Security audits are expensive, and two-person control is even more expensive: it can double personnel costs. We’re always making trade-offs between security and efficiency.

The best defense is to limit the number of trusted people needed within an organization. Alexander is doing this at the NSA—albeit too late—by trying to reduce the number of system administrators by 90 percent. This is just a tiny part of the problem; in the U.S. government, as many as 4 million people, including contractors, hold top-secret or higher security clearances. That’s far too many.

More surprising than Snowden’s ability to get away with taking the information he downloaded is that there haven’t been dozens more like him. His uniqueness—along with the few who have gone before him and how rare whistle-blowers are in general—is a testament to how well we normally do at building security around trusted people.

Here’s one last piece of advice, specifically about whistle-blowers. It’s much harder to keep secrets in a networked world, and whistle-blowing has become the civil disobedience of the information age. A public or private organization’s best defense against whistle-blowers is to refrain from doing things it doesn’t want to read about on the front page of the newspaper. This may come as a shock in a market-based system, in which morally dubious behavior is often rewarded as long as it’s legal and illegal activity is rewarded as long as you can get away with it.

No organization, whether it’s a bank entrusted with the privacy of its customer data, an organized-crime syndicate intent on ruling the world, or a government agency spying on its citizens, wants to have its secrets disclosed. In the information age, though, it may be impossible to avoid.

This essay previously appeared on Bloomberg.com.

EDITED TO ADD 8/22: A commenter on the Bloomberg site added another security measure: pay your people more. Better paid people are less likely to betray the organization that employs them. I should have added that, especially since I make that exact point in Liars and Outliers.

Posted on August 26, 2013 at 1:19 PM · 52 Comments

Comments

Rick Lobrecht August 26, 2013 2:21 PM

Really, I think the corrected sentence works either way. As pointed out in the article, if the organization doesn’t do anything worth leaking, i.e. if they aren’t betraying the trust of their employees or customers, there will be no leak.

Doug Coulter August 26, 2013 2:22 PM

I think a big part of the problem was going to COTS. All that stuff is designed such that a sysadmin for a corp kinda automatically has root everywhere on the network in order to be able to do his job.
Hence Snowden’s access – because they went to COTS stuff, he kinda had to have root everywhere to get his job done, and therefore not only had access to it all, but to the logs of access….

I’m not so sure that putting 90% of sysadmins on the street that have/had access is such a smart move…deliberately angering insiders and tossing them out? Yeah, that’ll work.

Way back in the cold war era, when I worked on the NMIC and WWMCCS – I used to rail at the “air gap” they had, as it slowed things down, in this case literally a sergeant sitting at a pair of Teletypes – one with incoming stuff, also punched on paper tape, the other hooked to the NMIC, etc. If said sergeant thought things were OK, he’d rip the tape off one and feed it to the other.

Boy was I ever wrong. I thought it’d be easy to implement some kind of system to replace all those guys, and get our bandwidth up. History shows that indeed I was wrong.

Being smart without admitting one’s mistakes is like trying to learn to sing while being deaf – it doesn’t work out that well.

pointless_hack August 26, 2013 2:27 PM

I need to read Beyond Fear. I expect it would have a good contrast between personal security, and security as it affects organizations.

The NSA appears paradoxical, because it questions the need to pay external contributors, but does not question its own right to be paid.

Patriotism should not equate to IP Communism.

Larry Lennhoff August 26, 2013 2:28 PM

This is the same advice that my dad gave me when I was a child, and it has served me well “Son, if you don’t want to get caught doing something, don’t do it.”

Doug Coulter August 26, 2013 2:29 PM

Of course, if they have nothing to hide, why do they fear these leaks anyway? Shoe meet other foot. All Snowden did was confirm what most of us suspected all along (and a few had more than a little reason to suspect).

One thing I know from back in the days I had a super duper clearance is that compartmentalization is a fail. Really, you can’t tell even your wife what you do?

So you seek out, or already know, others with similar clearances, and over beer talk about what you do, destroying compartmentalization more or less completely. They might have hated that (e.g., the agencies involved) but what to do – human nature is what it is; you can’t work all day and give your heart to something and then never talk about it at all.

Further, it sucked being in “the community” for the following reason.
If you pulled off something great, by golly, it was even more secret than before – if the adversary knew how they were leaking, they’d plug the hole. But mess up – and you’re on the front page of the Washington Post and so on. It’s a no win game if you ever want any recognition for your abilities at all. Your boss won’t do that much, as it might lead to a pay raise or job improvement that takes you out from under his bureaucratic butt. Your co workers are merely jealous if they care at all, and are sick enough of what they do to not even want to talk about it much (see above).

Bruce – this problem needs more analysis on the human nature aspects I’m pointing out here. Not just “how to set up a system of compartments etc” – we just got done proving that this alone doesn’t fix it.

Stuart Lynne August 26, 2013 2:41 PM

If you classify everything as top secret then you obviously have to then grant access to a lot more people on a much wider basis so that they can do their jobs. Which increases costs of protection and substantially increases the threat of security breaches for everything.

Drastically limiting the number of documents that need protecting to those that actually do need protecting would result in far fewer documents that need to be protected and far fewer people who need to be able to access them.

Lower cost, more efficiency, and hopefully much better security for what actually needs to be secure.

Ben Franklin August 26, 2013 2:47 PM

Three can keep a secret, if two of them are dead.

It seems like reducing secret knowers by 90% doesn’t actually fix the problem (for most reasonable guesses at the number of admins at the NSA).

What’s the collective opinion of the “secret keeping database machine” to which all phone records are entrusted? NSA officials all claim that only 22 people can add a number to the list of numbers searched and that there have been 0 intentional errors in millions of queries. Snowden even somewhat confirms the trustworthiness of this “machine”, in that he released none of the phone records in the database, which would have been a more effective news story than leaking some obscure FISA court order.

Have we reached the brink of sci-fi, where trusting data to computers is a better idea than trusting it to human administrators??

tatere August 26, 2013 2:47 PM

You imply this somewhat towards the end, that there is another step organizations can take: Stop having so many secrets. Having however many million “classified” documents created every year is a guarantee that some “secret” information will get out. But it should never have been secret to begin with, and that overuse dilutes the value of each secret – including the few that probably do need to be kept hidden. The more people you trust, the less weight there is on each of them individually.

JimBo August 26, 2013 3:33 PM

A few things the military does that you left out: personnel reliability and human reliability programs (PRP and HRP). These allow you or management to remove you from the secure information for any personal issue without jeopardy to your job or pay. They are usually invoked for temporary personal issues. They also allow anyone to alert management of any concern they have about a coworker, without fear of retribution. In addition to the two-man policy you also have (armed) guards monitoring activity. If this seems like overkill, it’s not. I worked for 4 years in the USAF and another 11 years at a DOD contractor with these controls in place.

nycman August 26, 2013 3:52 PM

Related to “pay your people more” is ensure your people have a lot to lose – family, cushy house, easy paycheck, nice car, links to community/society/friends, career, reputation, etc. To a certain extent, this is within human nature. Which is why, generally, people are less trusting of the young single guy, who can just walk away with nothing to lose.

StewBaby911 August 26, 2013 4:10 PM

The reality is that your systems administrators are under tremendous pressure to relax the security rules: by the business, so the business people can ‘just get their jobs done’, and by ‘reputable’ security vendors touting DOD/NIST/PCI/SOX/HIPAA ‘scanning tools’ that log in directly over the network and ‘have to have’ access like ‘cat *’ to work…
That’s the sad reality these days.
I hope everyone is shocked.

Oz Ozzie August 26, 2013 4:41 PM

The other thing you can do is make the cost of betrayal higher than the gains thereof. Hence the vigor with which they have pursued Snowden and Manning. They are very dangerous, because you can’t filter them out by recruitment processes, as they are “true believers”, not untrustworthy to begin with.

The real lesson to learn is that if you don’t break the law, then this won’t happen. But it’s evident that this is not a prospect.

GLD August 26, 2013 4:56 PM

Maybe there’s a reason not explained in the articles, but…
Why are the data stored in the clear on those systems?
That requires the sysadmins be trusted not to peek at files the users own.
Could they not have established a client-end encryption scheme where the users (the owners of the data) hold the keys to the data they are authorized to read, so all the sysadmins do is operate containers that effectively just hold blobs of opaque bytes that only the users…can use?
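The scheme described here can be sketched roughly as follows. The XOR keystream below is only a stand-in for a real authenticated cipher (it is not secure cryptography); the point is the architecture: the server stores opaque bytes, and only the user holds the key.

```python
import hashlib

# Toy illustration of client-side encryption: the user encrypts before
# upload, so the server (and its sysadmins) hold only opaque blobs.
# The XOR-with-hash keystream is a placeholder, NOT real crypto --
# a production system would use an authenticated cipher.
def keystream(key: bytes, n: int) -> bytes:
    """Deterministic pseudo-random bytes derived from the key."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR with the keystream; applying it twice restores the plaintext."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

user_key = hashlib.sha256(b"known only to the data owner").digest()
blob = xor_crypt(b"classified memo", user_key)     # what the server stores
assert blob != b"classified memo"                  # sysadmin sees only noise
assert xor_crypt(blob, user_key) == b"classified memo"  # key holder can read
```

Under this design the sysadmin’s job reduces to operating containers of opaque bytes, exactly as suggested above.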

Mark August 26, 2013 5:04 PM

What about the algorithm used to determine who is trustable? How is the algorithm created and refined?

For example, what details about Snowden’s background will now (probably arbitrarily) make it into the list of things that make someone unworthy of clearance?

A related point: if you’re not providing clearance to some people you don’t think deserve it, then you can’t know how effective your clearance algorithm is. In other words, you may be absolutely convinced that people who have smoked marijuana (or failed a lie detector test, or read the wrong books, or been arrested, or whatever disqualifies people) are not trustworthy, but unless you hire some people who have done these things and track them, you don’t really know. (See Dan Ariely’s work.)

Sevesteen August 26, 2013 5:11 PM

Too many secrets and too many clearances. I had a temp job on an Air Force base a few years ago, installing Vista. I was not exposed to any classified information at all, not allowed into classified areas or to work on the classified network–but the job required a Secret security clearance.

David August 26, 2013 6:05 PM

“But it should never have been secret to begin with, and that overuse dilutes the value of each secret – including the few that probably do need to be kept hidden.”

I really think we need to somehow bring stiff penalties for CYA secrecy.

rhotheta6 August 26, 2013 6:15 PM

Alexander, it seems, must have passed all the screening for trustworthiness.

Then he took an oath to defend and protect the Constitution.

Now he’s running an apparatus that is flagrantly in violation of, e.g., the Fourth Amendment.

Trustworthy… not in my book.

In a way, the worst thing about our government’s slide towards fascism is that it makes it impossible for any knowledgeable citizen to trust it.

Bruce: how can trust be restored?

Brian August 26, 2013 6:25 PM

I agree with a lot of what Bruce writes, but the piece of advice at the end about whistle-blowers seems pretty naive: “A public or private organization’s best defense against whistle-blowers is to refrain from doing things it doesn’t want to read about on the front page of the newspaper.”

The implication seems to be that “things it doesn’t want to read about on the front page of the newspaper” refers to bad things and that organizations can avoid the issue by doing only good things. But it doesn’t take much imagination to come up with a whole list of things that organizations would legitimately want to keep off the front page of the paper without falling into the category of “bad things”. Technical details of new products being developed or security vulnerabilities, for private companies, or lists of assets for an intelligence organization all fall into the category of things that they wouldn’t want to see on the front page but are unreasonable to ask the organization to avoid.

A better way to phrase it might be that organizations can try to prevent whistle-blowers by not engaging in activities that someone might feel compelled to blow the whistle on, but I think Bruce’s choice of phrasing is interesting. It’s more reflective of the modern idea of whistle-blowing as less about revealing bad things and more about just revealing information someone wants to keep secret, whether or not they’re keeping it secret for legitimate reasons. More Wikileaks than Woodward and Bernstein.

Hooch August 26, 2013 8:05 PM

Wouldn’t it be better to say that the best defense against whistle-blowers is to not break the trust of the users/the people?

Dirk Praet August 26, 2013 8:39 PM

As pointed out by @Jimbo, the establishment of trust does not end with a person being hired and/or given a security clearance; it is an ongoing process. Failure to properly implement PR and HR programs carries the risk that unusual or otherwise atypical behaviour of workers goes unnoticed or is not acted upon. Most managers know how to deal with the average overtly disgruntled employee, but it becomes an entirely different story when a less outgoing person over time becomes disgusted not with his job, manager or coworkers, but with the core activities of the company he’s working for. That seems to have been the case with both Manning and Snowden.

If such a scenario unfolds in a high-security environment with inadequate technical controls on classified data and systems (MLS, MAC, RBAC etc.), even a low-grade nobody can become a serious threat. Personally, I think it’s a statistical improbability that aforementioned whistleblowers are the first or only folks to have ever breached SIPRNet or NSA systems. Maybe they were just the first idealists we have come to hear about, whereas an unknown number of others are still sitting on a similar stash or found a way to secretly sell it to 3rd parties.

@ Mark

In other words, you may be absolutely convinced that people who have smoked marijuana (or failed a lie detector test, or read the wrong books, or been arrested, or whatever disqualifies people) are not trustworthy …

You might be surprised what you can get away with when applying for a clearance. The first time I applied for a **** Secret Clearance was through the US company I was then working for. When I sought advice from our local security attaché, he told me that the best thing I could do was give straight answers to everything the application form was asking, because the body examining it would find out anyway.

After about 3 months – and much to my surprise – I got cleared, even though I had admitted to a prior misdemeanour conviction, having used controlled substances, participating in anti-government rallies and some other eyebrow-raising stuff. Admittedly, all of that was during a time when I was part of a somewhat colourful anarchist subculture, but I’m pretty sure that if I had lied about it, I’d never have been given that clearance.

Eric Shelton August 26, 2013 8:50 PM

If by people “like him” you mean people who have managed to run off with thousands of top secret NSA documents due to the poor security of their systems, then the US can probably count on various foreign intelligence agencies, hostile or not, having already sent and benefited from “dozens more like him.”

Although security is a complex field, there is not just one little hypertechnical mistake involved here. The NSA’s security apparatus was an obvious high-value target, and was doomed to fail due to a number of decisions that have been generally recognized and warned about as bad practices for years throughout government and industry. Clearly Snowden recognized and effectively exploited these problems. No doubt it was a tough problem, but the mismanagement on display is worse than getting caught with their pants down – the NSA has long bragged about their fancy belt & suspenders (cultivating a reputation for unique expertise in security) while never having any pants to begin with. Worldwide exposure is a consequence of their hubris.

It should not have required what Hayden called “the most serious hemorrhaging of American secrets” to prompt addressing obvious issues. But no one had to or will have to worry, because these days responsibility in government is not merely diffuse – it has evaporated, often by design. Problems of all scales are now merely well-intentioned mistakes, instituted by so-and-so’s predecessor, for which no one in charge today should or will be held responsible. Under enough pressure, some of those tasked with implementing their leaders’ plans may get fired. But the leaders get to stay in place, now smarter thanks to this “opportunity to learn” – in this case that sysadmins with facial hair (which apparently is about 90% of them) should not be trusted.

Wesley Parish August 26, 2013 9:11 PM

Max Fisher’s absolutely right. Digging out a hole in the hillside won’t automatically set off an avalanche; digging a couple of holes in the hill won’t automatically set off an avalanche.

It’s when you can’t see the hillside for the holes in the slope, that the hillside caves in.

Classifying every single thing they can has got to be one of the most suicidal things the US Feral Government has done. The other is trying to keep tabs on everybody all the time.

If everything tastes like chicken, what does chicken taste like?

Chilling Effect August 26, 2013 9:50 PM

How about “don’t classify everything.” Or, “If you’re running an agency that feels it necessary to do things that make people uncomfortable, provide a legitimate way to express that discomfort that protects secrets while providing the assurance that communicated concerns won’t result in retaliation or end up in the bit-bucket.”

Of course, both of these measures are an anathema to the Security Mindset that believes everything needs to be classified, and that everything they do is legal, legitimate, and necessary for National Security. That mindset seems to be reflected in Obama’s recent displays of “transparency theatre,” intended to mollify critics outraged by Snowden’s disclosures while steadfastly avoiding any change to anything.

HiTechHiTouch August 26, 2013 10:05 PM

“…hold top-secret or higher security clearances. That’s far too many.”

The clearances Confidential, Secret, Top Secret, etc. are only a measure of the screening done on the individual holder. Holding a “Top Secret” clearance doesn’t give you access to anything — it only enables you to be granted access to specific materials.

In other words, a holder has been validated against certain general criteria, such as drug use, relatives (potential hostages) in foreign countries, financial responsibility, etc.

You want people to have these “clearances” — it means that they are not openly exposed to manipulation. Employers want and need these sort of qualified people.

To say there are far too many cleared employees is to say there are too many good employees.

Now if you want to discuss the number of people allowed to see specific materials, then you may argue that an access list is too long. But having a Secret clearance doesn’t put you on any access list for any specific sensitive materials, so you can’t count Secret clearances to see how long the access list is.

(Please forgive a copy of this post incorrectly appearing with the previous blog entry.)

T August 26, 2013 10:14 PM

The Bloomberg reader is wrong. Paying someone more does not necessarily make them more trustworthy. Sure, you want to pay a sysadmin more than minimum wage, but have million-dollar salaries made CEOs more trustworthy when it comes to running companies into the ground? In some way Snowden’s oversized salary may have encouraged him to take the risk of defecting.

Michael Moser August 26, 2013 10:47 PM

Why don’t they just use their own tools to monitor their own staff? One half of the NSA can snoop on the other half, maybe take turns; now if someone has ever used TOR then that would be a fireable offense 😉

Michael Moser August 26, 2013 11:02 PM

… also, monitoring one another would keep the NSA busy and the rest of us safe. Kafka is too outdated to be a model for reality; newer works need to be adapted, like Minority Report 😉

Coyne Tibbets August 26, 2013 11:29 PM

I love how this article dances around the real issue, which is whistleblowing, not just leaking.

The problem is that people have multiple spheres of trust to which they must respond. Simplifying for this example there are two spheres:

  • The sphere of keeping secrets of the NSA.
  • The sphere of duty to the Constitution.

Okay, now suppose a person is asked to perform an act, and the spheres require different, contradictory responses; a problem which is quite non-theoretical in current NSA activities. Consider the specific act of recording a phone call, without a warrant, made by a citizen to a citizen, within the bounds of the U.S. No matter how the NSA tries to lily-gild the “innocence” of such acts, the act is a clear violation of the Fourth Amendment. So on the one hand, the employee owes a duty to keep the act secret within the bounds of the NSA; on the other hand, the employee has a duty–under Constitutional oath–to report the NSA violation.

To ask the question, “Which duty prevails?” is almost self-answering: The NSA derives its authority from the Constitution; without the Constitution it does not exist. It is therefore subordinate to the Constitution and the greater duty of the employee is owed to the Constitution.

If the employee fails to report the act, the employee is complicit, breaching his or her own oath.

The NSA immediately counters, “We have an office for that.” But the interesting thing with that office is that it takes no action for the violation of the Constitution, but has been demonstrated to heavily penalize any employee who reports a violation. It therefore demonstrates itself to be complicit in the same acts in which the NSA is asking the employee to become complicit. For the purpose of meeting the employee’s duty of trust to the Constitution, it is useless.

So now let’s consider a quote from the article:

[…] whistle-blowing has become the civil disobedience of the information age.

As I hope I have demonstrated above, that is flatly wrong: It is not “civil disobedience” to meet the demands of your oath to the Constitution. It is your duty…a greater duty than the same employee owes to the NSA.

If the NSA had an effective internal enforcement mechanism, it might be different. It clearly does not. The NSA is therefore doubly culpable in its acts in that, first, it deliberately violates its Constitutional duty; and, second, it deliberately acts to punish complainants rather than oath-breakers. (Yes, oath-breakers, because every person in authority in the NSA swore to the same Constitutional oath.)

In the face of that, it isn’t a matter of civil disobedience: It is a matter of a person like Snowden meeting their greater duty (at great cost, BTW, because the NSA intends to destroy him by any available means).

This extends to every person who exists under our system. I, for example, in roughly ascending order…

  • …have a responsibility of trust to myself;
  • …have a responsibility of trust to others, for the promises I make to them (including businesses); personal promises and contractual ones;
  • …work for an employer and owe a duty of trust to that employer;
  • …reside in the state of Florida, and owe a duty of trust to its laws; and beyond, to its Constitution;
  • …reside in the United States, and owe a duty of trust to its laws; and beyond, to its Constitution

This isn’t theoretical to me: I was once confronted with the reality of a conflict between a personal promise I made, and my duty of trust to my employer. That is not a comfortable place to be.

Because of that, I have great sympathy with Snowden, because he was in a much more uncomfortable place, a place in which he had to exchange all the rest of his life (no matter where he goes) to resolve a conflict between his duty to the NSA and his duty to the Constitution.

That is a problem that shouldn’t ever arise; the fact that it does is proof that the NSA is out of control. It is also the hallmark of whistleblowing: The case where a person must choose between two spheres of trust and breach the lesser in favor of duty to the greater.

And notice here, another hallmark of whistleblowing: In every valid case, the party committing the breach that is reported, has duty to the same sphere to which the whistleblower reports. The NSA–and every person in it–has the same duty to the Constitution that Snowden had.

To then relegate his act to that of leaker; to talk about leaking as a problem not distinct from whistleblowing, is to pretend that the problem of conflicting spheres doesn’t exist. The NSA says, “We can do anything we please,” and that is wrong. They then say, “Snowden owes us trust greater than he owes to the Constitution”; and that is also wrong. To relegate Snowden or whistleblowers in general to the status of “another leaker” is to honor the crimes the whistleblowers would report and to dishonor their duty to the greater spheres of trust.

y6kk36df8 August 27, 2013 12:57 AM

@Bruce
One aspect of security in organizations that seems to be missing from the list is knowledge.

Users in general want security, which requires a certain degree of control (over access codes, storage media, communication channels and so on), while insisting that they do not need to know anything about how it works.

That’s fine for most users and consumers, as they are rarely in possession of too much sensitive information, and they tend to operate in less complex frameworks. The problem in a large (and/or powerful) organization with that kind of attitude is that the (traditionally) powerful people (management, CEOs, and everyone more than one desk removed from the actual coding) tend to follow the same pattern. They don’t want to know how it works, as long as it works. But how can they tell that it’s working? They must rely on people like Snowden.

I bet that the reason they gave him such freedoms and fairly unrestricted access was that he was the only one who really knew what he was doing.

Figureitout August 27, 2013 1:05 AM

In the information age, though, it may be impossible to avoid.
Bruce
–Yeah, in the “information age” we have devices with more functionality and capability than anyone can possibly secure, and worse, than anyone even understands. Not to mention that connecting a device to the internet makes it vulnerable to exploits that billions of people may have thought up. These can then be combined w/ “out of band” or perhaps very much “in band” exploits that only freaks will discover. How much fail can humans take, really…

Poul-Henning Kamp August 27, 2013 1:19 AM

For entirely bad reasons, we still operate with the omnipotent model of system administration. Instituting a two-man rule is about the only thing you can do to mitigate that.

Second, if you get rid of 90% of your system administrators and institute a two-man rule, the remaining sysadmins will have to perform 20 times better.

Third, they already paid Snowden $200k/y, and that got them no loyalty. I wonder what they’ll have to pay sysadmins to be 20 times more efficient and loyal?

Finally, and most importantly: Charles Stross raised a very good point some days ago, about how loyalty in the workplace has been eliminated:

http://www.antipope.org/charlie/blog-static/2013/08/snowden-leaks-the-real-take-ho.html

Snowden’s generation has never had access to job security, and may not even desire it, should it be offered.

Therefore, the “Company has my back, I have their back” loyalty has largely been eliminated from the security toolbox. This will be particularly troublesome for “secret” organizations which previously very much relied on the unconditional loyalty of their employees, who in return could count on good employment until the golden watch.

That Snowden was a contractor, yet had full, omnipotent sysadm access, shows how badly NSA overlooked this tacit assumption in their security model.

Jakub Narebski August 27, 2013 1:22 AM

Well, the clearance process / background checks would not work against whistle-blowers who (like Snowden) leak out of moral integrity: because what you are doing is illegal, immoral, unjust…

Gerald August 27, 2013 1:40 AM

More surprising than Snowden’s ability to get away with taking the information
he downloaded is that there haven’t been dozens more like him.

We don’t know that.

What Snowden did for the public, how many are doing for foreign intelligence?

Mirar August 27, 2013 1:44 AM

I’m immediately thinking about the Lavabit / Ladar Levison construct, where not even the system administrator/system owner could easily get into the documents stored on the system.
[ http://arstechnica.com/tech-policy/2013/08/how-might-the-feds-have-snooped-on-lavabit/ ]

TL;DR: The construct is that you store data encrypted, where the keys (or passphrases, in this case) are held by the owner of the data. The system can’t decrypt the data without the data owner unlocking it.

It should be a fairly good way of protecting data from system administrators.

(Of course, if you have access to the source and keys used to initiate the key exchange, you can still technically snoop and get the keys at the next exchange. But you still can only get the documents where the key is actually used, and it will take a lot of work.)
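The construct described above can be sketched in a few lines. This is a toy: the key derivation is real (PBKDF2), but the cipher is a throwaway SHA-256 counter-mode keystream chosen only to keep the example standard-library-only; a real system such as Lavabit would use a vetted cipher like AES. All names are illustrative:

```python
# Sketch: the server stores only (salt, ciphertext); the passphrase that
# unlocks the data is held by the data owner, so an administrator with
# full disk access sees nothing readable.

import hashlib
import os

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # Deliberately slow KDF so stored ciphertext resists passphrase guessing.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy counter-mode keystream. XOR is its own inverse, so this one
    # function both encrypts and decrypts. NOT for production use.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Client side: derive the key and encrypt before anything reaches the server.
salt = os.urandom(16)
key = derive_key(b"correct horse battery staple", salt)
ciphertext = keystream_xor(key, b"TOP SECRET: meeting notes")

# The sysadmin holds only opaque bytes; the owner re-derives and decrypts.
plaintext = keystream_xor(derive_key(b"correct horse battery staple", salt),
                          ciphertext)
assert plaintext == b"TOP SECRET: meeting notes"
```

Note that this only protects data at rest, which is exactly Mirar’s caveat: a sysadmin who can tamper with the client code or watch the key exchange can still capture keys the next time they are used.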

response to Brian August 27, 2013 3:40 AM

Brian:

The implication seems to be that “things it doesn’t want to read about on the front page of the newspaper” refers to bad things and that organizations can avoid the issue by doing only good things. But it doesn’t take much imagination to come up with a whole list of things that organizations would legitimately want to keep off the front page of the paper without falling into the category of “bad things”.

Of course organizations have legitimate/good secrets, but whistleblowers (which is what this article is about, after all) aren’t interested in revealing legitimate/good things! Whistleblowers are interested in revealing the illegitimate/bad things. If the organization only did good things, whistleblowers would not feel the need to whistleblow.

The organization would still rightly be concerned about protecting its secrets (e.g. from business competitors), but it wouldn’t need to worry about whistleblowers.

ATN August 27, 2013 4:46 AM

More surprising than Snowden’s ability to get away with taking the information he downloaded is that there haven’t been dozens more like him.

We don’t know that.
What Snowden did for the public how many are doing it for foreign intelligence?

Or simply for cash?
It is like trying to estimate the number of vulnerabilities of some software based on the number of disclosed vulnerabilities, ignoring the price paid for undisclosed vulnerabilities.

Keith Glass August 27, 2013 5:52 AM

Bruce, you note:

More surprising than Snowden’s ability to get away with taking the information he downloaded is that there haven’t been dozens more like him. His uniqueness — along with the few who have gone before him and how rare whistle-blowers are in general — is a testament to how well we normally do at building security around trusted people.

How do we KNOW there HAVEN’T been dozens more like him – just doing it for cash or support of another nation?

Not just espionage, but intentional placement as an insider for access to information. There’s already some evidence that Snowden was in contact with Wikileaks, and had carefully cultivated particular assignments just to be able to access the data. It seems intuitively obvious that other nations have also had people do this, and not just at NSA. We saw this same scenario at Los Alamos several years ago, with carefully planted Chinese agents worming inside, and then doing an “exit, stage right” with a suitcase or three of secrets. . .

Gen Knoxx August 27, 2013 6:12 AM

In reply to: His uniqueness — along with the few who have gone before him and how rare whistle-blowers are in general — is a testament to how well we normally do at building security around trusted people.

I’d say: No. It’s probably not. It probably means something entirely different from that.

Jason August 27, 2013 7:01 AM

While a sysadmin needs to have access to “everything”, he doesn’t need to be able to read it. Why didn’t the NSA encrypt all data on the client side with user keys that are hidden from the sysadmin? Sysadmin can do his job, manipulating blocks of arbitrary data and the server programs that access them, without being able to read a damn thing.

Layer_8 August 27, 2013 7:21 AM

There will always be two people performing system-administration tasks on highly classified computers.

I hope the second person is not another sysadmin from IT. I think it must be a person from the group of data owners, who can assess the risks based on the content and understands the need for this access.

As described in a few posts above, it also depends on the personal spheres of each employee. If you order an employee to do something illegal and leave him or her alone with this situation, the decision could turn out well or badly for you.

To solve this moral conflict you could hire psychopaths, but that would create more problems than it solves.

If the NSA wants to act as before, they should ensure that moral conflicts don’t come up, even when an order isn’t legal. They have to give the employee good reasons for dealing with this conflict, and a few people who are personally responsible for the order and who risk suspension or jail (in other words, a “pawn sacrifice” so that the employee doesn’t feel guilty). They need people who will do anything in the name of anti-terrorism, freedom for the homeland, etc.

Any effort to find security bugs to make systems a bit more secure has turned from white to black, knowing that the bugs could have been used by the NSA & Co. to hack American and foreign systems first. It’s a frustrating situation.

Anonymouse August 27, 2013 8:52 AM

A public or private organization’s best defense against whistle-blowers is to refrain from doing things it doesn’t want to read about on the front page of the newspaper. This may come as a shock in a market-based system, in which morally dubious behavior is often rewarded as long as it’s legal and illegal activity is rewarded as long as you can get away with it.

So far, documents with the technical details of how the NSA surveillance works seem to be unpublished. If these documents find their way online in unencrypted form, I fear that foreign countries could use the systems for their own espionage. I hope the NSA is aware of this.

John August 27, 2013 9:21 AM

I don’t know why they spend so much effort trying to secure all this data. If they’re not doing anything wrong, then they should have nothing to hide, right?

But wise-ass comments aside, yes, if you’re going to secure a system, at least read the security chapter out of an undergraduate database textbook. I mean, seriously, this is strictly amateur-hour stuff, if your administrator can actually read all the data.

Adrian Ratnapala August 27, 2013 9:29 AM

More surprising than Snowden’s ability to get away with taking the information he downloaded is that there haven’t been dozens more like him. His uniqueness — along with the few who have gone before him and how rare whistle-blowers are in general — is a testament to how well we normally do at building security around trusted people.

For every whistle-blower, there will be dozens of spies working for foreign governments, private companies and even criminal organisations. We don’t find out about those guys in the media.

vas pup August 27, 2013 9:44 AM

Just a couple of thoughts:
Trustworthiness is dynamic, not static, and depends on loyalty, which is dynamic as well, depending on multiple factors (money/salary, recognition, other interests, respect, reciprocity, etc.).
Loyalty is a two-way street by default. All attempts by a private/government employer to establish a ‘blind’/unconditional static loyalty framework for employees will sooner or later decrease loyalty and, as a result, trustworthiness. The better the corporate culture is toward employees, the higher the loyalty and trustworthiness. Treat your employees as you want to be treated: a simple (but not panacea) remedy. By the way, in the long run those corporations are more profitable.

Trustworthiness could be undermined from inside (see above) or from outside based on the M.I.C.E. system (CIA/FSB/Mossad/KGB/STASI etc. used it):
M oney
I nterest (non-financial: nice chick, “honeypot”)
C oercion (direct or indirect threats/violence toward the target or relatives/important others)
E go (recognition, validation, fairness, transparency, etc.).
Meaning that money ($ amount) is just one component.

It is easier to undermine loyalty from outside when half of the job was done from inside already.

Snowden had two steps in his activity:
whistleblower (leaking NSA activity of spying inside the USA in violation of the Constitution) and traitor (leaking NSA activity of spying on foreign agents/countries). He should be rewarded for the former and punished for the latter; that is the objective point. Procrastination on the Government’s side during the first step led to the second step, which could have been avoided by timely granting Snowden immunity and guarantees of personal safety, with the option to testify before the US Senate.

Gweihir August 27, 2013 9:59 AM

Loyalty is key. And loyalty (like trust) can only be freely given and can never be demanded successfully. Penalties, threats, etc. do not work, at least for people with intact personal integrity.

So the only thing that really works is having the secrets handled by people who do not want to leak them. And that only works if they believe in the organization and believe it is right to keep the data secret. No technical measure can prevent people who work with certain data from leaking it. True, the nature of the data and additional protections may make it hard to leak; for example, you cannot copy a video into your biological memory. But you can still transcribe it later at home. For leakage of things like names, addresses, etc., even people with bad memory can carry out several in-memory copies per day.

That means this problem has an often-overlooked component: What is in the data to be protected is critical and can compromise the loyalty of the people handling it.

Hence: Make sure you do not do things that are so repulsive that the people handling the data have their loyalty compromised. Of course, for the US intelligence community in particular, that requirement may be difficult or impossible to fulfill.

Kevin Granade August 27, 2013 10:03 AM

Something I’m surprised you didn’t highlight specifically* is that idealistic leakers don’t seem to be part of the threat model. Most screening seems to be aimed at excluding individuals vulnerable to blackmail or corruption, and is therefore useless against someone that comes to the rational conclusion that keeping certain secrets is more damaging than releasing them.

There’s another aspect to this, which is that a very strong defense against idealistic leakers is to convince society in general that leaking is an immoral act. In this light, the unified front of government officials demonizing Snowden makes perfect sense. Conversely, a strong weapon against secrecy is a message of support for leakers and whistleblowers.

*This is indirectly addressed by “don’t do things that people will feel the need to leak”.

Johannes August 27, 2013 1:26 PM

The article that the linked article references for the 4M figure actually states that 3.6 million people have clearance for secret data, but “only” 1.4 million have a Top Secret clearance.

John Doe August 27, 2013 5:20 PM

I’m not sure you folks realize just what a system admin can do to compromise data. It’s not just setting permissions on a file or logging who reads it. The sysadmin can dump the entire drive, partitions and all. The sysadmin can “borrow” backup tapes and duplicate them. The sysadmin can put a thumb-sized computer on the network and sniff/record all network traffic while the backup runs over the network. Given the amount of data involved, hard drives in the drive arrays have to be failing constantly. With so many drive replacements every day, all you need is a few minutes with one of the “soon-to-fail” drives. Bonus points if you turn it into a “failed” drive while stealing data. No evidence.

In this sort of environment, preventing theft of data is quite difficult. It really needs to focus on the human element, removing the motivation to steal.

As others have mentioned, some sort of end-user decryption / store-only-encrypted-files scheme is viable. Many operating systems (Mac OS X, Linux) today support files containing encrypted filesystems using Twofish or AES. Or perhaps some sort of end-user dongle that handles decryption.

But you are fighting the sysadmin here. The fellow who can keylog your every keystroke, or who can use a dot-camera to watch you type in a password, or who can replicate the codes off any USB device you plug in. And, in general, your better sysadmins are going to know how to break virtually any COTS technology. Especially at some place that specializes in breaking security technology.

I would suggest going the other direction. Create a “dirty tricks” subteam, embedded in other organizations such as newspapers, whose job it is to pry secrets about particular topics out of particular people.

It’s a fun probabilistic mathematical problem, since you don’t know who you can trust. Your “breakers” could be just as corrupt as your sysadmins. But with sufficient overlap and suitable rewards for success on the part of the breakers, you can develop a high confidence in your employees.

You can also go the other direction, and compromise people based on illicit activity. Those who are willing to do illegal things for “their country” and keep their mouth shut can be promoted to more illegal roles with a higher degree of trust. At the very least, you can blackmail them into keeping silent with threats of jail time over past activities.

Morality is kind of out the window here. But, you know, the fellow who outs some minor things to a newspaper on the basis of honor or serving his country. Well, don’t let him near the deep dark secrets. Those who don’t spill the beans, well you can trust them a bit more. Frequent testing over small inconsequential matters will filter out many of the untrustables before they can do significant damage.

The fellow who sells out for money is another kettle of fish. You might be able to pay him more, get him to do the really dirty work, and blackmail him with threats of jail time. Though he has to be aware that he is digging an ever deeper criminal hole for himself and that at some point he will be sacrificed as a scapegoat. In which case, he will try to establish a counter-arsenal of stockpiled off-site secrets and seek to flee to a non-extradition country beforehand. Interesting problem that. Amoral too.

Which brings up the big problem: long-term deep-cover “perfect” employees going rogue after they have stored extensive quantities of secrets off-site. There just aren’t a lot of ways to stop that.

Come to think of it, most sysadmins, aware that they are being forced to be complicit in breaking the law, would seek to build up a stockpile of off-site secrets to cover their ass or as a bargaining chip for when the cops come to arrest them. It’s always the little guys who get the worst jail sentences as the bigger guys can cut deals. With a stockpile those sysadmins could cut a deal.

The NSA must be royally and truly fucked right about now.

Wesley Parish August 28, 2013 8:03 PM

@Anonymouse

So far, documents with the technical details of how the NSA surveillance works seem to be unpublished. If these documents find their way online in unencrypted form, I fear that foreign countries could use the systems for their own espionage. I hope the NSA is aware of this.

I think an average espionage agency could follow the paper trail of NSA purchases to determine their hardware bases; follow the paper trail of software companies in supply contracts to the NSA to get a picture of the NSA’s intended software capabilities; follow the paper trail of power purchases from energy companies to find out how much of that power is being used at various locations.

Michael Moorcock made the observation in either The Russian Intelligence or The Chinese Agent (I forget which) that the “Intelligence” agencies are mostly a big show, a way of wasting taxpayers’ money. This was seconded by, of all people, NZ Prime Minister David Lange, who commented during the nineties that you could often get the same “intelligence” from the newspapers a lot quicker and cheaper.

No, “Intelligence” Agencies have historically acted primarily as secret police to control a state’s own population.

As it stands, the foreign “Intelligence’ Agencies that invest and reduce the NSA in these ongoing petty “cyberwars” will be in possession of a vast hoard of information on every US citizen that makes open war a pointless, macabre, tragicomic joke. With that information you could take control of the US – and the NSA, like good little Quislings, have stored up such a treasure trove, like a chocolate bon-bon, hard on the outside, soft on the inside.

I’m surprised nobody’s yet lynched Clapper and every other NSA employee that they can get their hands on.
