First Look Media Shutting Down Access to Snowden NSA Archives

The Daily Beast is reporting that First Look Media—home of The Intercept and Glenn Greenwald—is shutting down access to the Snowden archives.

The Intercept has been home to Greenwald’s subset of Snowden’s NSA documents since 2014, after he parted ways with the Guardian the year before. I don’t know the details of how the archive was stored, but it was offline and well secured, and it was available to journalists for research purposes. Many stories were published based on those archives over the years, albeit fewer in recent years.

The article doesn’t say what “shutting down access” means, but my guess is that it means that First Look Media will no longer make the archive available to outside journalists, and probably not to staff journalists, either. Reading between the lines, I think they will delete what they have.

This doesn’t mean that we’re done with the documents. Glenn Greenwald tweeted:

Both Laura & I have full copies of the archives, as do others. The Intercept has given full access to multiple media orgs, reporters & researchers. I’ve been looking for the right partner—an academic institution or research facility—that has the funds to robustly publish.

I’m sure there are still stories in those NSA documents, but with many of them a decade or more old, they are increasingly history and decreasingly current events. Every capability discussed in the documents needs to be read with a “and then they had ten years to improve this” mentality.

Eventually it’ll all become public, but not before it is 100% history and 0% current events.

Posted on March 21, 2019 at 5:52 AM • 18 Comments

Comments

Nameless Cow March 21, 2019 9:37 AM

@Bruce

Every capability discussed in the documents needs to be read with a “and then they had ten years to improve this” mentality.

I only paid casual attention to the headlines and summaries of news coming out of the documents. To me, the biggest revelation is not the specific details, but that no imaginable way of attacking a system is too far-fetched. If you can imagine it, they’ve probably looked into it, and there’s a good chance that they are actually doing it.

RealFakeNews March 21, 2019 10:09 AM

Is it really a loss? AFAIK Greenwald never really published much as a result of the archives.

Is there really anything in there? What did we learn?

From what I can tell, the only real things we discovered were that mass surveillance was happening on a scale we didn’t think possible; that anything goes (technically); that they were more advanced in what they’d achieved than anyone thought actually possible; and that just about every system in existence is massively insecure.

Being honest, if you just took a cynical view, you’d already be closer to the truth than you’d think.

Other than this, what else did we learn? Not much, I’d suggest. Only that we underestimated the scale, and consequently how good these people are.

Have any new classes of attack been discovered because of the leaks? Is this why we’re now seeing catastrophic discovery of processor flaws, such as Spectre and Meltdown, or were “common” researchers already looking in this direction?

Conversely, if not, were these attack-vectors mentioned anywhere in the archives?

Does public research influence the alpha-numeric agencies in their work, or are they broadly separate?

So many questions, so few actual answers that I thought we’d see by now.

1&1~=Umm March 21, 2019 11:05 AM

@RealFakeNews:

“… that they were more advanced in what they’d achieved than anyone thought actually possible”

Actually that was not the case. Due to what was happening in Utah with the NSA data silo, our host actually asked the blog readers what they thought the capabilities might be. This was before Ed Snowden released the archive he had built up.

I think even our host was somewhat shocked to find, from those a little closer to the technical capability side, just what could be done (‘store it all’) and the implication of what that really meant: the NSA effectively being able to go back in time to re-run various traces etc. In short, the total demise of ephemeral communications.

The point being that there are people on the technical side who know not just what the laws of physics will allow, but also the resource cost to do it, thus what budgets will allow. These sorts of people tend not to discuss it outside of their own work domain, not because it’s secret etc. but simply because the average person not only does not believe them, but actually refuses to believe it after being given proof.

The same happened with the Ed Snowden revelations: people went into mass denial, then, for most, the other stages of mourning. In a way, for them it was a major death, because what they had been brought up to believe was shown to be total lies. Some never came to terms with it and are still in denial, and would rather ‘kill the messenger’ than admit the truth of the message. Which is just one of the reasons Ed Snowden is still effectively in limbo in Russia, very much dependent on the vagaries of international politics.

As for what’s in the archive, we mostly do not know. It’s been alleged that there are over 1.7 million documents in there, which for obvious reasons nobody has actually read. Searching by term is a very inefficient way to get information out of such a repository, so I doubt we have seen even a fraction of a percent of what is in there.

Personally I think the whole thing should be put into the public domain so that thousands of eyes can look for what they can find. That way we still stand a chance of getting worthwhile and actionable information out of it, of which it would be reasonable to assume there is still a fair amount.

But it won’t be put in the public domain, because ‘knowledge is power’ and certain people have done very well out of it, and it is unlikely they would want to give up such power. Especially as others have indicated that power might be the difference between life and death.

Some Rando March 21, 2019 1:12 PM

Uh…. if they delete it, can we be sure the entire archive will be released at some point? Greenwald says he has a full copy but if nobody else has seen the full archive we will never know for sure that the entirety of it has been released to the public. If they want to just stop hosting it, maybe they should turn the entire thing over to the web archive project and be done, or release it as public domain.

SpaceLifeForm March 21, 2019 2:37 PM

The pile of documents is a haystack.

There are still needles.

Maybe we could put NSA to work on the search for the needles.

Rach El March 21, 2019 9:01 PM

I suspect the fact of the release has historically been more important than the details we’ve gained so far, apart from the broader facts (mass surveillance etc.).

Snowden had two conditions:
1. The material would be vetted and published by journalists (he believed he wasn’t the right person to do so), which is why he handed it over completely to GG and didn’t retain anything for himself.
2. Names and places that might endanger personal or national security would be redacted; see 1.

I am sure this is why it’s not, à la Wikileaks, going to be in the public domain.

A reminder of Greenwald’s book ‘No Place To Hide’ for the full story of the Snowden journey, which also puts to rest the various controversies and disinformation in the MSM (he was a spy, he gave details to Russia, he risked lives, blah blah).
The second half of the book gets into the issues with mass surveillance, and while a scholarly read and no doubt useful, it was very effective at putting me to sleep. Unusual for Greenwald, I may add, as he is always so compelling.

JonKnowsNothing March 21, 2019 9:29 PM

We lost our innocence… or rather our naiveté.

For years before Snowden, people talked, wrote and asserted that the US was not only “watching others” they were also “watching us(a)” too.

For years before Snowden, those who knew what was going on were smeared, slandered, subjected to arrest, and legally harassed. Those who attempted to Speak Out paid the price for the rest of the world.

The rest of us stayed MUTE.

We said NOTHING.

We let others be lynched in the public domain and blacklisted, and did NOTHING.

Then Snowden showed us that what we had already heard was true. Those who sought to sidestep the details, who controlled the laws and courts, were as corrupt as all the other entities in the world that subscribe to the same “values”.

It showed the corruption and hypocrisy of High Tech and Big Business.

It showed the corruption of ALL THE PEOPLE WHO WORK in those fields.

It showed WHY we cannot have a secure internet. Not Now. Not Ever.

It revealed the shame of all of us who help build it and continue to be employed in the same fields.

What remains is only an illusion: A Dream Time.

Tatütata March 21, 2019 10:26 PM

This news confirms, IMO, the generally dim view of The Intercept held by John Young and Deborah Natsios of Cryptome. They have been arguing for a full dump.

1&1~=Umm March 21, 2019 11:55 PM

@Tatütata:

“These news confirm IMO the generally dim view held by John Young and Deborah Natsios of Cryptome on The Intercept.”

They are not the only ones. Let’s be honest: the Intercept has gone for the human factor in its stories, with little or nothing on the technical side.

I’m guessing most readers who come here are not looking for the human factor but the technical.

Whilst ‘human interest sells papers’, just like ‘my horrible surgical scars’ stories do, neither helps in solving the problem; only the technical information does that.

But there is no corporate lawyer in the world who will not go ‘waily waily waily’ if you try to publish hard technical details. Because ‘there be dragons’ of liability involved, without the explicit consent in writing of the ‘technology owner’, for anything not yet in the public domain.

As any engineer will tell you, 90% of fixing a technical problem is actually finding the fault. It’s why repairing technical issues is more often called ‘fault finding’ rather than ‘repairing’. That is, the assumption is that the actual repair comes as the final stage of fault finding, as it verifies the hypothesis.

The inverse of this is that you cannot ‘fix’ a broken system unless you are aware of how and why it broke. Further, you cannot ‘design out’ such a fault in future systems without being aware of how and why earlier systems broke.

Few admit it in our corporate world, but whilst imagination might be the mother of innovation, technical history is the father of the design that makes the innovation possible.

Without the technical information in the Ed Snowden trove, we effectively have ‘sensationalism’ as the reason to publish, which ages very, very quickly and rarely fixes anything unless there is already the political will to do so.

The technical information teaches, and knowledge is the result; this is essentially timeless from then onwards, as its legacy is avoiding the mistakes that gave rise to the technical information in the first place. In time such information becomes woven into the fabric of design, almost as lore but usually a lot more practical, even when underlying technologies change.

It was lack of knowledge that led to the CIA fielding a badly designed system for use by their agents. We know for a fact that this lack of knowledge has led to the deaths of tens of people and the disappearance of others. I cannot say for certain that the knowledge that would have prevented such a system from ever being fielded is in the Ed Snowden trove, but if you think about it, it is highly likely, as both are ‘signals intelligence’.

The mealy-mouthed cowardice that the Cryptome owners lay at the door of those who hold the Ed Snowden trove, who use the ‘don’t reveal methods and sources because people might get hurt’ excuse, has had the exact opposite effect: it has probably led to the deaths of those agents.

As gets pointed out about technology, it is ‘agnostic to use’ and ‘the use is decided by the directing mind’. If you deny agnostic information to people, then people cannot learn from it, and thus others will without doubt be hurt. One of the greatest lessons history teaches us is the harm to mankind caused by the withholding of information, because as an act it imprisons people in effective slavery and stops them becoming greater than their parents were.

In short, withholding knowledge is an attempt to stop evolution, and we know where that leads: we call it ‘extinction’. For some unfortunates that has already happened, and it raises questions that should be answered.

Rach El March 22, 2019 1:07 AM

1&1~=Umm
Tatütata

The mealy-mouthed cowardice that the Cryptome owners lay at the door of those who hold the Ed Snowden trove, who use the ‘don’t reveal methods and sources because people might get hurt’ excuse, has had the exact opposite effect: it has probably led to the deaths of those agents.

I wouldn’t be surprised if those people, Snowden and Greenwald included, are indeed regularly following the comments on this blog. They’d certainly be reading this particular post.

Hmm, I think it’s time for a Russian reversal 😉

Who? March 22, 2019 12:28 PM

We need the full archive [but redacted to protect the names of the people involved] to be available for download or, at least, available to true security experts.

Mr. Snowden made a big mistake sharing it with journalists who only want short-lived sensationalism (as said by 1&1~=Umm) and, what is even worse, to damage the reputation of the United States to increase the number of readers of their stories.

Journalists are, by definition, non-technical staff. They are not qualified to understand the true value of these documents. They have no knowledge. Instead of using the NSA archive to improve computer security by closing as many vulnerabilities as possible (improving network and computer security is part of the NSA mission), they chose to err on the “human side” of the documents, damaging the reputation of the United States and the NSA, turning its relations with allies into a nightmare, and publishing short-lived, uninteresting stories about how bad the IC is.

In my opinion, the NSA archive has been managed in an unprofessional way for six years.

Not sharing the right (i.e. technical) information with the members of the security community will make the world the victim of more WannaCry-style events.

Sz March 22, 2019 12:32 PM

Goes to show they are foreign spies not interested in open journalism.

The silver incentives and/or the lead disincentives were applied judiciously in this case.

RealFakeNews March 22, 2019 7:35 PM

@1&1~=Umm

I remember those discussions taking place at the time on this blog.

What I was trying to say was that there seemed to be a few things (beyond the ability to go back in time with new technology and apply it to old communications as if they just happened) that were in the archives regarding capability that were previously thought “too much” for even the NSA-types to have the time to do (such as searching for zero-day vulnerabilities in just about anything with an internet connection).

They didn’t just have a vast array of attack vectors for various pieces of equipment; they had very good attack vectors using attacks that appeared to require 20 years just to discover, and they had dozens of them years before anyone else even thought about them.

When the first Spectre/Meltdown vulnerabilities were announced in Intel processors, vulnerabilities that just absolutely destroyed any idea of security on the x86 platform, I seem to recall it being called a “new class of attack” that has almost no way to mitigate it short of all-new architecture.

Given that so much of the leak has so far not really been covered, we don’t know if such attacks were attempted by the likes of the NSA.

Given that the Spectre/Meltdown attacks are just over a year old, and these docs are at least 6 years old, is it a fair assumption, from what we do know, that the NSA is (or was) at least 10 years ahead of the best research we know of today?

1&1~=Umm March 23, 2019 6:00 PM

@RealFakeNews:

“I seem to recall it being called a “new class of attack” that has almost no way to mitigate it short of all-new architecture.”

Only it was not new: it was known that the IAx86 architecture was in deep doo-doo quite a while prior to that, and I don’t mean just by the designers, who should have known a decade or more back.

There have been warnings on this blog not to use IAx86 hardware since early this century, and from time to time it was pointed out that the complexity around the core CPU/ALU had gone over a tipping point. Researchers at various universities had shown that the address decoding had become so complex that it had in its own right become ‘Turing complete’, which meant it was in practice a hidden CPU sitting between the RAM and the intended core CPU/ALU. Thus it sat below it and could mount attacks similar to an unrestricted DMA unit, of the sort that has plagued the likes of Apple’s high-speed serial interconnects (FireWire etc.).

The thing people tend to forget about hardware engineers is that when it comes to exploits they know a lot but say very little outside of their community.

One reason for this is that at a certain point down the computing stack you cannot have security; that is, security is a function of complexity itself. You need a minimum of complexity to be able to implement any kind of security. So it’s well known to hardware engineers that all supposedly secure systems have foundations made of insecure components and subsystems. A further implication is that various security measures require different levels of complexity, so if you have insufficient or the wrong types of complexity then your system will by default be insecure in some way.

The question then arises: ‘Do we know all the ways that a system can be insecure?’ To which the answer is no, we don’t, and even if we did, the complexity required to cover them all would open up new pathways for new insecurities.

Look at it like this, you can securely guard a tree, you might be able to securely guard a very small wood, but a forest is a non starter security wise.

The IAx86 architecture went from being a wood to a forest around the early Pentium CPUs. The only reason you could make the earlier hardware secure is that it did not have the later complexity of the extra hidden CPUs, intentional or otherwise. Thus, by use of an RF-tight case with suitable filtering and dampening, you could keep things ‘in the box’, unless you decided to connect it up to some kind of communications channel, be it overt, covert, subliminal, or just unrecognizable to most as a potential comms channel.

If you search this site for @Nick P, @RobertT, @Clive Robinson, @Wael and TEMPEST or EmSec, you will see a whole set of rules you have to go through to design a system that could ‘keep your privacy in a box’. Later discussions indicated what you had to do to ‘put a box around the box’, in terms of a home-made sensitive compartmented information facility (SCIF). The thing being that the likes of WiFi can be used not just as static radar to see the operator’s hands and fingers; if placed inappropriately, it can also cause the likes of the serial data on keyboard cables to be cross-modulated onto the WiFi carrier. If you look on the Internet, some out there call the various issues ‘HI-JACK’ and ‘TEA-POT’ (with and without hyphens).

Likewise, as pointed out here and in other places, the old ‘air gap’ ideas are well and truly out of date; you have to improve your level of isolation immensely. I think it’s the Technion in Israel that keeps showing how simple it is to get information out of laptops, including ‘PubKey private key bits’, by acoustic means. But the simple rule of thumb is: ‘If the laws of physics say it’s possible, then somebody will do it sooner rather than later.’

I guess it won’t be long before the only cost effective way to get privacy is take a pick and shovel into the woods and dig a deep hole and sit in it…

But that’s the future; the NSA and GCHQ are all about the past. As long as they always have a slim lead, be it mere months not years, they can exploit that lead. We saw this with Bletchley Park during WW II. Past recovered plaintext is a statistical key to future plaintext recovery, allowing certain forms of attack that would not have been possible without the past recovered plaintext.
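The ‘past plaintext as a key to future plaintext’ idea can be illustrated with a deliberately weak toy sketch (a repeating-key XOR cipher, not any actual SigInt technique, and far simpler than the statistical attacks used at Bletchley): plaintext recovered from one old intercept directly yields key material that unlocks later traffic sent under the same key.

```python
def xor_repeating(data: bytes, key: bytes) -> bytes:
    """Toy repeating-key XOR 'cipher' (illustrative only, not secure)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# An eavesdropper intercepts two messages encrypted under the same key.
key = b"SECRET"
msg1 = b"ATTACK AT DAWN ON MONDAY"
msg2 = b"RETREAT AT DUSK TONIGHT!"
ct1 = xor_repeating(msg1, key)
ct2 = xor_repeating(msg2, key)

# Past recovery: suppose msg1's plaintext later becomes known (a 'crib').
# XORing it against its own ciphertext recovers the key stream directly...
recovered_key = bytes(c ^ p for c, p in zip(ct1, msg1))[: len(key)]

# ...which then decrypts FUTURE traffic sent under the same key.
assert xor_repeating(ct2, recovered_key) == msg2
```

Real traffic and real ciphers are of course nothing like this simple, but the principle the comment describes is the same: every message recovered in the past reduces the work factor for recovering the next one.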

But with regard to the NSA, GCHQ and other SigInt agencies being a decade ahead, there is actually no real reason why they should be. Hardware security is a very target-rich environment, and neither the SigInt agencies nor academia nor private research have the resources to cover a fraction of the potential areas.

Thus it’s more like turning over stones on a beach: you never know what you are going to find under them, but you know you are going to find something fairly quickly. The same applies to the people around them: they are all going to find something, quite a lot of it, fairly quickly, but it is probably not going to be what other people are finding.

1&1~=Umm March 23, 2019 6:09 PM

@Rach El:

“hmm I think it’s time for a russian reversal ;-)”

It might well be so, but it could also be time for what the UK Private Eye calls a ‘reverse ferret’, by a different ‘esteemed organ’.

robert March 29, 2019 4:41 AM

@1+1 wrote,

“Personally I think the whole thing should be put into the public domain so that thousands of eyes can look for what they can find. That way we still stand a chance of getting worthwhile and actionable information out of it, of which it would be reasonable to assume there is still a fair amount.”

I’m one of those who believe that everything has a price, and Greenwald is no different.

Old Saying March 29, 2019 5:34 AM

@ Robert,

I’m one of those who believe that everything has a price, and Greenwald is no different.

Some know the price of everything but the value of nothing.
