Schneier on Security
A blog covering security and security technology.
May 18, 2010
History of NSA Computers
A recently declassified history through 1964.
Posted on May 18, 2010 at 1:16 PM
• 27 Comments
Fascinating! I was particularly impressed with the ABNER's memory system: acoustic waves in mercury. I've read about such things in old sci-fi books, but I never realized that systems like that were actually used.
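For anyone curious how a delay-line store actually behaves, here's a toy model (my own Python sketch, not anything from the document): the defining quirk is that bits exist only "in flight" through the medium, so the machine must continuously recirculate them, and reading a word means waiting for it to come around.

```python
from collections import deque

class DelayLineMemory:
    """Toy model of an acoustic delay-line store (e.g. mercury tubes).

    Bits exist only "in flight": each clock tick, the bit emerging at the
    receiving transducer must be re-amplified and fed back into the line,
    or it is lost. Reading or writing means waiting for the right bit to
    circulate past the tap.
    """

    def __init__(self, n_bits):
        self.line = deque([0] * n_bits)  # bits currently travelling in the medium

    def tick(self):
        """One clock tick: the oldest bit emerges and is recirculated."""
        bit = self.line.popleft()
        self.line.append(bit)            # regeneration amplifier feeds it back
        return bit

    def write_emerging_bit(self, bit):
        """Replace the bit that would emerge this tick (a write)."""
        self.line.popleft()
        self.line.append(bit)

    def read_word(self, start, length):
        """Read a word by observing bits as they circulate past the tap."""
        for _ in range(start):           # wait for the word to come around
            self.tick()
        return [self.tick() for _ in range(length)]
```

Since every read recirculates what it observes, the data survives repeated reads; a stuck regeneration loop, on the other hand, is exactly the kind of resonance hazard described in the smashed-tube anecdote below.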
Oh Cool! The NSA invented Al Gore.
Declassifying it only through 1964 is pretty stingy. The latest and greatest in that document is 46 years old. If IT development is about 3 times faster than everything else (i.e. a new technology generation every 3 years versus 10 years for other fields), this is like the U.S. Navy releasing information about propulsion designs up to 1872.
It would appear that concerns about information privacy needed to be addressed far earlier than we might suppose.
"One of the reasons for the great flexibility of the modern computer is its ability to modify its own instructions or the course of problem solution, depending upon intermediate results. A variation of this capability is the technique of causing the incoming data-stream characters themselves to form addresses for insertion in skeleton instructions later to be executed."
And thus was born the buffer-overflow attack.
Seriously, as a veteran of early '60s computing, self-modifying code was a key technique for conserving precious memory.
Yes, during the Vietnam war, there were several attempts by radical groups to destroy computers that were owned by the government. They distributed leaflets on how to best destroy a computer, if you had the opportunity.
@Mike Wyman: And thus was born the buffer-overflow attack.
Actually, most buffer-overflow attacks modify data, often return addresses. The new name for "self-modifying code" is "just-in-time compilation", and yes, it is still a powerful optimization tool, although the lack of standard APIs to allow execution of former data space can be annoying.
Note for pedants: the paper widely considered to have introduced the notions of stored programs and self-modifying code (the First Draft of a Report on the EDVAC) describes a machine which could not modify instructions willy-nilly. Storing to a word tagged as an instruction would alter only the address, not the whole instruction.
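To make the quoted "incoming data-stream characters form addresses" idea concrete: the safe modern descendant of patching an address into a skeleton instruction is table-driven dispatch, where the data byte indexes a jump table instead of being written into an instruction word. A rough, hypothetical Python sketch (my own, not from the document):

```python
# Incoming characters index a dispatch table, selecting the code that runs
# next -- the table lookup plays the role the patched "skeleton
# instruction" played on the early machines.

def handle_digit(c, out):
    out.append(("digit", int(c)))

def handle_alpha(c, out):
    out.append(("alpha", c.upper()))

def handle_other(c, out):
    out.append(("other", c))

# Build a 256-entry table: the character's code point *is* the "address".
DISPATCH = [handle_other] * 256
for i in range(ord("0"), ord("9") + 1):
    DISPATCH[i] = handle_digit
for i in list(range(ord("a"), ord("z") + 1)) + list(range(ord("A"), ord("Z") + 1)):
    DISPATCH[i] = handle_alpha

def process(stream):
    out = []
    for c in stream:
        DISPATCH[ord(c) & 0xFF](c, out)  # the data byte selects the code to execute
    return out
```

The difference from the 1950s trick is that the table bounds what the data can select, which is precisely what an unchecked patched address (or an overflowed return address) does not.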
@Grande Mocha: One of my college professors had spent several months programming a computer with mercury acoustic memory. He once had a bug which caused an infinite loop, and the repetitive loop created a vibration in the memory system which physically smashed the glass tubes, spilling the mercury onto the floor! He told me that the computer owners were not particularly upset. It was something that happened from time to time.
It stopped just before the CDC 6000 Series machines. Too bad.
OT but Bruce's new mug shot is better than the old one imho... Less, shall we say, 'professorial'...
I think I came across some more up-to-date versions of this in a set of documents I recently had released to me.
I will endeavor to examine them once I get off work; however, I have only managed to scan about 8 of 25 boxes so far.
Neat document. The mercury memory was new to me as well. I wouldn't overclock it. ;) Looking through the pictures, I think I saw "Joshua" in there. It was big and had lots of blinkety lights. I wish I had a room-sized computer that looked like something out of a '90s hacker movie. *conspicuous cough*
I wonder what happened in 1965 that they couldn't share? They didn't pick a simple rounded number of years (such as everything up to 30 years ago) so could it be that in 1965 they did something which they aren't ready to talk about yet?
Random question: Why are the black "page border" markings all at jaunty angles but the text is perfectly lined up?
Am I missing something when I scan documents?
"I wish I had a room-sized computer that looked like something out of a '90s hacker movie."
Naah... What I wish is that I had the room sized computer AND the Bond girl...
I wish the pictures were clearer.
@Section9_Bateau "...only managed to scan about 8 boxes of 25"
Any better resolution of the photos would be ideal.
@greensquirrel "..."page border"...?"
At first I thought it was either an artifact of photostatting or of removing the classification markings.
But the hole punches are obvious through the border. Now I think it may be both. Depends on the document's provenance. It's been copied at least once before the released copy.
At one time this had a higher classification, as witnessed by the "Regrading to FOUO" annotation. It could have been regraded from FOUO to unclassified in 1976, but then it makes no sense that there'd be a mandatory declassification review in 2009. But none of the other pages carry any classification or handling markings at all. In the usual case of downgrading, the document isn't reprinted or redistributed; a line is just drawn through the classification level. But here there appear to be no markings at all (usually, but not always, indicative of an unclassified publication). So I figure they ran this through a photostat with a cardboard cutout mask around the outside that prevented the markings from being reproduced. (Ellsberg did the same thing with the McNamara report during the '60s.)
At the same time, the photostat cover was left open during copying, and anything not visible in the scan field ended up as solid black space (the borders and binder holes). Scanners today are more intelligent at guessing where the pages end, so we don't see it as much, but open a book, copy both pages, and you'll see the fade-out in the center.
I don't entirely like this hypothesis. The page numbers are too clean (they tend to line up with the classification markings, and I'd have expected at least a few of them to be missing), and there's no evident sign of redactions or changes entered. I guess they could have just masked the top and bottom.
@Ari Maniatis "...they didn't pick a simple rounded number of years (such as everything up to 30 years ago) so could it be that in 1965..."
Everything is a conspiracy? That's my problem with conspiracy thinking: it papers over the gaps in our knowledge without our having to go get more data or test hypotheses.
No. That's pretty much exactly how it is supposed to work. The law says there's supposed to be an automatic declassification at the end of 25 years unless the agency reviews the material and continues the classification. Classify by exception. That's why the tussle to have Reagan-era material released on schedule was so contentious. 'Cause hey! Nobody cares what happened decades ago, right? It's only of interest to historians. It's not like it would hurt or embarrass some former high Reagan administration official who might be running for office or nuthin'.
Thanks very much for the pointer.
I'm particularly interested in HARVEST, and its programming language ALPHA, which I've read elsewhere was specialized for cryptanalysis.
My guess, not yet verified, is that one use of HARVEST was on VENONA. HARVEST was returned to IBM in 1980, and VENONA ceased in 1980.
I've been looking and looking, but I can't see anything that is not already effectively in the public domain (one way or another)...
Mind you I must be getting old I've actually played with some of the stuff mentioned...
As for the mercury delay-line memory, the Science Museum in London had one on display at least 40 years ago. I remember my dad explaining how it worked (and he was an accountant by trade).
Speaking of not being as young as we once were,
Bruce, that's a nice bit of "badger" you're getting in your beard in your "new photo". I've been trying to cultivate some in my beard, but all I've got so far is a bit of "dazzle".
@ BF Skinner at May 19, 2010 6:36 AM
Thanks. It's a better hypothesis than anything I had.
The reason I ask is that I recently had to photocopy quite a few pages from an old, bound report, and as you say there is a fade-out towards the bottom centre. However, this is matched by a slight distortion of the text lines; the text remains aligned with the page edge, not the edge of the scan screen. Somehow it looks like the opposite happened here: the shadow implies the paper wasn't flat on the scanning plate, but the text is square.
I am actually quite jealous because being able to do this would have saved quite a bit of editing time.
Off topic but... I like the new photo... very debonair.
There has actually been work done lately on automatically correcting for that sort of issue in scanning books. Some of the modern high-end scanners can detect and fix that in post-processing, and that's starting to work its way down into the consumer-grade scanners as well.
@Grande Mocha: likewise, although I read about it in a not-so-old (well, OK it's 11 years old now) science fiction book: http://en.wikipedia.org/wiki/Cryptonomicon
I thought it sounded like an interesting idea, but I had no idea anyone had actually done it.
Wasn't 1964 when the S/360 came out?
Just a 'maybe' thought.
Sadly, I was unable to find a more recent version of this in the collection I've scanned to date. However, I did come across one that might be interesting: a good copy of "History of Protection in Computer Systems".
My copy is in good condition, and I OCRed it to make it searchable. You can access it at http://iaarchive.fi/papers/... (the server has limited bandwidth and is under moderate load; the file is 9 MB, so save it rather than loading it in the browser).
I expect it will be several months before I get everything scanned, if I come up with it, I'll update of course.
What I found particularly interesting was the mention of what quite probably was the first computer with a hardware RNG: on PDF page 17, it describes the 1952 addition to ATLAS 1 of a "random jump instruction", which could be used to generate "streams of random characters".
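The way a "random jump instruction" yields random characters is worth spelling out: each execution of the instruction contributes one unpredictable branch decision, i.e. one random bit, and accumulating several such bits builds a character. A toy Python sketch (my own illustration; `secrets.randbits` stands in for the physical noise source, which is an assumption, not the 1952 design):

```python
import secrets

def random_jump():
    """Stand-in for a hardware 'random jump': the branch is taken or not
    at random. In the real machine the decision came from a physical
    noise source; secrets.randbits(1) substitutes for that here."""
    return secrets.randbits(1) == 1

def random_character(alphabet="ABCDEFGHIJKLMNOPQRSTUVWXYZ"):
    """Accumulate one random bit per 'jump' until a character is formed."""
    bits = 0
    for _ in range(8):
        bits = (bits << 1) | (1 if random_jump() else 0)
    return alphabet[bits % len(alphabet)]

def random_stream(n):
    """Generate a stream of n random characters, one bit-accumulation each."""
    return "".join(random_character() for _ in range(n))
```

(Note the `% len(alphabet)` step introduces a slight modulo bias for a 26-letter alphabet; a real generator would reject out-of-range values, but this keeps the sketch short.)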
Schneier.com is a personal website. Opinions expressed are not necessarily those of BT.