Comments

Boyd November 4, 2005 9:11 AM

I’m a retired Navy cryptologist, and I’m on an email reflector for folks like me. This thesis came up for discussion a few weeks ago, and another member of the community had the following to say about it:

“I don’t know about the rest of you, but I didn’t get more than 12 pages into it before deciding that she was WAY off base and didn’t really understand either the systems she was discussing or the nature of the problem she purported to examine for a Master’s thesis. If I had been on that committee, I’d have sent her back to the drawing board. It was technically inaccurate, and not really that good an example of research. I’d like to think that Naval Postgrad School would never have let something like that see the light of day without a pretty substantial re-work.

“Some of the conclusions about key distribution might have been valid, but you have to look at the bulk of the work, not just a portion of it.”

Jacob L E Blain Christen November 4, 2005 10:01 AM

“Some of the conclusions about key distribution might have been valid, but you have to look at the bulk of the work, not just a portion of it.”

Sure, that's valid criticism for an academic thesis, but in the field of security all you need is one weak point to penetrate the system. Dismissing the thesis out of hand because of academic invalidity would seem silly given the subject matter.

RSaunders November 4, 2005 10:01 AM

It’s difficult to challenge the technical content of an unclassified report on classified crypto technology. Perhaps we should wait for a good report on the technical aspects after the systems are declassified, if we live that long.

I was much more interested in her analysis of the human element as the weak link in the FBS. The folks who read this blog are thinking "Duhh", but the notion that this is master's-thesis news persists outside our community. It is clearly news in the military history community, and that's an interesting insight to me.

Roy Owens November 4, 2005 10:47 AM

The abstract gave me the idea that the paper examined what was supposed to be layered security and found it to be fragmented security.

Work like this needs to be done. Otherwise lessons are never learned.

I have, as a civilian in street clothes and without showing identification, (1) entered Fort Irwin NTC through the main gate, walked into the office of the general, and got his help in finding someone; and (2) entered the USMC Logistic Support Base through the main gate and walked through several of the doors marked 'Absolutely No Admittance — etc.'. Armed guards just waved me through.

Why was I let in, through, and out again? Because I looked, and acted, like I belonged there. I was embarrassed by the nonperformance of their ‘security’.

Too often what is made to look like elaborate security — seeming so for all the tedious steps required and voluminous records kept — is counterfeit security, going through the motions and fooling most people.

GM November 4, 2005 10:50 AM

I read the whole thing, and I would really be interested in hearing exactly why the thesis was “technically inaccurate” and why “she was WAY off base and didn’t really understand either the systems she was discussing or the nature of the problem.” Especially since the maker of these claims only got to page 12 before deciding this.

Of course, I don’t have the requisite military background, but it looks to me like she nailed the organizational problems pretty well; they’re universal.

A Prohias November 4, 2005 11:35 AM

From the thesis:

Many people have asked the question, "Why did John Walker spy for the Soviets?" The answer is both amply documented and utterly simple: he was greedy. He wanted money, and he did not care whom he had to hurt to get it.

And:

Despite the enormity of the compromises, however, the spy ring was caught only because John Walker’s ex-wife turned him in to the FBI in a fit of drunken spite over unpaid hush money.

Greed for $$ and insensitivity to humans work in both directions. It is because of this bi-directionality that the breaches in security are ultimately (un/dis)covered 🙂

Davi Ottenheimer November 4, 2005 12:01 PM

@ GM

Good request. Some clarification would help reveal something that is supposed to be so obvious.

“she was WAY off base”

Perhaps this is meant to mean she was looking at things from an outsider perspective (pun intended). My sense is that because she approached the problem from a universal view, some within the ranks might say she is still just addressing the surface-level symptoms and never achieved a true insider’s understanding of the full scope of the problems/controls.

Boyd November 4, 2005 7:27 PM

I can’t elaborate on my colleague’s points, unfortunately, although I do believe he was more heavily involved in cryptographic theory and implementation than I was (I was not much more than an end user, maybe you could call me an “equipment operator” when it came to cryptography).

My theory (okay, it's a wild-assed guess) is that Major Heath's description of fleet communications, specifically those using KW-7s, was based on research, which differed from how my colleague Bill experienced it.

Another possible factor is that, as I mentioned earlier, we were in the cryptologic field, where most of what we transmitted through KW-7s was called "Sensitive Compartmented Information," whereas Major Heath was addressing the fleet broadcast; while they're similar, they were entirely different beasts.

I should also point out that Bill didn’t say he didn’t read past page 12; he said he drew that conclusion by that point. I suspect he read the whole thing. He was a contemporary of John Walker (as was I, although my service overlapped Walker’s only at the end), and he was also a CWO, so the story was near and dear to him. I doubt he could have put it down.

Boyd November 4, 2005 7:32 PM

Oh, and I was one who had to deal with the Walker aftermath. After his apprehension, cryptographic materials required two-person control (God, I still hate the acronym TPC), which became a scheduling nightmare. Up and down the line, everyone was divided into two groups: one had the combination to the outer lock, the other had the combination to the inner lock (by that point, most safes used for cryptographic materials were really a safe within a safe), so you had to be sure you had one person from each group available at all times.

That was a royal pain in the ass.
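
As a rough illustration of the rule Boyd describes (purely hypothetical names and groups, not any actual Navy roster or procedure), two-person control amounts to a check that at least one holder of each combination is present before the container can be opened:

```python
# Rough illustration only: two-person control (TPC) modeled as a simple check.
# The names and group assignments below are made up.

OUTER_COMBO_HOLDERS = {"RM1 Adams", "RM2 Baker"}   # know the outer-lock combination
INNER_COMBO_HOLDERS = {"CWO Clark", "RMC Davis"}   # know the inner-lock combination

def may_open_container(people_present: set) -> bool:
    """The safe-within-a-safe opens only if both combinations can be dialed in."""
    has_outer = bool(people_present & OUTER_COMBO_HOLDERS)
    has_inner = bool(people_present & INNER_COMBO_HOLDERS)
    return has_outer and has_inner

# No single person can ever open it -- the point of TPC, and also the
# scheduling headache: someone from each group must be on hand at all times.
assert not may_open_container({"RM1 Adams"})
assert may_open_container({"RM1 Adams", "RMC Davis"})
```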

Ol' Dirty Reader November 4, 2005 11:02 PM

I simply can't see how the logic of "the author didn't know the details of how we were using the equipment, so the thesis is not worth reading" makes much sense.

The spy had access to a photocopier and a bunch of keys. The bad guys already had the equipment; all they needed were keys, and the keys gave them all they needed to know. Once the channels were compromised, they were even able to get the change orders on the crypto equipment, so they could update their own equipment to stay current (with a minor exception, noted in the thesis).

The breach of all that info was the last of a series of failures: the spy shouldn't have been in the Navy, shouldn't have been handling sensitive data, and should have been caught in '67, when he forged the document to keep handling classified data.

The author covered that stuff quite well — even if the equipment was used a bit differently than the author described, the failure mode wouldn’t go away.

I hope the author options the thesis to Hollywood and we get a good movie out of it; it is a great story!

Milan Ilnyckyj November 5, 2005 8:31 AM

What I found most interesting about the thesis – and it was well worth reading a hundred double-spaced pages for – was the way the NSA was viewed and treated by the Navy.

It is a demonstration of an area where increased secrecy actually sharply diminished security, because key information about vulnerabilities and expectations was not passed on. A similar dynamic was apparently in place with regard to the people doing security screening: they were being trusted while the basic character of what they were doing was not communicated or understood.

In the present day, it seems as though an increasing number of agencies are employing increased secrecy as a security tactic. That makes it seem likely that similar failures and oversights will crop up.

Erasmus November 8, 2005 3:16 AM

@ Boyd
I suggest you track down an old copy of Dixon's 'On the Psychology of Military Incompetence', which describes why the failures described in the first part of the thesis are to be expected in a close-knit group.
It also notes how difficult it can be for these otherwise very competent groups to learn from failures.

As the piece clearly points out, everyone thought they were behaving 'properly'; there are big social disincentives against betraying a colleague – even if he is a spy!

It acts as a salutary lesson for anyone building a secure system. Doubtless there were very clever people working on the technical components and processes, but no one properly working on the 'system'. When a system fails because, say, 'embarrassing' social factors haven't been addressed, cognitive dissonance kicks in and we cope by finding a displacement activity. So it's not unusual to nit-pick around a peripheral area rather than admit that we all loyally trusted each other, exactly as we're supposed to do.

Roger November 13, 2005 5:13 AM

I’ve also read the full paper now. Very interesting, and instructive in some ways.

I have to say that I agree with the sentiment that Boyd’s friend was, at least, premature in his assessment. Pages 1 to 12 — note that these are double-spaced — give little more than an overview of the structure of the paper, a ~very~ high level summary of what encryptors do, and a high level overview of the FBS in the relevant period. I am in no position to judge the accuracy of statements on the FBS but they are referenced to a USN official history so if they are wrong MAJ Heath is not to blame. Possibly Boyd’s friend took exception to the summary of encryptors (which is indeed rather vague and wishy washy), but it turns out that that is of very little importance: the only technical aspect of the KW-7 which is relevant to the analysis is the physical format of the key cards.

Charlie December 14, 2005 11:28 AM

I just stumbled across this thread and read the first 25 pages of the Major’s paper. She may have adequately identified the weaknesses in the CMS, but she missed the technical information regarding both the KW-7’s uses and the FBS by a mile. I was a Radioman from 1965 to 1986, working in Navy Communications Stations (NAVCOMMSTAs) around the world (Philippines, Guam, Spain, Italy). Specifically, I worked in Facilities Control (the correct Navy designator), also frequently called Tech Control. We maintained the circuits the Fleet Broadcast rode and operated the crypto equipment used to cover the FBS. Too bad she didn’t talk to an old controller.

New Lambton South Public School June 26, 2006 10:14 PM

Dear Mr. Schneier,
At school each person in my class has been assigned a different job to study and I have been assigned the occupation of cryptologist. At first I didn't have a clue what a cryptologist did until our teacher read out from the dictionary that it meant: the science of maintaining the security of communication in and of extracting information from codes. When she read this out the class was amazed and I thought it was a very suitable job for I love codes and ciphers.
So I would like to ask you a few questions.
Firstly what type of subjects did you have to study in high school and university to be able to have this job?
How did you get recruited as a cryptologist?
And what was the hardest code you have had to decipher?
Please reply soon.
Sincerely,
Jackson,
student from year 5.

byteboy September 10, 2016 2:16 PM

Very late to this party, but I think what some are referring to is that the KW-7 was used to cover teletype circuits, not Fleet Broadcasts.

This is because the KW-7 can be operated in full duplex. By definition, the Fleet Broadcast is a one-way transmission.

There were several types of Fleet Broadcasts. Each used different keying material, but they were all covered by the KW-37 equipment, not the KW-7.

The KWT-37 equipment encoded the Fleet Broadcast from a shore station, and the shipboard KWR-37 decoded the traffic.

This is a one-way transmission – a broadcast. The traffic is addressed to individual commands or ships, or many different addressees. The recipients do not acknowledge receipt of the message.

I was a Navy Tech Controller for about 4 years (1965-1969) and operated KW-7, KW-37, KL-7, and KW-26 equipment. The KW-37 was very unstable. When the system lost sync, all sorts of lights started blinking and a loud, annoying alarm went off.

Since Walker also compromised the KW-37 keys, the author's error certainly shows a lack of understanding of Navy technology, but the thesis is still on point as to the security issues raised by the Walker episode.

I'm not 100% sure about this, but I don't believe the Army used the KW-37 equipment; I am aware, though, of widespread use of the KW-7 within the Army and other services.
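
To make the distinction concrete, here is a purely conceptual sketch (not a model of the real KW-37 or KW-7, and the cipher is a modern stand-in): a fleet broadcast is encrypted once ashore, any ship holding the broadcast keying material can read it, and nothing is sent back.

```python
# Conceptual sketch of a one-way fleet broadcast: every subscriber holds the
# same keying material and no receipt is acknowledged. Not a model of the
# actual KW-37 hardware; AES-GCM is only a modern illustrative stand-in.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

broadcast_key = AESGCM.generate_key(bit_length=256)  # held ashore and by every subscriber

def shore_transmit(plaintext: bytes) -> bytes:
    """Shore station encrypts once; there is no per-ship keying at the crypto layer."""
    nonce = os.urandom(12)
    return nonce + AESGCM(broadcast_key).encrypt(nonce, plaintext, None)

def ship_receive(frame: bytes) -> bytes:
    """Any ship holding the broadcast key can read the traffic; none replies."""
    return AESGCM(broadcast_key).decrypt(frame[:12], frame[12:], None)

frame = shore_transmit(b"NAVCOMMSTA to all ships: routine traffic")
# Every subscriber decrypts the same frame, which is why one compromised
# key list (as in the Walker case) exposes the entire broadcast.
assert ship_receive(frame) == b"NAVCOMMSTA to all ships: routine traffic"
```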

David Winters June 21, 2017 11:53 PM

If you want to know “the rest of the story” attend my presentation for the Symposium on Cryptographic History, Johns Hopkins University, October, 2017.

George J. Sliney August 19, 2017 3:20 PM

I quickly grew tired of reading the paper. When she made the technical error of saying the KW-7 was used for FBS, I wondered how deep her research really went for the rest of the content. Also, it is not nit-picking to expect a thesis at the U.S. Army Command and General Staff College for the degree of Master of Military Art and Science (Military History) to be subject to much more rigorous scrutiny.

I was a US Navy ET (Electronics Technician) who was trained to maintain the KWR-37, KG-14, KW-7, and KY-8 units. As another poster said, the FBS was one-way traffic sent from the NAVCOMMSTAs to the fleet. We used the KWR-37 and KG-14 systems for that process. The KW-7 unit was used for two-way teletype communications, ship to ship and/or ship to shore.

Anyone who wants an excellent technical discussion of FBS and other Fleet communications, please go to: http://www.virhistory.com/navy/traffic.htm

Rick March 18, 2018 9:12 AM

As Byteboy and others have mentioned, the KW-7 was NOT utilized on the Fleet Broadcast System. That was, rather, the KWR-37. Here’s a pic of one of those beasts: http://www.jproc.ca/crypto/kwr37_frontview.jpg

Onboard my ship – a taskforce flagship – we used the KW-7 for ship-to-shore termination, a two-way real time encrypted system. We also used it for Task Group Operations circuits, whereby the flagship would take in message traffic and relay it to the naval communications station the ship was under the direction of (depending upon geographical position at the time). Sometimes we used UHF transceivers and encryption systems for TGO.

The KWR-37s fed a bank of teletype receiver units (no keyboards).

Bummed out when I found that all my work as a Radioman was for naught, given that the key lists were in the hands of the Russians, thanks to Walker.

David Winters November 26, 2018 2:21 PM

The means used to smash “Walker” style threats are explained in the websites below.

We did it by throwing out the old crypto key production, distribution, and handling paradigm.

This was sort of a one-man rogue caper. The story is told in the video below. It also addresses the key shortcoming in the academic thesis discussed above.
David Winters

(YOUTUBE. NASA, OTAR)
https://m.youtube.com/watch?v=PNfumtMVtlk

https://cryptologicfoundation.org/file_download/f539ed6b-cedd-4d0a-bd64-1c858fe4bbee

https://en.m.wikipedia.org/wiki/Over-the-air_rekeying
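
For readers unfamiliar with the term, here is a minimal, purely illustrative sketch of the over-the-air rekeying idea behind the links above (not the actual NSA or military protocol, and AES-GCM is just a modern stand-in): the new traffic key is wrapped, that is, encrypted and authenticated, under a key the far end already holds and sent over the existing channel, so printed key lists no longer have to be produced, couriered, and stored where a Walker could copy them.

```python
# Minimal, illustrative sketch of over-the-air rekeying (OTAR).
# Not the actual NSA/military protocol: real OTAR involves key hierarchies,
# counters, and authenticated key-management messages.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def wrap_new_key(key_encryption_key: bytes, new_traffic_key: bytes) -> bytes:
    """Encrypt-and-authenticate the new key under a key the far end already holds."""
    nonce = os.urandom(12)
    ct = AESGCM(key_encryption_key).encrypt(nonce, new_traffic_key, b"OTAR-rekey")
    return nonce + ct

def unwrap_new_key(key_encryption_key: bytes, message: bytes) -> bytes:
    """Recover the new traffic key; tampering or a wrong KEK raises an error."""
    nonce, ct = message[:12], message[12:]
    return AESGCM(key_encryption_key).decrypt(nonce, ct, b"OTAR-rekey")

# Both ends share a key-encryption key; new traffic keys travel over the air,
# so there are no printed key lists for an insider to photocopy.
kek = AESGCM.generate_key(bit_length=256)
new_key = AESGCM.generate_key(bit_length=256)
assert unwrap_new_key(kek, wrap_new_key(kek, new_key)) == new_key
```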
