The Third Edition of Ross Anderson’s Security Engineering

Ross Anderson’s fantastic textbook, Security Engineering, will have a third edition. The book won’t be published until December, but Ross has been making drafts of the chapters available online as he finishes them. Now that the book is completed, I expect the publisher to make him take the drafts off the Internet.

I personally find both the electronic and paper versions to be incredibly useful. Grab an electronic copy now while you still can.

Posted on September 10, 2020 at 6:26 AM • 31 Comments


kai September 10, 2020 7:02 AM

From Ross’ site: “Once the manuscript’s finished and goes to press, all except seven sample chapters will disappear for a commercial period of 42 months. I’m afraid the publishers insist on that. But thereafter the whole book will be free online forever.”

DF September 10, 2020 7:15 AM

Ok I’m sold and will buy the book when it comes out. The Little Britain reference in the Chapter 29 title did it for me.

DAVID RUDLING September 10, 2020 11:04 AM

The link Bruce has posted for Amazon is to the Second Edition. The link to pre-order the Third Edition is (albeit fractured) :-

ht tps://

Probably your best $70.00 Christmas book.

Apparently UK readers will officially have to wait until 26 January 2021 unless they order from the USA.

ht tps://

Particularly good value, however, as the Third Edition will be £10 cheaper than a new copy of the Second Edition. No, I don’t understand the workings of the publishing industry either.

Anonymous September 10, 2020 11:24 AM

By the way, does anyone know which cipher he uses to encrypt the quotes in the beginning of Chapter 5 (Cryptography)?

Curious September 10, 2020 1:05 PM

Nuclear command and control? Seems ambitious, or maybe I am getting the wrong idea from simply reading the chapter name.

Z.Lozinski September 10, 2020 1:37 PM


That chapter is indeed about the security protocols associated with nuclear weapons. Think about the permissive action links embedded in US weapons by Presidential order. I think Ross also covers some of the work Gus Simmons did at Sandia National Labs on the cryptography required to implement the monitoring for the Comprehensive Test Ban Treaty. (The USA is allowed to deploy a monitoring platform in the USSR next to a nuclear test site: how do you ensure it is not tampered with, and that it transmits back only the measurements allowed by the international treaty?)

MikeA September 10, 2020 1:49 PM

That “by Presidential order” scares me a bit. What is done by Presidential order can be undone by Presidential order, and there’s been a lot of that going on.
I can only hope the existing stockpile would be “grandfathered”.

David Rudling September 10, 2020 2:55 PM


“By the way, does anyone know which cipher he uses to encrypt the quotes in the
beginning of Chapter 5 (Cryptography)?”

Ummm, try reading chapter 5. The harder of the two is referenced towards the bottom of page 145.

“..the quote at the head of this chapter is a Playfair-encrypted message sent by the future President Jack Kennedy…”

Clive Robinson September 10, 2020 3:26 PM


By the way, does anyone know which cipher he uses to encrypt the quotes in the beginning of Chapter 5 (Cryptography)?

The first should be obvious, and can be worked out faster than you can type up the approximate answer.

The first quote is attributed to Julius Caesar. His most famous quote in English, which is apropos to cryptanalysis, is,

“I came I saw I conquered”

Which lacks the punch of the Latin three word equivalent,

“Veni vidi vici”

So if you make that as an assumption and look at the cipher text of,


It should be immediately obvious it is a simple substitution cipher.

That is,

cdeiv – plaintext
FGHMZ – ciphertext
33344 – offset

With the last line showing the position in the alphabet offset.

The implication of which is that it’s a simple three-position-offset “Caesar cipher” for the low part of the alphabet and a four-position offset for the rest of the alphabet.

But there is way too little ciphertext to be certain.

As is always done… the simple problem is done by the “instructor” and the hard one is “left as an exercise for the reader” 😉
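Clive’s offset check can be sketched in a few lines of Python (the function name and layout are mine; the letter pairs come from the table above):

```python
# Offset between assumed plaintext letters and ciphertext letters, mod 26.
# The pairs come from the table above (cdeiv -> FGHMZ).
def offsets(plain: str, cipher: str) -> list[int]:
    return [(ord(c.upper()) - ord(p.upper())) % 26
            for p, c in zip(plain, cipher)]

print(offsets("cdeiv", "FGHMZ"))  # [3, 3, 3, 4, 4]
```

The split between a shift of 3 for the low letters and 4 for the high ones is exactly the oddity discussed in the follow-up comments.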

Clive Robinson September 10, 2020 3:33 PM

@ MikeA,

What is done by Presidential order can be undone by Presidential order

That is true, but why bother undoing the order? Just come up with a way to do the PALs but make them ineffective…

Such as setting the combination all to zeros…

But remember,

“It’s a very secret secret, above top secret, and you have to be specially read in to know it”

Or just read some documentation =(

David Rudling September 10, 2020 4:51 PM

@Clive Robinson
Out of great respect for your contributions on this blog, I tremble to comment on your answer about the Caesar cipher, but it is simpler than you suggest. Bear in mind that the classical Latin alphabet had only 21 letters, viz:-

It is then seen to be a straight 3 letter shift.

David Rudling September 10, 2020 4:55 PM

@Clive Robinson
See, it serves me right. I can’t even type 23 instead of 21. (Memo: must proof read my posts)

GhostRider September 10, 2020 5:32 PM

Awesome! I can’t wait to be able to purchase the book. While reading chapter 25 regarding AI, I thought it would be very interesting to see how two separately created AI systems might combat each other, especially if each one is created in a separate country using completely separate data and tactics unique to that particular country. It makes me wonder if the AI from either side would demonstrate almost alien tactics in an attempt to outwit the other AI. Or, in the opposite simulation, I wonder how two copies of the same AI, competing against each other in, say, a dogfight while using the same plane, would fare. If two identical systems fight each other starting off head to head, why might one system win over the other? Interesting stuff.

Clive Robinson September 10, 2020 6:49 PM

@ David Rudling,

See, it serves me right. I can’t even type 23 instead of 21

Yes you are quite correct that the Latin alphabet was missing J, U, and W.

But using that would have been cheating because I could not prove it by what I had from the ciphertext or my assumed plaintext.

My point was to show only what I could prove, and that through this, something looked odd. Hence my stating what I had and what it said by implication from the data at hand, then stating,

“But there is way too little ciphertext to be certain.”

As you can see, I cannot prove which alphabet is being used, only that the offsets are different in different parts of the modern standard 26-letter alphabet.

The problem is that history shows, when it comes to cryptography, there were actually many base alphabets in use, some containing other symbols or numbers.

That is, not only was there the base-23 Latin alphabet as used in ancient Rome by Julius Caesar, there were several standard alphabets throughout history, many of which were based on 5×5 matrices and thus base 25. Some just replaced J with I (Playfair etc.), some replaced W with UU, and several replaced Q with various other low-frequency characters like X or Z, on the assumption that context would tell the decryptor what was substituted. Then there were the base-28 alphabets of the Nihilist and Russian ciphers, including VIC, and so the list goes on.

From what little I had it could have been any of them or none of them and I would need more traffic to work out which.

You and I both “assume” from context that Ross probably used the original Julius Caesar cipher as described by Gaius Suetonius Tranquillus in his “Life of Julius Caesar” as,

“If anyone wishes to decipher these, and get at their meaning, he must substitute the fourth letter of the alphabet, namely D, for A, and so with the others.”

Thus the oddity at I (due to a missing J) and at V (due to a missing W) was most likely down to the use of the Latin alphabet Julius himself would have used, which is historically correct.
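That reading can be checked in a few lines of Python: a straight shift of 3 over the 23-letter Latin alphabet reproduces exactly the mixed 3/4 offsets seen in the modern 26-letter alphabet (a sketch; the alphabet string and function name are mine):

```python
# Caesar shift over the classical 23-letter Latin alphabet (no J, U, W),
# per Suetonius: "substitute the fourth letter ... namely D, for A".
LATIN = "ABCDEFGHIKLMNOPQRSTVXYZ"  # 23 letters

def caesar_latin(text: str, shift: int = 3) -> str:
    # Letters outside the Latin alphabet (and spaces) are dropped.
    return "".join(LATIN[(LATIN.index(ch) + shift) % len(LATIN)]
                   for ch in text.upper() if ch in LATIN)

print(caesar_latin("veni vidi vici"))  # ZHQMZMGMZMFM
```

Note that C→F, D→G, E→H look like a shift of 3 in the modern alphabet, while I→M and V→Z look like a shift of 4, matching the offset table earlier in the thread.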

But “assuming” and “proving” are a country mile apart 😉

The problem with “assuming” is it is something some exam question writers just love to use to mark you down 🙁

I got shot down once by an exam question about someone falling out of an aircraft. They were after a “terminal velocity” calculation, and I indicated there were two terminal velocities, one following the other, with the latter being when “the body was at rest with zero velocity relative to the ground”.

Ismar September 10, 2020 7:57 PM

A great book, provided for free. As I have had a lot of free time in the last couple of months, I have read all of it, and I have only one objection: when asking for feedback, Ross should respond when people submit valid error reports, which in my case did not happen.

MarkH September 10, 2020 8:16 PM


The security aspects of systems for the command and control of cataclysmic weapons form a worthwhile area of study for those interested in security engineering.

In some ways, the security challenges are fairly typical. The regulation of access (in its most general meaning) has positive and negative aspects.

When security geeks think about the design of a bank vault, most of us probably focus on the negative access controls, intended to prevent unauthorized persons from getting into the vault.

But of course, a bank vault is useless (or at least very unsatisfactory) unless the positive access mechanisms are highly dependable, and not very costly to use.

What makes nuclear command and control so interesting is that it combines these standard requirements with a very distinctive set of challenges [1]:

• positive access should operate across great distances with fairly high reliability, within a time frame not to exceed 300 seconds or so

• negative controls must be as nearly perfect as possible [2], because the risk is all-consuming (up to the obliteration of the state to which the weapons belong, and perhaps the civilization in which that state and its allies partook)

• because the costs of negative control failure are so extreme, extraordinarily large investment has been made in security design, including large financial resources and the involvement of an abundance of exceptionally capable and motivated people

• operational cost efficiency is largely subordinated to functional and safety requirements, so certain resources can be used with an extravagance beyond the scope of what can be contemplated in most security schemes

The combinations of extremes offer a glimpse of what is possible when security design is taken very far.

[1] I presuppose here the post-60’s U.S. system with hair-trigger ICBMs. There is surely much variability among the nuclear weapons states, and over time.

[2] Positive and negative access controls stand in natural tension to one another. The U.S. policy is to give priority to negative controls, because the cost of failure is so great, and because (in the standard deterrence framework) it isn’t strictly necessary that “the bomb” can be used reliably, but rather that any adversaries will estimate the probability of successful use as sufficiently great to ensure deterrence.

Clive Robinson September 11, 2020 4:20 AM

@ MarkH, Curious,

In some ways, the security challenges are fairly typical. The regulation of access (in its most general meaning) has positive and negative aspects.

In Communications Centers (ComCens) safes are used for storing Keying Material (KeyMat) which even these days can still be on “punched paper tape”.

For obvious reasons these safes have to be quite secure and be resistant to most types of attack.

However, unlike commercial safes used to store valuables of high worth, these safes do not have “re-lockers”, which can add a lot of security to such commercial safes.

A re-locker is a fairly simple device in concept; it is in effect part of the bolt-withdrawal mechanism, and can be made of glass or ceramic or a similar material that easily shatters. So if someone tries to “blow the safe”, the re-lockers shatter and the locking bolts cannot be withdrawn by the locking mechanism. This adds considerable delay to the opening of the safe, giving more time for a response by guard labour[1].

So why do ComCen safes not have re-lockers? Many assume, incorrectly, that it’s in case the re-lockers get broken by rough handling in transport during a “bug-out”, or that if the ComCen came under bombardment from infantry mortars or artillery shells, they could not get to the KeyMat[2] to re-establish secure communications.

The real reason is actually so that the safes can be opened for audit at any time. The problem with secure communications is not losing it; there are procedures for that. The real danger is continuing to use “compromised KeyMat”, thus making what you think are secure communications completely insecure to your enemy. This is a far more dangerous situation, so the ability to audit KeyMat is given a higher priority than protecting it from theft. After all, stolen KeyMat that never gets used has no intrinsic value to an enemy.

Thus, unlike in the commercial world, the value is not in the items being secured but in their use, so re-lockers add hindrance, not security, when it comes to KeyMat.

It’s an important piece of knowledge for those designing security systems to understand, and why there is one heck of a difference between physical security and information security systems design.

It’s a specific case of the more general problem that physical security is a proper subset of information security, and as such physical security has what some consider axioms that are, in fact, little more than dangerous assumptions in information security.

[1] Most physical security is not about stopping attackers but deterring or delaying them. That is, the sensible attacker realises that the time taken to get into the safe will be very long, and that fast attack methods will cause an alarm to be raised and bring the authorities down on them via guard labour. Less sensible attackers will thus get caught and punished. That said, the golden rule is that there are always smart attackers who will know or find a way to bypass all physical deterrents, so no physical security is 100%.

[2] There are procedures, developed back in WWI and earlier, for the loss of KeyMat and what to do should it happen. As a Signals Sgt or detachment commander you were required to memorize them, and most operators of all ranks who had contact with codes and ciphers as part of their duties knew them as well. In fact, one memorable code book had the emergency cipher words to use to report its loss printed clearly on its cover. The words being,

“Damn Damn Damn”

itgrrl September 11, 2020 4:33 AM

FWIW the page and linked PDFs are archived in the Wayback Machine, so barring a DMCA take-down notice it’s likely to remain available for those looking hard enough…

me September 11, 2020 5:54 AM

Anyone read it (or an older version)?
What should I expect from it?
It looks both interesting and boring… I’d like to read something about a general approach/mentality to security, not limited only to computers.
I don’t want to read a book that’s too technical, like “please use TLS 1.3 with a perfect forward secrecy cipher”.

For example, thieves came into our house a few days ago while we were watching TV (it ended well: nothing stolen, no one harmed, they ran away), but it would be nice to have a book that helps you think: where did you fail? How can you improve? Would adding a gate help? Or is that useless, and would an automatic doorbell that rings if someone opens the door be better?

I’d like something about threat modelling and how to work out the best way to solve those issues.

me September 11, 2020 6:04 AM

Another example: I repaired a treadmill because it threw an error on power-up.
I replaced the board with a part taken from an old induction cook plate, but the original version prevented turning it on if the MOSFET was broken; mine did not.
I was using PWM at 1-10%, because 10% is already running fast. If mine breaks, it goes to 100%, which is quite a lot.
This is security too: the original version prevented a broken component from making the thing go at 100% speed.
Also, ovens, irons, and other things that get hot have a thermal fuse or bimetallic switch that turns off the heating element if it overheats, to prevent fire.
You can’t notice this security stuff if you don’t open up electronics.
Security is not only “you are authorized to enter or not”.
Is stuff like this discussed in the book too?

(Sorry for the double comment, and thanks a lot to Schneier for keeping the comments free from logins.)

Clive Robinson September 11, 2020 7:16 AM

@ me,

Anyone read it (or an older version)?
What should I expect from it?
It looks both interesting and boring…

I have both the first and second editions and have read them both.

The book is unusual because it is one of very few books that cover an area very deficient in available, concise, and readable information.

It really covers what you need to be aware of as a “Security Engineer” or “Security Systems designer”, and provides both a concise collection of diverse information and a bridge from security theory to practical implementation.

If you are thinking about becoming a security engineer or designer, or in fact any kind of professional security practitioner such as a pen-tester, you really should read it and use it as a springboard into the more in-depth and almost certainly harder-to-find information that will be relevant to any kind of security-related design and implementation.

Our host @Bruce wrote a similar book, originally in collaboration with Niels Ferguson, which was more cryptography- and software-implementation focused.

It went into its second edition in 2010 with a third writer, Tadayoshi Kohno, and is in my view overdue a third edition to update it on what has happened in the past decade (@Bruce, is that a heavy enough hint?)

I would recommend both books as ones that should be “read first”, before nearly all other security books, even if you are only vaguely thinking about getting into security. Because unlike most other books, they will teach you about new ways of thinking, of much wider scope than physical and information security and their associated EmSec/TEMPEST and other fields of endeavour. In short, they should change your perspective on the world and thus not only give you a new “life skill” but also enable you to think in broader terms about your own personal security and that of your loved ones.

ATN September 11, 2020 8:34 AM

Missing some words around:
24 Copyright and DRM
24.2.2 Free software, free culture?
about how GPL software was copied by big companies (for instance, mobile-phone producers) without respecting the GPL terms (“try to sue us… have you seen the size of our legal department?”),
and about how a big company used copyrighted software to protect its own software, without having the right to use that former copyrighted software.

me September 11, 2020 9:12 AM

@Clive Robinson
Thanks for the answer.

they will teach you about new ways of thinking… enable you to think in broader terms

This looks promising, and it’s exactly what I was searching for.
Because it can be applied literally anywhere. A dumb example: “maybe placing the ladder there to change the lamp is not a good idea”.
Another dumb example, which I apply at work: sometimes I have to open boxes containing fragile and costly instruments. I open them on the floor and not on a table, because if a box is on the floor its contents can’t fall to the floor and break; they are already there!

If you are thinking about becoming a security engineer

I like security (in general) and computer security, but I think I would not like it as a job.
I also think that computer security jobs are done “wrong”:
1- we write bad software and don’t care about security.
2- we hire someone to hack us, and if he fails it is surely 100% secure; if he succeeds, we will fix that.
What about coding it correctly in the beginning? Teaching coders instead of hiring pentesters? Whitebox analysis instead of blackbox?
To cite (from memory) Joanna Rutkowska: whitebox would be more efficient/better, but looks less like a Hollywood movie.

aksilu September 12, 2020 2:52 AM

I have to say it’s very noticeable in Chapter 15 how R. Anderson talks about nuclear proliferation focusing on the cases of Iraq, Libya, and North Korea, but talks about Israel, which itself proliferated nuclear weapons, only in the context of its bombing of the Iraqi nuclear reactor, attacks on Iran, etc. Bit of a double standard, to say the least.

Clive Robinson September 12, 2020 7:03 AM

@ aksilu,

Bit of a double standard, to say the least.

Perhaps not.

North Korea, Iran, and Libya all purchased technology from AQ Khan, and not only is there a very significant paper trail, there is also solid physical evidence, which would hold up to criminal-court standards, that they have the equipment and have used it.

In the case of North Korea, unless they are the world’s best fakers, they have nuclear devices and, more important, a more than viable intercontinental delivery system.

Iran has been peacefully making non-weapons-grade reactor fuel for quite a number of years despite US warmongering for political reasons that involve Israel, as the US houses have repeatedly shown, with various members of the current administration actively trying to provoke open hostilities and commit illegal acts of war against the civilians of a sovereign state. If the US did not have veto power at the UN, which it exercises to protect itself and Israel, both the US and Israel would have been subject to the harshest of sanctions.

Iraq is an interesting case: as far as we can tell, they have not gone into nuclear enrichment in any way that would produce either fuel- or weapons-grade material.

As for Israel, “where’s the evidence?” All we really have publicly is the word of an Israeli technician, Mordechai Vanunu, whom Israel abducted from Rome and put under special administrative measures in one of their more disreputable prison complexes.

That’s not to say Israel is not into nuclear enrichment to weapons-grade levels or the making of nuclear weapons. Publicly we don’t have the proof, and those who may have it, such as the US, Russia, and China, are not revealing it.

Therefore I suspect Ross J. Anderson, who is no infrequent visitor to the witness stand in court, decided to stick to what was provable to criminal-court standards.

I suspect I would do exactly the same in his position, especially as there are various “Christian-Jewish” splinter groups that, like very many ultra-right groups, take pleasure in trying to discredit people with innuendo, faux accusations, and worse, a lot worse, to “re-write history” for various reasons. The fact that strong links have been found between these splinter groups and various politicians in both the UK and US, along with strong support from various parts of the Israeli government, does not help…

When part of your work is as an expert witness fighting the behaviour of banks that consistently lie directly, deliberately mislead, and evade court scrutiny of their behaviour, you do not also want to give them reason to attack your integrity on spurious grounds.

william September 13, 2020 6:37 PM


All these books contain errors, including Security Engineering. Several of the definitions in it are cuckoo. Even the CISSP Official Study Manual states that RSA relies on the difficulty of “factoring prime numbers”. Don’t bother reporting errors; all these books contain them, and it doesn’t matter.

David Rudling September 14, 2020 3:02 AM

I disagree completely.
Error correction is vital not just in technology but in book publishing.
If there is an error of fact, rather than just a disagreement, it needs to be corrected.
To be corrected it needs to be reported to the author.
If you see a factual error please report it.
It matters.
