ISO Rejects NSA Encryption Algorithms

The ISO has decided not to approve two NSA-designed block encryption algorithms: Speck and Simon. It’s because the NSA is not trusted to put security ahead of surveillance:

A number of them voiced their distrust in emails to one another, seen by Reuters, and in written comments that are part of the process. The suspicions stem largely from internal NSA documents disclosed by Snowden that showed the agency had previously plotted to manipulate standards and promote technology it could penetrate. Budget documents, for example, sought funding to “insert vulnerabilities into commercial encryption systems.”

More than a dozen of the experts involved in the approval process for Simon and Speck feared that if the NSA was able to crack the encryption techniques, it would gain a “back door” into coded transmissions, according to the interviews and emails and other documents seen by Reuters.

“I don’t trust the designers,” Israeli delegate Orr Dunkelman, a computer science professor at the University of Haifa, told Reuters, citing Snowden’s papers. “There are quite a lot of people in NSA who think their job is to subvert standards. My job is to secure standards.”

I don’t trust the NSA, either.

Posted on September 21, 2017 at 5:50 AM • 79 Comments


keiner September 21, 2017 6:25 AM

…how long till the first: “Snowden killed our security, nobus trusts our good surveillance… NSA is good… terror… blablah” comment pops up?

Dan H September 21, 2017 6:42 AM


In July, 2013 you commented:
“The code is not relevant here; the question is whether a back door could be hidden in the mathematics of the cipher,… so maybe, but I don’t think so.”

Why would you now not trust the NSA’s new block encryption algorithm when previously you didn’t think it was possible for them to subvert it?

keiner September 21, 2017 6:46 AM

…cause he changed his opinion in the light of new facts? Intelligent people do this from time to time (stupid people every 5 minutes).

Allen September 21, 2017 6:58 AM

Technically, it has not been rejected yet. The last line of the article states that there is a final vote in February. With the compromise of removing the “lightweight” versions it might still pass. Let’s hope that it does not pass, but just because something is approved by the ISO as a standard does not mean they have to use it. There are still other encryption algorithms out there.

225 September 21, 2017 7:01 AM

@keiner I have a horse shoe theory on how often people change their views and opinions, with stupid people at both ends.

it looks like the ISO is not approving the lightweight versions of Speck but allowing the versions that use larger key sizes and more rounds: “The NSA has now agreed to drop all but the most powerful versions of the techniques”

Trebla September 21, 2017 7:36 AM

It seems hard to sneak in backdoors into open standards, but if somebody could do it, it would be the NSA.

Mee Etherr September 21, 2017 7:48 AM

NSA has worked extremely hard to earn our complete and total distrust and disrespect.

Also, they have set a worldwide example for other countries to follow: Mass distrust. It’s a very sad chapter.

I don’t think this nation can recover from it.

Dr. I. Needtob Athe September 21, 2017 7:48 AM

Yet another illustration that a hero of the people is an enemy of the state, forcing us to logically conclude that this government of the people, by the people, and for the people, did in fact perish from the earth.

Clive Robinson September 21, 2017 7:52 AM

@ Dan H,

Why would you now not trust the NSA’s new block encryption algorithm when previously you didn’t think it was possible for them to subvert it?

Because a lot has changed since July 2013, not just the Ed Snowden revelations. For instance, NIST dropping an NSA-designed random number generator, which confirmed the published suspicion of a researcher Bruce had worked with in the past; a family of routers getting backdoored by a similar method; and the fact that RSA management had taken a considerable backhander to, in effect, force the bad random number generator on people.

Perhaps circumstantial evidence at best, but when you are dealing with information security that might have to still be secure in a thousand years, you need to be cautious in the extreme.

But the NSA has a long track record of backdooring systems, as @Nick P and myself have pointed out for years, as you can see by looking back on this blog and others. In effect it started in WWII, before the NSA actually existed, with mechanical field ciphers that had both strong and weak keys[1] which were not obvious. There was the backdoor arrangement they had with the only –supposedly– independent manufacturer of crypto kit, in Zug, Switzerland. Through to how they rigged the AES contest to virtually guarantee that the implementations would be rife with time-based side channels. Which they in effect confirmed by approving AES for secret and above BUT only for “DATA AT REST”, which is a covert way of saying “DO NOT USE ON LINE”. Then there was the release of the “secret cipher” originally intended for use in the Key Escrow program that gave us Crypto Wars I. It was deliberately designed to only just meet the security margin of what was known in the open community[2], and it was extremely brittle, such that even tiny changes would make its security margin tumble way, way down.

I could go on at length, but whilst I’m doing that further information would build up and it would become a never ending task.

But it’s not just the NSA; other US MIL/IC agencies have done the same, the most visible example being Tor, which whilst it might be content-secure is not secure against traffic analysis, especially against a SigInt agency that has monitoring at all the major Internet choke points.

You can of course choose to ignore the mounting evidence, but now that the likes of the FBI are investing in getting to use those backdoors, it’s hard to see how anyone can not be suspicious…

[1] The idea is simple to understand when you realise that the KeyMat was issued by the same US agency. If the cipher machine fell into enemy hands it would almost certainly get duplicated by the enemy, or others allied to them, at some point, even if they were not at the same or better skill level as the US agency. If they did copy it, it’s fairly certain that they would then pick their KeyMat randomly. Which means some strong keys but also some weak keys. When they use weak keys the recovered plaintext can then be analysed to provide “cribs” that make attacking the stronger keys much easier. It also assists in other traffic-analysis techniques. However, because the US agency designed the system, they knew how to predict the strength of each key and could issue only strong keys for their own armed forces’ use. Thus gaining a significant advantage. This is in essence exactly what the NSA was set up to do.

[2] One of the things history has shown us with the NSA is that they have cryptanalytic methods which the open community has yet to find. This was demonstrated very well with DES and what is now known as differential cryptanalysis. Thus if an NSA cipher only just meets the security margin of open-community attacks, it’s odds-on the NSA has an attack that brings that security margin down significantly.

Mee Etherr September 21, 2017 7:59 AM

Trying to think like NSA:

Speck and Simon would use less computing power. Is that not good for those wishing to crack it, also?

Dropping the ‘lightweight’ versions could simply mean they can crack it all. Maybe the heavier versions take a little longer; maybe not on the fly.

Involving both hardware and software is very concerning. That might suggest covert, even overt, intervention with major device manufacturers.

BF Skinner September 21, 2017 8:17 AM

@Dan H
It’s not really a matter of trust, is it. Trust describes a point of failure. Closed source says ‘trust us’. You can’t secure it or prove it was secured, therefore you have to trust it won’t fail. (This is separate from trustworthy and trusted. Confusing, I know; we’re using the same word, natural language, for related concepts.)

What Bruce has said as long as I’ve been listening to him is that protocols and ciphers that are open to inspection and experimentation are more likely to be or become trustworthy. But with NSA it’s always been “trust us” and then they retreat into their shadows. They also have a history of compromising foreign cryptosystems. NSA is closed source by design.

What is concerning to me at the moment is a knock-on effect. USG is pushing NIST SP 800-171, Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations, as a minimum control set for companies doing business with them. 800-171 compliance is now built into the federal and defense federal acquisition rules. Unlike FISMA it covers both systems processing USG data (think managed services or FedRAMP cloud services) and information contingent to doing business WITH the USG (think email from a contracting officer with design specs). As is typical with gov’t security controls, a single control clusters a variety of issues and tech.

Buried in there is crypto.
3.13.8 Implement cryptographic mechanisms to prevent unauthorized disclosure of CUI during transmission unless otherwise protected by alternative physical safeguards.

and then this cutie
3.13.11 Employ FIPS-validated cryptography when used to protect the confidentiality of CUI.

That’s all the control statements say. And while many, most(?), private companies DO use AES, they don’t necessarily use it in FIPS mode. There are many tedious implementation, operation, and international-law reasons for this. Most auditors are not experienced enough to ask, or are careful enough that they don’t ask. I’ve never asked them; too risky. The LOE for taking a modern international corporate infrastructure to FIPS mode is … high.

If weak versions of Simon and Speck become a FIPS “requirement” NSA could succeed in forcing the defense contracting industry (and downstream subcontractors and suppliers; because FAR and DFARs are flow down requirements) into employing compromisable crypto.

milkshaken September 21, 2017 8:24 AM

“It seems hard to sneak in backdoors into open standards, but if somebody could do it, it would be the NSA.”

The lightweight version is more suspect. It does not need to contain an outright backdoor, just a subtle weakness (one that still requires massive computing power to exploit). For example, a very long ciphertext giving away non-obvious hints, so that you don’t have to brute-force the whole key space.

I think the suspect part is that it has been entirely a US Gov initiative; again they are pushing new standards that no one else was asking for.

Clive Robinson September 21, 2017 8:30 AM

@ Bruce,

The question is, do we really need SIMON and SPECK?

They were designed for a problem that really does not exist any longer.

Once upon a time we had 8- and 16-bit microcontrollers with extremely limited resources. These old processors got used because there were significant cost savings to be made, not just in production BOM but in lifetime power usage.

As many IoT devices show, 32-bit chips are not just almost the same price as old 8/16-bit parts, they have real advantages in terms of a lot more RAM, ROM and I/O, plus much higher clock speeds.

You can now buy a single-chip device for a little over 1 USD that has the power and capabilities of a high-end MicroVAX, machines that were at one point not just “leading edge” technology but also “bleeding edge” price-wise, costing more than the combined annual salaries of ten professional engineers.

Thus the need for such lightweight ciphers, secure or otherwise, is kind of gone… They are in effect irrelevancies and should be rapidly consigned to the “quaint history” drawer “in a locked filing cabinet in a disused toilet in an unlit basement with a missing staircase, behind a door with a sign saying ‘Beware of the Tiger’”[1].

[1] Thanks to Douglas Adams for the location guide 😉

cat September 21, 2017 8:55 AM

Shouldn’t a cipher/algorithm be judged on its merits alone, regardless who submitted it ?

If the ISO doesn’t feel competent to judge the cipher, they have no business deciding standards.

Michael Moser September 21, 2017 9:21 AM

Trust is a good thing. Now if security professionals can’t tell a good cipher from a backdoored one – what good are these professionals to begin with?

Just Sayin September 21, 2017 9:46 AM


Re: “Shouldn’t a cipher/algorithm be judged on its merits alone, regardless who submitted it?”

No, if the source previously submitted corrupted code counting on unverified and unverifiable trust.

@Michael Moser

Re: …”if security professionals can’t tell a good cipher from a backdoored one – what good are these professionals to begin with?”

NSA fools me once, shame on NSA.

NSA fools me twice, shame on me.

Ted Hale September 21, 2017 9:53 AM

@Michael Moser Most security professionals are NOT cryptanalysts. That is a VERY specialized skill. Sure I could read the code, but only an obvious backdoor would be noticed. I would likely have no idea if the crypto had weaknesses. And I certainly wouldn’t know how it would stand up to the advanced techniques that the NSA may employ since I have no knowledge of those capabilities.

BF Skinner September 21, 2017 10:56 AM

@Clive Robinson


As many IoT devices show, 32-bit chips are not just almost the same price as old 8/16-bit parts, they have real advantages in terms of a lot more RAM, ROM and I/O, plus much higher clock speeds.

I’m thinking legacy. Particularly in continent/global-scale distributed ICS’s. Owner/operators are unlikely to scrap old working silicon just to get new chips.

TimH September 21, 2017 11:19 AM

@Michael Moser: Honestly? The professional that you trust in ANY discipline is the one who states his/her limitations at the start. Not the one that tries anyway, and gets it wrong. Think Lasik surgery as an example.

I regard someone fulfilling a task as a professional if there is fairly immediate personal accountability for screwups. Thus engineers and house cleaners are professionals, while management roles may or may not be depending on the organisation. Ditto, sadly, police and politicians.

Nick P September 21, 2017 11:20 AM

@ Scott

SELinux is a MAC scheme for security. The list of them for Linux is here. That is just one component from the Compartmented Mode Workstations of old days, and insufficient for full security. For more info on the full thing, here’s a historical look, a DEFCON presentation, and a vendor who supports Linux. Due to the complexity of OS’s, highly-secure systems used microkernels plus virtualized instances of things like Linux instead of MAC. It can be a nice supplement, though.

TJ Willaims September 21, 2017 11:53 AM

A couple of things here:
– as much as we can distrust the NSA in the crypto area, there are no serious attacks on Simon and Speck so far (see list in
– the authors have clarified a few things on their design (
– implementations with short word sizes should not be used unless the information is short-lived
– unfortunately, not everybody is using 32-bit processors, and there are plenty of devices deployed every day with 8-bit processors (many of them sensors) or 16-bit processors (IoT and IIoT devices): the main driver is cost, with energy consumption a close second

Security through spycraft? September 21, 2017 12:03 PM

” It’s because the NSA is not trusted to put security ahead of surveillance ”

They’ve been weakening encryption protocols and backdooring stuff we all use for decades, and now they’ve had their bag of magic tricks stolen and sold on the black market.

Change the acronym to National Surveillance Agency.

“just store it all – we don’t have the manpower to analyze even 1% of it until after a major terrorist attack anyway” – translate that into Latin and put an eagle on it.

Trust, but verify September 21, 2017 12:14 PM

” there are no serious attack on Simon and Speck so far ”

TJ… TJ…. think just a hot moment with us here…

We’re talking about new protocols the NSA are proposing, right? Brand new stuff.

You are expecting there to be successful attacks documented in PUBLIC?
BEFORE they release it expecting widespread adoption?

That’s not how this works.

Attacks that would be successful would be VERY, VERY SECRET.
(*For their expected 10-15 year target window, anyway)

If the public had published working attacks on them how would anyone trust they were valid, adopting them and allowing the 5 eyes to see everything you all do in “private”?

Besides, the successful attacks themselves are hopefully computationally expensive in their own right, just not P ≠ NP expensive.

Nick P September 21, 2017 12:58 PM

Bigger reason is we don’t need their algorithms. What we need are more solid, OSS implementations of protocols and algorithms we have plus tools for automating as much analysis as possible. Despite their BS, I do thank the NSA’s defensive side (I.A.D.) for funding Galois’ work on Cryptol and other stuff that can help with this goal.

That’s open source, which is always a good default for security-related things, but especially true if the NSA funds it. 😉 Another thing they did for FOSS which had some benefit was SELinux, which was Secure Computing Corp and MITRE work. They also give a lot of contracts to companies such as Rockwell-Collins who make secure CPU’s, compilers, code generators, and so on. They do good stuff, but not enough FOSS for the public. Thanks to Snowden, we know the motivation for that. (sighs)

R September 21, 2017 1:43 PM

I understand the lack of trust after Dual_EC_DRBG. It’s also yucky that Simon/Speck are even defined for block and key sizes that everyone takes for granted are unsafe (64-bit key, 32-bit block), and if there’s any way that it would undermine deployed systems, it’d be by applying the imprimatur of standardization in 2017 to an algo allowing such bad params (even if, say, only stronger versions were actually in the standard).

I doubt that with strong parameters it’s actually possible to recover plaintext from ciphertext. You need a vulnerability that’s much more serious than most that are published, one that nevertheless survives the well-known types of attack, and it has to be present in a cipher that’s ARX or just a few Boolean operations. That’s just an awfully small hole to fit your backdoor through. Again, even defining it for laughably weak parameters (32-bit blocks? really??) is a concern, but a “backdoor” as typically imagined (some vuln that no one else can see) is not.
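To make the “just a few Boolean operations” point concrete, here is a minimal Python sketch of a single Speck round and its inverse, using the 16-bit word size and rotation amounts (α=7, β=2) from the designers’ Speck32 parameters. The variable names and sample values are my own, and this is one round only, not the full 22-round cipher with its key schedule:

```python
MASK16 = 0xFFFF  # 16-bit words, as in Speck32

def ror16(v, r):
    # Rotate a 16-bit value right by r bits
    return ((v >> r) | (v << (16 - r))) & MASK16

def rol16(v, r):
    # Rotate a 16-bit value left by r bits
    return ((v << r) | (v >> (16 - r))) & MASK16

def speck_round(x, y, k, alpha=7, beta=2):
    # One forward round: rotate, add mod 2^16, xor in the round key, mix
    x = ((ror16(x, alpha) + y) & MASK16) ^ k
    y = rol16(y, beta) ^ x
    return x, y

def speck_round_inv(x, y, k, alpha=7, beta=2):
    # Undo the forward round's operations in reverse order
    y = ror16(y ^ x, beta)
    x = rol16(((x ^ k) - y) & MASK16, alpha)
    return x, y

# Each round is trivially invertible given the round key (0x1234 here
# is an arbitrary illustrative value, not a real key):
x, y = speck_round(0x6574, 0x694C, 0x1234)
assert speck_round_inv(x, y, 0x1234) == (0x6574, 0x694C)
```

The point is how little there is to hide anything in: one modular addition, two rotations, two XORs per round.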

Besides parameters and general mistrust of the agency, there’s another reason not to put it in a standard: we don’t need it. We have hardware implementations of AES that are fast and can go in small places. We have ChaCha20, an ARX cipher that’s faster than Simon or Speck on big CPUs and has been studied more than Simon/Speck. Finally, most important, we have competitions for new primitives for standardization, which seems like a better system than taking them from a single organization, however well-resourced.
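For comparison, ChaCha’s core quarter round really is the same kind of object — pure add-rotate-xor over 32-bit words. A sketch, checked against the test vector published in RFC 8439, Section 2.1.1:

```python
M32 = 0xFFFFFFFF  # 32-bit word mask

def rotl32(v, r):
    # Rotate a 32-bit value left by r bits
    return ((v << r) | (v >> (32 - r))) & M32

def quarter_round(a, b, c, d):
    # The ChaCha quarter round: four add/xor/rotate triples
    a = (a + b) & M32; d = rotl32(d ^ a, 16)
    c = (c + d) & M32; b = rotl32(b ^ c, 12)
    a = (a + b) & M32; d = rotl32(d ^ a, 8)
    c = (c + d) & M32; b = rotl32(b ^ c, 7)
    return a, b, c, d

# Test vector from RFC 8439, Section 2.1.1:
print([hex(w) for w in quarter_round(0x11111111, 0x01020304,
                                     0x9b8d6f43, 0x01234567)])
# ['0xea2a92f4', '0xcb1cf8ce', '0x4581472e', '0x5881c4bb']
```

Having a public, independently checkable test vector like this is part of why open competitions inspire more confidence than single-agency submissions.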

Re: the distrust and the basis for it, the NSA’s balance between liking vulnerabilities (since they help gather intel) and disliking them (because they make civilian tech insecure) seems outdated and outright tragic right now, when it’s becoming obvious everything is vulnerable and we’re starting to see widespread, damaging, sophisticated attacks. I wish, without any hope to speak of, that they would recalibrate. That probably sounds impossible for such a large agency after such a long time. On the other hand, look at Microsoft in the past few years!

Clive Robinson September 21, 2017 2:09 PM

@ BF Skinner,

I’m thinking legacy. Particularly in continent/global-scale distributed ICS’s. Owner/operators are unlikely to scrap old working silicon just to get new chips.

There are two parts there, so the second part first: the “old working” operators will not only not upgrade working silicon, they will also not put new features like crypto in. Because you can not upgrade just one item in a master/slave arrangement, you have to upgrade both the master and the slave to keep them talking. And if you upgrade the master, the chances are trying to make it work with only one crypto-using client is going to be way too much trouble.

Legacy ICs are not cheap or particularly cost-effective, as it’s the packaging that robs the profit out of making them.

Look at the price of a 14-pin dual D flip-flop like a 7474, then at a Microchip 14-pin low-end PIC MCU, then the high-end 32-bit PIC32 chips, and you will see why…

You have to be producing thousands of finished items for the lesser parts to make enough of a difference to be worthwhile.

But then you run into the “software issue”: we actually have very few engineers capable of producing low-defect code. So you need “patch” capability.

uh, Mike September 21, 2017 4:18 PM

The U.S. government is founded on the principle that it must be accountable to citizens. That’s why it’s patriotic to distrust the government.

dbCooper September 21, 2017 4:53 PM

@ vas pup

Some might find it interesting that Kaspersky products are used in Check Point security appliances. From a political perspective, in my mind, these seem to be odd bedfellows.

Perhaps more odd, the US Department of Defense, Defense Information Security Agency (DISA) lists several Check Point products as approved.

Note that the DISA site will return a certificate error.

Strange world we live in.

Ken Thompson's proprietary crypto blob September 21, 2017 6:04 PM

“That’s just an awfully small hole to fit your backdoor through.”

If they develop their own from-scratch crypto they can build their own ‘rainbow tables’ of collisions that nobody would find in decades of looking or brute-forcing, except the architect. There’s no such thing as “too small” a keyhole; one is plenty. They will likely have several for different uses/users so they can track them by that.

Dual-EC was probably just way too obvious about it. There are ways of adding complexities that are well beyond the mathematics skills of ~100% of the population, and because they’re secret proprietary stuff they will never be realistically audited outside of the agencies’ spheres, certainly not without them knowing what is there to be found.

Why pretend the NSA gives a flying rat’s ass about your “digital rights” as a consumer in terms of privacy or anything else? You play in THEIR sandbox. There’s no internet bill of rights in the Constitution, and you will have virtually ZERO avenues for proving you were damaged by their machinations until well after your statute of limitations expires, if ever.

To sue you have to demonstrate exactly how you were damaged, and since it’s secret… yeah. You might as well be ranting about pleading the 5th at Gitmo.

Mark September 21, 2017 6:54 PM

Good. We shouldn’t trust the American government with anything. Hopefully the world will some day catch up, and we will stop following the Yanks.

diana September 21, 2017 7:39 PM

@ Michael Moser

Trust is a good thing

Trust is a neutral thing: “Confidence in or reliance on some person or quality”; “confident expectation of something”. I trust the NSA, in that I’m confident they’re going to be working against us in the future: backdooring cryptosystems, finding exploitable bugs etc. You should trust them too.

Frank September 21, 2017 8:12 PM

The NSA and US government are as trusted as North Korea for data security these days.

They blame Snowden, but their own people were saying the plan was to capture everything until they were able to decrypt and look through all digital communications.
Just look at the Wired magazine article where they talked about yottabytes and the Utah facility plans years before anything leaked. The facility has publicly been reported to have problems, but those are reports from an intel agency as opposed to the public.

jdgalt September 21, 2017 10:50 PM

NSA will just submit future proposals through front groups, or individuals not yet publicly known to be connected with the NSA. Standards bodies should consider background-checking future submitters.

Open-source crypto projects should consider background-checking their coders, too.

Edward Morbius September 22, 2017 3:57 AM

Looking through the tagset and discussion on this article, I’m surprised to see that Bruce doesn’t seem to have either “trust” or “reputation” among these … though both do turn up in tag searches. I’d suggest adding those to this article.

The question of identity, authentication, signifiers, and what we mean when we talk about “identity” has been on my mind a lot.

As to the value of trust, David Gerard (author of The Attack of the 50-Foot Blockchain) nailed this in a recent Financial Times interview. Paraphrasing from memory: blockchain and bitcoin are attempts to get around the need for trust, but it turns out that a little bit of trust can save you a tremendous amount of very real resources. Which is why trust is so damned useful.

Trust is an extension of what is known into what is not directly supported. It is not the same as blind faith, that is, belief in opposition to the evidence; trust is faith in the absence of direct evidence. The presence of trust lets you skip a lot of work, much of it quite costly in terms of real resources.

If you look up the etymological roots of words such as “name” or “fame” or “reputation” or “character”, you find that they’re strongly interrelated. That is, the reason we want to know who someone is is quite often to know: can I trust you? Are you friend or foe? (This also tends to presume knowledge of what you have done, which gets into further notions of identity.)

The problem the NSA have produced as a direct consequence of their past actions is that they are no longer considered trustworthy. The NSA have a bad reputation, and faith and credit are not extended to them.

This is equally terrifying and fascinating.

Clive Robinson September 22, 2017 4:43 AM

@ Edward Morbius,

This is equally terrifying and fascinating.

The NSA got only a fraction of what they deserved with the reputational hit.

As has been pointed out many times in business, “New customers are hard to find, but that is easy compared to getting a scorned customer back once they have left.” It was the primary thinking behind “The customer is always right”, which led in turn to “The Customer is King”.

The problem the voting citizens have with the IC is that they do not use the IC’s services even third-hand, so they can not “Send a Message” by “Voting with their feet”. This can lead, as was seen with Hoover and the FBI, to a situation where the head of an agency can “capture the Government”.

If you look at the Five Eyes nations you can see that the SigInt agencies have “set themselves above the elected politicians” and really only have allegiance to other nations’ SigInt agencies, not to their own Governments.

It can be seen how successive NSA whistleblowers being neutered or crushed by the NSA seniors resulted in Ed Snowden. If the seniors’ policies can turn a keen and eager employee into one who becomes disillusioned and not just plots against them but outs them publicly, you have to ask what it was in those policies that caused that to happen…

Rather than have a little introspection, the seniors carried on full steam ahead with even more draconian policies. Almost the exact sort of behaviour you would expect of psychopaths and tyrants: you know what the future is likely to be, “More of the same, and tighten the screws”…

Thus whilst the Augean-Stables-sized mess that is the NSA continues, piling “the excrement” higher and higher, the stench it creates will spread far and wide, as will the resulting corruption. You have to ask why people that decide to treat them like “Typhoid Marys” are doing anything other than practicing “self-protection”.

Once upon a time “Trust but Verify” was a security statement; now, for the SigInt agencies, it would appear that standards bodies should “Distrust until proof of security” is obtained.

Yes, we will almost certainly throw a baby or two out with the bath water, but when it comes to security you have to look gift horses in the mouth very hard, and if you can not prove them safe, leave them out and watch carefully for developments.

As there have not been any intensive and competitive studies on these ciphers, then yes, they should be left out of standards. If the NSA want “trust” then it’s up to them to prove themselves “trustworthy”, the first step of which would be a lot more openness where it counts, in a way that can itself be trusted.

ShavedMyWhiskers September 22, 2017 6:28 AM

Now the challenge is to discover the issue with Speck and Simon.

Sadly in the convoluted logic of this world this rejection might be a manipulation because these methods are too good.

Doing independent homework on this stuff is hard.

One bit of homework should be “Programming Pearls”. If nothing else it shows that an insight turns hard into easy.

Unroll your education… algebra and calculus are full of insights, and some are fun to mispronounce, so much fun that you might be in the “hospital”.

Clive Robinson September 22, 2017 7:30 AM

@ ShavedMyWhiskers,

Doing independent homework on this stuff is hard.

Maybe, maybe not, it depends on what you want to get out of it (See history of FEAL if you want to see what might happen).

The round structure is very simple and is also amenable to “small-scale modelling” to “get a feel”.

For instance, the “AND” function of two rotated copies of one input reduces down to the AND of an integer and a rotated copy of itself. Which means it would be quick and easy, even using Python, to write an eight- or sixteen-bit counter and perform a rotation and AND of the two values. The results are small enough that they can all be kept in arrays in core memory with little difficulty.

Thus basic statistical tests like those found in “DIEHARD” and later suites can be carried out very rapidly. This could easily form the basis of an interesting high-school maths/science project.

Going further, we could change the sizes of the rotations and see how they affect things. But we also know the “AND” function is a “one of four” reducing function, that is, the output is zero for three of the four possible states (00=0, 01=0, 10=0, 11=1). We can look up existing information and make predictions. Especially as the AND behaviour is the same as the “carry generator” in the various binary adders, the inverse of the initial “gating function” of an XOR gate made of four NAND gates, and also the equivalent of a single-bit binary multiplier.
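A minimal Python sketch of the counter-and-rotate experiment described above (the function names and the rotation amount are illustrative choices of mine, not anything from a spec): enumerate an 8-bit counter, AND each value with a rotated copy of itself, and tally how often each output bit position comes out set.

```python
def rol8(v, r):
    # Rotate an 8-bit value left by r bits
    return ((v << r) | (v >> (8 - r))) & 0xFF

def and_rotate_bias(rot=1):
    # For every 8-bit value x, compute x & rol8(x, rot) and count,
    # per output bit position, how often that bit comes out set.
    counts = [0] * 8
    for x in range(256):
        y = x & rol8(x, rot)
        for j in range(8):
            counts[j] += (y >> j) & 1
    return counts

# Each output bit is the AND of two distinct input bits, so over the
# full counter it is set for only 1/4 of inputs: a heavy bias to zero.
print(and_rotate_bias())  # [64, 64, 64, 64, 64, 64, 64, 64]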

There is also information on the use of AND/OR gates in nonlinear logic used with LFSR taps in crypto stream generators as well as more advanced information from the European Union NESSIE program subsection for Stream Ciphers that amongst other things gave us SNOW.

With simple experiments you gain the feel; as the experiments progress the feel becomes understanding, then insight, from which much more interesting hypotheses arise.

Scott September 22, 2017 7:58 AM

@Nick P, thanks, re. SELinux! Maybe I should read up on this topic; I’m by no means an expert on it.

Meanwhile, I’ve found this link of interest: which gives Trusted OS credentials to popular desktop operating systems, giving each one various points on a scale (the trust level), including Windows, OS X, and some Linux distributions, even without the SELinux extension. A notable omission from “Examples of operating systems that might be certifiable” is OpenBSD, which I find interesting.

I’ve also found that a simple Google search suggests the PitBull solution has a high trust rating. Though I’m not confident you can read its source code. I’m not sure which trusted OS is preferred with free-software credentials.

Nick P September 22, 2017 10:53 AM

@ Scott

The reason OpenBSD can’t make the list is right in the opening paragraphs:

“an operating system that provides sufficient support for multilevel security and evidence of correctness to meet a particular set of government requirements. The most common set of criteria for trusted operating system design is the Common Criteria combined with the Security Functional Requirements (SFRs) for Labeled Security Protection Profile (LSPP) and mandatory access control (MAC).”

OpenBSD folks are opposed to mandatory access controls. Probably a lot of other things in the requirements, too. All That is Wrong explains the limitations OpenBSD has in the event an attack gets through. MAC models try to contain damage in that scenario, where OpenBSD's design assumes that scenario doesn't exist at all or can be stopped with syscall restrictions (e.g. pledge). It would take major extensions and rewrites of their OS to make it an LSPP-style OS.

Who? September 22, 2017 12:22 PM

@ Nick P, Scott

OpenBSD lacks some useful features like kernel MAC and filesystem ACLs, sometimes as a consequence of a lack of manpower, other times because they just ignore the community (not only theirs, but other security-conscious communities too). This turns OpenBSD into an operating system that sometimes misses valuable security features or provides half-baked solutions.

However, the fact is that even with these weaknesses OpenBSD remains the most secure general-purpose operating system available right now. “Evidence of correctness to meet a particular set of government requirements” does not really mean anything, but it would be nice if OpenBSD listens to the community.

John Campbell September 22, 2017 1:28 PM

The algorithms derived from the mathematics don’t have to be back-doored if there’s something subtle in the mathematics that provides an implicit “weak” path.

There are already examples of NSA manipulating the “curves” used for key processing.

The NSA is there to protect the Nation, not its Citizens. We’ve reached the point where we cannot trust the Nation to protect the interests of its Citizens.

Clive Robinson September 22, 2017 2:08 PM

@ Who?, Nick P, Scott, Wael,

but it would be nice if OpenBSD listens to the community.

There are a couple of problems with listening to “the many voices” in a community, especially a large one.

Firstly there is the “you can please some of the people some of the time” but “you can never please all of the people all of the time” problem. So at some point you are going to be upsetting people.

Secondly there is the “stool leg problem”: a free-standing stool/seat is not stable with one leg, is stable in only one direction with two legs, and is about as stable as it is going to get on three legs. Four or more legs are unstable unless you can find a surface where every leg is at exactly the right length to meet the surface. The same problem exists with every solution: there is a point where adding extras makes the solution less general and more specific to a single case. Further, the fewer the legs, the less complex it is to make, and thus in general the more robust it is. As many are aware, complexity is rarely your friend when you are trying to find a solution.

So don’t be too hard on the OpenBSD folks, they are doing a lot better than many, even though it may not be optimal for your situation.

Clive Robinson September 22, 2017 2:37 PM

@ Wael,

Do you know what’s happening with TrustedBSD?

They had Open BSM in alpha back in Dec 2016, and then they appeared to just vanish off the face of the planet. I did wonder if the DARPA / NSA / Intel funding dried up or something else happened.

Wael September 22, 2017 4:16 PM

@Clive Robinson,

Do you know what’s happening with TrustedBSD?

Never installed it. I only followed it once in a while, as I was aware things were eventually merged with FreeBSD. Who knows, maybe they reached their goal and achieved trust?

So don’t be too hard on the OpenBSD folks, they are doing a lot better than many, even though it may not be optimal for your situation.

I don’t understand those who complain about whatever operating system. It would be more productive to specify what an acceptable operating system looks like, specification-wise. Capability based? MAC/DAC, … That’s better than talking about what language the operating system was developed in. The high-level specifications address security concepts, and the language discussions may address implementation weaknesses. Two different levels, and they shouldn’t be mixed up.

Bob September 22, 2017 8:03 PM

Any day of the week, I would say burn down the State Dept. and solve half of this country’s problems. I would also say fail by default until you like what you see.

However, cryptanalysis knowledge is required here. I actually appreciate the Speck documentation and proofing. I find ARX algorithms novel and ‘next-step’. I would even go so far as to say that if there is any math proofing that is readable, it is the Speck/Simon team's document. ARX is at least approachable for anyone to study.

Yes we do. We need a combinational logic's worth of algorithm options. Many ciphers today have been around since the dark ages. The point of ARX is a new vector on minimalist design.

People’s emotional response is an opinion; third-party cryptanalysis would be evidence. Having grappled with block and stream cipher fail points, Speck still looks promising to me. If ISO rejects it, let it be for early introduction and lack of analysis.

For the love of truth, stop referencing Snowden. Given CALEA, what did he expose that I care about? People are quickly grabbing on to blockchain, which relies on external security mechanisms, but knee-jerk away from Speck because it is the govt. I hate our backdoors also, but Speck looks pretty exciting to me, at least as a basis for my interest. Never look a gift-whore in the mouth.

Nick P September 22, 2017 11:58 PM

@ Who?

” “Evidence of correctness to meet a particular set of government requirements” does not really mean anything,”

Surviving two to five years of NSA pentesting and 10+ years of field use without a single reported hack certainly means something. Meanwhile, OpenBSD has piles of bugs they keep fixing, with a recent round of vulnerabilities related to the C language. They also conveniently call most reported problems bugs instead of vulnerabilities, since nobody evaluates whether the mitigations could be broken. I have a draft on that I’ve been too lazy to finish. I mean, I enjoy talking to them and learning from them, so why provoke them unnecessarily? 🙂

“OpenBSD remains as the most secure general purpose operating system available right now. ”

If we are judging by security assurance plus vulnerability record, that would be GEMSOS, STOP OS, Boeing SNS, and INTEGRITY-178B. You have to pay good money for them, though. As far as free stuff goes, it would probably be OpenBSD if we’re talking about a big, usable OS. Redox OS would compete with it on code injection, though, given the Rust language is immune to many forms of it, especially the kinds that recently hit OpenBSD developers, since they are very hard to prevent in C. On the other end of that, OpenBSD developers have much more domain knowledge to get stuff right in, say, hardware or business logic. So I lean toward OpenBSD on that one, but a Redox-like project would have gotten owned less over time.

“but it would be nice if OpenBSD listens to the community.”

They’re probably listening a little more after I saw Theo griping that they were behind on a few things that brought in lots of contributions. They were reluctantly doing stuff like the VMM. We’ll see.

Daniel Azuelos September 23, 2017 5:05 AM

John Campbell • September 22, 2017 1:28 PM

The algorithms derived from the mathematics don’t have to be back-doored if there’s something subtle in the mathematics that provides an implicit “weak” path.

Just imagine the model of public key cryptography transposed to published cryptographic algorithms. A public-source cryptographic algorithm, Apublic, could have been conceived as the public part of a more complex cryptosystem, with the private pending part of that cryptosystem, Aprivate, being kept secret by the author (here, for example, the NSA). When you compose the private part of the cryptosystem with the public part, you get back the identity:

    Aprivate(Apublic(clear_text)) = clear_text

Even a thorough examination of the source of the public part of this cryptosystem won’t reveal any evidence of the existence of the private part.

Look carefully at the DES source. The backdoor isn’t there! It is much higher in the conception process.
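Textbook RSA is a concrete instance of the identity Daniel describes; a toy sketch with the classic small parameters (insecure, purely for illustration):

```python
# Toy RSA: the public map A_public and the private map A_private
# compose to the identity, yet inspecting the public half alone
# reveals nothing of the private half. Parameters are illustrative.
p, q = 61, 53
n = p * q                            # public modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (needs Python 3.8+)

A_public = lambda m: pow(m, e, n)
A_private = lambda c: pow(c, d, n)

clear_text = 42
assert A_private(A_public(clear_text)) == clear_text
```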

Clive Robinson September 23, 2017 8:04 AM

@ Bob,

The point of ARX is a new vector on minimalist design.

Yes, and it’s got problems… In the ADD function used in Speck, the least significant bit of the ADD is directly equivalent to the XOR of the two LSBits. Which means that in Speck the least significant bits of both halves are just XORed all the way through the rounds, with just shifted versions of the halves. If you redraw this, you can show it is equivalent to a cascade of LFSR generators with additional nonlinear outputs. Such N/LFSR generators have been the speciality of “British Industrial Designs” (formerly British Inter Departmental) of GCHQ since it started, and have been used in UK and later NATO stream cipher systems. Thus if there are any backdoors, GCHQ would almost certainly be the people who found them… It was also the predecessor of GCHQ that came up with the idea of “finessing”, which is what the NSA tried and eventually failed to do with Dual-EC.
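The LSBit claim can be checked exhaustively for a small word size; a sketch (the 8-bit word is arbitrary, the property holds for any width):

```python
# The carry into bit 0 of an addition is always zero, so the LSB of
# (a + b) mod 2^n is exactly (a ^ b) & 1, whatever n is.
N = 8
MASK = (1 << N) - 1
for a in range(1 << N):
    for b in range(1 << N):
        assert ((a + b) & MASK) & 1 == (a ^ b) & 1
```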

Whilst AND/OR/ADD logic with rotations and XOR whitening/mixing is lightweight, AND is the inverse of OR, and whilst the bulk of ADD is orthogonal to XOR, it is not in the LSBits.

Thus the LSBits are where I would start looking to find key etc. leakage by some covert channel in Speck.

Such a backdoor in no way has to leak the whole key, just sufficient to reduce the search space to a manageable level.

If you examine the Simon round, you can shuffle the XORs around and redraw it in a new form: a simple mapping of the left half that then gets XORed with the round key before it gets XORed with the right half. Thus the strength of the round depends on the reducing AND function for weak nonlinearity and on the round key, which in effect is a constant. Which means the strength, if any, comes from the Key Scheduling Algorithm, and as that is a function only used in one direction, it would be a good place to hide a backdoor…

I had a chat with my son about this just now over lunch, he quite quickly understood the problems…

As for bringing up Ed Snowden, it’s not me that has to; I’ve been waving a red flag since the NSA stitched up the AES competition, in effect making timing channels in any implementation a racing certainty (which is what happened, and why you should only ever use AES in “off-line” / “energy-gapped” mode).

Both Simon and Speck can be seen as a very definite attempt to reduce or eliminate time-based covert channels as a “benefit”, because people were finally taking the implementation side-channel issue of AES seriously, and thus things were going dark for the GCHQ and NSA SigInt agencies.

So the trick is “give the peons / plebs what they want” and finesse the backdoor by a different trick… and so the “Great Game” goes on from strength to strength.

vas pup September 23, 2017 11:27 AM

@Edward Morbius:
“Trust is an extension of what is known into what is not directly supported. It is not the same as blind faith, that is, belief in the opposition of evidence, but trust is faith in the absence of direct evidence. The presence of trust lets you skip a lot of work, much of it quite costly in terms of real resources.”
Very good point!
Regarding trusting the US or any other government: any government has multiple goals, but ALL of them have self-preservation on that list as one of the primary ones (elected officials are more concerned about reelection than the real needs of constituents; the executive branch is more concerned about not being removed for actions outside its authority, e.g. Hoover collected compromising information on elected officials to manipulate them and remain in charge of the FBI for a long time). Appointed judges cannot be trusted because they basically have zero oversight and look at themselves as crystal-ball oracles of the Law.
Ronald Reagan said “Trust but verify”. I understand this as: trust intentions, but ALWAYS verify actions. Anybody could screw up, intentionally or otherwise. As you read above, there could be no trust at all once intentions cannot be trusted.

Bob September 23, 2017 1:34 PM

“The ADD function used in Speck, the least significant bit of the ADD function is directly equivalent to the XOR function of the two LSBits.”

This is messed up. Have they released code for Speck? I have several papers and have been studying it, but no code. The whitepaper never explained how data horizontally propagates beyond two word channels. I think your statement might be overstated given what modular add and sub are, but I would have to look at it. A link to this finding would be helpful if you or someone has documented this.

I think most primitive operations are weak by themselves; you are supposed to add them up to something strong with rounds, propagation, etc. Ultimately, I thought Speck was so simplistic it might actually have mixing problems. I want to approach the per-round product at some point (prior to any cryptanalysis attempts). I was still impressed, and ARX ditches expensive operations that are not really useful. I never liked nonlinear mappings or the new lattice stuff, for instance. Speck is simple enough to code yourself. It makes a good study, not to be ignored.
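Speck really is only a few lines per round; a sketch of one round and its inverse, following my reading of the published design (rotation amounts 8 and 3 as used for the larger word sizes; names and test values are mine):

```python
# One Speck round for word sizes where the rotations are alpha=8,
# beta=3; x, y are the two halves, k the round key.
N = 32                       # word size in bits (Speck64 uses 32-bit words)
MASK = (1 << N) - 1

def ror(v, r): return ((v >> r) | (v << (N - r))) & MASK
def rol(v, r): return ((v << r) | (v >> (N - r))) & MASK

def speck_round(x, y, k):
    x = (ror(x, 8) + y) & MASK   # the lone nonlinear step: modular ADD
    x ^= k
    y = rol(y, 3) ^ x
    return x, y

def speck_round_inv(x, y, k):
    y = ror(y ^ x, 3)
    x = rol(((x ^ k) - y) & MASK, 8)
    return x, y

# Round-trip check with arbitrary values
x, y, k = 0x12345678, 0x9ABCDEF0, 0x0F1E2D3C
assert speck_round_inv(*speck_round(x, y, k), k) == (x, y)
```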

I don’t believe ISO should be doing jack with crypto, especially given any nation’s security law. A good default rule would be to reject projects from any country’s intel agencies. I can only assume the NSA tried this to license Simon for chip use and foreign markets. I think the NSA should admit to their stigma. It should have been expected.

My counterpoint is that we should question the objective of the NSA. If they hated citizens using crypto at all, they would not bother, even for infiltration purposes. In this case, they would deploy Speck/Simon for themselves. The US already has all sorts of backdoors in place on everyone’s NICs and routers. If the NSA thinks they can license Speck for app development, they are probably smoking crack. They need to read the news every so often.

I would just say strip Speck down for yourself and use it for experimentation. Professional deployment in the future? You’ve got to be kidding. On that note, even Threefish isn’t really gaining traction, which is disappointing. It’s not a money maker.

Bob September 23, 2017 2:42 PM

Are you mostly talking Simon? Yeah, my bad, no mod there. I dunno. I don’t do chip.

They said they could put it on an 8-bit AVR chip. It would be good for comm crypto. Imagine being able to distribute a table of keys representing channels, say up to 100 channels. In the field, switch to whatever whenever. Redo the channel keys all of the time, back at base. Maybe they want this for NATO and need ISO for whatever legal reason. Language standardization for specification. I don’t pay attention enough. Don’t care except for Speck.

Clive Robinson September 23, 2017 2:56 PM

@ Bob,

I think your statement might be overstated given what modulus add and sub is, but I would have to look at it

It’s simple binary math,

0 + 0 = 0,  0 XOR 0 = 0
0 + 1 = 1,  0 XOR 1 = 1
1 + 0 = 1,  1 XOR 0 = 1
1 + 1 = 0 carry 1,  1 XOR 1 = 0

ADD and SUB are the same function (+); the only difference is that you invert one input, and in two’s complement you add a 1 into the carry bit, which double-inverts the LSBit output, turning it back into the XOR again.
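That ADD/SUB equivalence can be checked exhaustively for a small word; a sketch:

```python
# Two's-complement subtraction is addition with one input inverted
# and a 1 fed into the carry-in; consequently the LSB of a - b is
# again just (a ^ b) & 1, since the invert and the +1 double-invert it.
N = 8
MASK = (1 << N) - 1
for a in range(1 << N):
    for b in range(1 << N):
        sub = (a - b) & MASK
        assert sub == (a + (b ^ MASK) + 1) & MASK   # invert + carry-in
        assert sub & 1 == (a ^ b) & 1               # LSB is XOR again
```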

It’s something most assembler-level programmers are aware of; C and higher-level language programmers not so much. It depends on how they were taught and what sort of systems they write software for.

Oh, the AND function is also the bit multiplier (0.0=0, 0.1=0, 1.0=0, 1.1=1).

Clive Robinson September 23, 2017 3:19 PM

@ Bob,

The thing with the AND function of shifted versions of the half, and the following XOR, is best seen from the following (the number in Ln below is the number of rotation steps):


Thus L1 AND L3 is,


And L5 XOR (L1 AND L3) is


You can go through all these mappings and see what happens and what values may be missing from the mapping.
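The diagrams above did not survive extraction, but the enumeration they describe takes only a few lines; a sketch over a 4-bit word using the L1/L3/L5 rotations from the example (the word size is my choice):

```python
# Enumerate x -> ROL(x,5) ^ (ROL(x,1) & ROL(x,3)) over a 4-bit word
# and see which output values never occur, i.e. how far the AND
# collapses the mapping away from being a permutation.
N = 4
MASK = (1 << N) - 1
rol = lambda v, r: ((v << (r % N)) | (v >> (N - r % N))) & MASK

outputs = {rol(x, 5) ^ (rol(x, 1) & rol(x, 3)) for x in range(1 << N)}
missing = sorted(set(range(1 << N)) - outputs)
print(f"{len(outputs)} of {1 << N} outputs reached; missing: {missing}")
```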

But her convenience emails persisted September 23, 2017 7:08 PM

“Facebook and Google do more surveillance on individuals than the NSA.”

You meant to say the people who choose to use google and facebook give up more information to the world at large than the NSA could possibly ask for overtly or otherwise.

Bob September 23, 2017 7:11 PM

Thank you for the explanation. I don’t like that. I do real modular add and subtract with 2^n [n = word size in bits]. My subtract means y is known in order to find x, then with a cascade from channel to channel. In addition to the large word size, my method requires decimal 128-bit so as to kill 32-bit addressing and under.

//I have 4 conditions for mod sub:
(x + y) mod m = p , y will be known upon decryption

If (y == 0) return p // x = p
If (y == p) return 0 // x = 0
If (y > p)  return (p + m - y)
If (y < p)  return (p - y)
These four conditions must be coded for modular subtraction (decryption) of 2 words.

Conditions that do not exist:
x or y > m-1 (the mod must be correct and known)
negative variables; when binary is converted to an unsigned integer, only positive numbers exist, so there is no need for absolute value

So, given the modular arithmetic proof, I do not have any hangups. I suppose someone would do it your way for direction, but I do not like this ‘culture’ of conditional bits like parity. Binary math is a hangup for me where carry is concerned. You just proved that high-level programmers will generally not go out of their way for parity or state bits.
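The four conditions above collapse to a single expression in a language whose % operator returns non-negative results; a sketch checking the two forms agree (the word size and test values are arbitrary):

```python
# Recover x from p = (x + y) mod m given y, first via the four
# explicit conditions, then via Python's non-negative % operator.
def mod_sub(p, y, m):
    if y == 0:
        return p          # x = p
    if y == p:
        return 0          # x = 0
    if y > p:
        return p + m - y
    return p - y          # y < p

m = 2 ** 32
for x in (0, 1, 5, 123456789, m - 1):
    for y in (0, 1, 42, m - 1):
        p = (x + y) % m
        assert mod_sub(p, y, m) == (p - y) % m == x
```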

Bob September 23, 2017 10:17 PM


I guess the mission would be to unspool at least 2 rounds of Simon with that logic, w/wo manual key schedule insertion, and see what can be grabbed.

My initial idea was analyzing what other rotation possibilities there are. In other words, I do not know why they chose those shifts, but I would also like to know if you could use rotations as ‘tweakables.’ It is on the backburner, but I am sure there are some common-sense lessons about this that I am not aware of. There should be some ROT combos that are better than others. They only mention the difference between Speck and Threefish (2 ROTs versus a permutation), not specifically why. Clearly, I need time to play and research. Looking for weak knowns is tedious and daunting.

Who? September 24, 2017 5:43 AM

@ But her convenience emails persisted

“Facebook and Google do more surveillance on individuals than the NSA.”

You meant to say the people who choose to use google and facebook give up more information to the world at large than the NSA could possibly ask for overtly or otherwise.

No way! I am not a Google customer, I have no gmail account, I am not using their services at all. However I cannot stop Google spying on me each time a lazy web development team uses Google Analytics. I cannot stop Google spying on me each time a stupid ape sends an email to me using his/her gmail account. Same about Facebook.

Clive Robinson September 24, 2017 8:03 AM

@ Bob,

My initial idea was analyzing what other rotation possibilities are there

The simple answer is that there are as many rotations as the number of bits in the effective integer size, or half the cipher block width.

But I suspect what you mean is what relationship or effect two or more rotations have, when either combined in a half round or carried forward from round to round.

Outside of a half round, the relationship is via the XOR function, which has a couple of useful properties, not least of which is that it is reversible. But more importantly, you can compress the effect of sequential XORs into a single XOR, which allows you to make deductions as you work your way back up the rounds from output to input. The real difference from round to round is the two half-round keys. These keys are generated by a key-expansion circuit/algorithm. For obvious reasons the expansion wants to be as nonlinear as reasonably possible, because any linearities are prone to being unwound backwards, which opens up all sorts of possibilities to an attacker.
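The XOR-compression property is just associativity plus self-inversion; a sketch (key values are arbitrary):

```python
# Peeling XORs back up through rounds works because a chain of XORs
# collapses to a single XOR with the combined constant:
# ((x ^ k1) ^ k2) ^ k3 == x ^ (k1 ^ k2 ^ k3), and XOR is its own
# inverse, so each step is reversible.
from functools import reduce
from operator import xor

keys = [0xDEAD, 0xBEEF, 0xC0DE, 0xF00D]
x = 0x1234

step_by_step = reduce(xor, keys, x)        # apply round keys one by one
collapsed = x ^ reduce(xor, keys)          # one XOR with the combined key
assert step_by_step == collapsed
assert step_by_step ^ reduce(xor, keys) == x   # and it unwinds back to x
```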

Thus the place to look, after analysing a half round for a backdoor, is the key-expansion circuit/algorithm.

Which brings us back to the half-round analysis. Rotations are in effect small shift registers; you can combine them in a number of ways using AND, ADD, MUL, OR, SUB, XOR etc. Only AND and OR are not easily reversible, due to the three-out-of-four effect, which is where the nonlinearity comes from. XOR is in effect a single-bit adder without carry-in or carry-out. It just so happens that on a bit-by-bit basis the carry function is also the equivalent of the bitwise multiply. SUB can be either ones’ complement or two’s complement: the complement is simply to XOR one of the integers with the all-ones state (2^n - 1) and then perform an ADD; with no carry-in it is ones’ complement, with carry-in (i.e. adding 1) it is two’s complement. From this it can be seen that SUB is effectively the same as ADD with one input inverted and then potentially an increment.

The reason to point out the equivalence to the XOR is to do with Linear Feedback Shift Registers (LFSRs) and an effect called correlation. If you look up LFSRs you will find a rich body of literature, but for another view look up lagged Fibonacci generators (LFG or LFib). They have all sorts of interesting effects but also defects, which means they are of interest to, and under considerable study by, those making cipher systems and those breaking them. Likewise they would obviously be of significant interest to those designing a cipher system with a covert method of assisting those having to break the design.

If you are going to look at how the rotations affect the half round or the key-expansion circuit/algorithm, you will need to know a great deal about LFibs in their various forms and feedback generators. You can get a feel for this by enumerating the half-round input values using a short integer, printing out the results in binary, and then comparing with different rotation values. One way to get a feel is to print out 1s and 0s but replace the zero with a full stop or even a space, because this enables you to stand well back from the printout and see the shape, not the details, which can give an intuitive feel for what the effects are.
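That "stand back and look at the shape" trick is a couple of lines; a sketch for an illustrative 8-bit half-round built from two rotations and an AND (the rotation amounts here are my choice, not Speck's):

```python
# Print the output of f(x) = ROL(x,1) & ROL(x,3) for every 8-bit x,
# one row per input, with '.' for 0 bits so any structure stands out
# when viewed from a distance.
N = 8
MASK = (1 << N) - 1
rol = lambda v, r: ((v << r) | (v >> (N - r))) & MASK

for x in range(1 << N):
    out = rol(x, 1) & rol(x, 3)
    print(format(out, f'0{N}b').replace('0', '.'))
```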

It’s the equivalent of what an electronics engineer does when using an oscilloscope and changing the sweep timebase from µs/div to ms/div or even further out. I use it when looking at the output from supposedly Random Number Generators (RNGs), and many weaknesses can easily be spotted that the likes of the DIEHARD tests miss.

Because the AND function is a binary multiplier, it can work like the mixer in a superhet receiver. Thus you can easily see the “product” and, if you want, subject it to an FFT or FWT to see what its spectral output is like. If the output spectrum is not flat and shows peaks, that can be a cause for concern, because it can give a cryptanalyst a small hook or crack which they can then expand upon. However the converse is true: if the output spectrum has other characteristics, it could be that in effect the “backdoor” spectrum is being hidden by spread-spectrum techniques, in the same way some digital watermarking schemes of the late 1990s worked.
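A minimal version of that spectral check, using NumPy's FFT (the test sequence, an AND of two rotated copies of a counter, is illustrative only):

```python
# Treat the AND of two rotated "halves" driven by a counter as a
# mixer product and look at its spectrum; strong peaks would be the
# kind of hook a cryptanalyst (or a finessed backdoor) could exploit.
import numpy as np

N = 8
MASK = (1 << N) - 1
rol = lambda v, r: ((v << r) | (v >> (N - r))) & MASK

x = np.arange(1 << N)
product = np.array([rol(int(v), 1) & rol(int(v), 3) for v in x], dtype=float)
spectrum = np.abs(np.fft.fft(product - product.mean()))

# A flat-ish spectrum is what you hope for; report the worst peak
# relative to the mean magnitude of the non-DC bins.
peak_ratio = spectrum[1:].max() / spectrum[1:].mean()
print(f"peak-to-mean spectral ratio: {peak_ratio:.2f}")
```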

As I said, plenty of fun for high school kids onwards to play / investigate.

Bob September 24, 2017 12:37 PM

An exhaustive analytic on rotation combos.
1) Analyze mixing (product bits diff from plaintext) using Speck’s 8/3 combo as the baseline for the two-channel model

2) Instead of any spiffy stats, roll through the combos and use a count for less mixing, equal, or greater vs. 8/3

3) The nesting is important. For each possible key-schedule insert, loop through each possible plaintext word range. The combo nesting for 1 round and all rounds (two different runs, maybe).

There are other small details and GUI presentation is important.

If I were on the ball, it would take less than a month of free time. I would need to cobble together the Speck operations, even if it were not for legitimate use.

The raw mixing comparisons would probably show quite a bit. I am not storing specific key and plaintext combos in this. There should be some weak spots with bad rotation combos.

Overall, I think Speck looks strong. They put each operation in a certain order for a certain reason. I will leave cryptanalysis to the pros but this analytic has been carving away at my brain. If I get it running, Simon would be next.
The idea is that with cryptanalysis, the code/algorithm is considered known/exposed/identified. So what breaks that? A non-static cipher. You can stare at the code all day, but a custom rotation combo increases difficulty with minimal cost. I thought of this while thinking about iterative password-hashing schemes. I know it wouldn’t be terribly successful, and full integration into an app would be necessary.

My argument is that the NSA gives just enough math proof to piss you off. In general, you see math proofs for any cipher, not algorithm analysis. These are mathematicians who got into programming, not IT people who need immediate practical explanation. I don’t know if the FTC or NSA squelches explanation down to math proofing only, but it pisses me off. Selling a cipher? Starve already.

Persistence is the key September 24, 2017 4:54 PM


” I cannot stop Google spying on me each time a stupid ape sends an email to me using his/her gmail account. Same about Facebook. ”

You… could block their IPs and domains entirely. Yes, it’s some work.

Are you committed?

Me September 25, 2017 1:57 PM

@Security through Spycraft:
“sicut comportabis apud omnes, et non habent quod analyze ad robora pubis usque ad unum percent post factum a major terroristis impetum usquam est”

FIPS Engineer September 28, 2017 4:39 PM

@ BF Skinner

To clarify, FIPS can’t require an algorithm. It only has a list of approved algorithms and functions you can choose from. I don’t see anybody switching over to these algorithms from AES. I am sure they will add these algorithms as soon as FIPS 140-3 is released though…

Etienne October 3, 2017 1:59 AM

If a specific NSA algorithm can be used for classified material up to Military Secret, then I would assume it is safe. Lacking that, I would assume it is only good for communications where an envelope would likewise be used in the Postal Service mail.

nanashi November 5, 2017 1:57 AM

Regarding trusted operating systems (specifically EAL ratings): this is not a standard for how securely an operating system is written. It is based entirely on what type of access control a system supports. Windows XP has an EAL rating (EAL4, I think?), and it is not being revoked just because there are unfixed vulnerabilities in the code. Even the highest level, EAL7+, which involves formal verification, does not necessarily require access to the source code, only formal verification of the access-control setup (as far as I am aware). Though because it is so damn expensive, it often comes alongside at least partial verification of the code (for example INTEGRITY-178B; seL4 would probably be eligible too). Don’t read too much into EAL ratings.

nanashi November 5, 2017 2:02 AM

For Speck and Simon, this may come as a surprise but I trust and use them in certain scenarios (specifically, Speck on an 8051 microcontroller). For extremely lightweight ciphers, Speck and Simon are probably some of the best. I wouldn’t use it if I had a more powerful processor, but it is what it is. I should probably compare it to ChaCha8 though.

I absolutely don’t think they should be standardized, but they’re a nice option for extremely low-power systems.

Melvin Ng February 25, 2018 9:37 AM

Just working on this for an IoT device router; do you know if Speck and Simon have been approved as ISO standards yet?

They were supposed to announce it in Feb 2018, and I was looking forward to that, but there is still no news online yet.
