SIMON and SPECK: New NSA Encryption Algorithms

The NSA has published some new symmetric algorithms:

Abstract: In this paper we propose two families of block ciphers, SIMON and SPECK, each of which comes in a variety of widths and key sizes. While many lightweight block ciphers exist, most were designed to perform well on a single platform and were not meant to provide high performance across a range of devices. The aim of SIMON and SPECK is to fill the need for secure, flexible, and analyzable lightweight block ciphers. Each offers excellent performance on hardware and software platforms, is flexible enough to admit a variety of implementations on a given platform, and is amenable to analysis using existing techniques. Both perform exceptionally well across the full spectrum of lightweight applications, but SIMON is tuned for optimal performance in hardware, and SPECK for optimal performance in software.

It’s always fascinating to study NSA-designed ciphers. I was particularly interested in the algorithms’ similarity to Threefish, and how they improved on what we did. I was most impressed with their key schedule. I am always impressed with how the NSA does key schedules. And I enjoyed the discussion of requirements. Missing, of course, is any cryptanalytic analysis.

I don’t know anything about the context of this paper. Why was the work done, and why is it being made public? I’m curious.

Posted on July 1, 2013 at 6:24 AM • 34 Comments


ThreeTwoTwo July 1, 2013 6:36 AM

“and is amenable to analysis using existing techniques.”

Reviewer’s note: Please replace ‘existing’ with ‘publicly accessible’.

foosion July 1, 2013 6:41 AM

Should we be concerned that the NSA publishes or approves ciphers because it has made advances in deciphering (either hardware or analysis) and its approved ciphers are vulnerable, or at least more vulnerable than others?

nobnop July 1, 2013 6:49 AM

“The aim of SIMON and SPECK is to fill the need for secure, flexible, and analyzable lightweight block ciphers.” – the interesting word in that is “analyzable”…

Hugo July 1, 2013 7:03 AM

Sure, like we trust anything coming from the NSA!
Hell, like we trust anything coming from the USA!!

Alex S July 1, 2013 7:18 AM

I suspect that they know that there are lots of talented people outside of the NSA, and they want to know what those folks will say about the algorithms.

They must have figured that for one reason or another, what they’d gain from the outside comments will make up for the loss of secrecy on these algos. Maybe they don’t use them, so they’re not that valuable.

Maybe they have a log of what Snowden took, and they suspect he’s given the Russians information about them, so they’ve been burned anyway.

Spaceman Spiff July 1, 2013 7:33 AM

Interesting? I’m sure. Technologically sophisticated? No doubt. Back doors or decryption-enabled by the NSA? Most likely…

Someone July 1, 2013 7:38 AM

Even the cryptogeeks at the NSA aren’t stupid enough to think the industry would adopt either of these without being able to take a look at the source code and underlying math.

So with that in mind, assuming these are cryptographically sound algorithms, there’s no reason to dismiss them outright.

phred14 July 1, 2013 7:47 AM


Can a backdoor or trapdoor be hidden from a good crypto person who has access to the description and source code?

In other words, could someone like you “bless” SIMON and/or SPECK?

Peter July 1, 2013 8:00 AM

A cynical answer to “Why is it being made public?” might be “As part of an ongoing effort to distract attention away from PRISM”.

Bruce Schneier July 1, 2013 8:27 AM

“A cynical answer to ‘Why is it being made public?’ might be ‘As part of an ongoing effort to distract attention away from PRISM.’”

I don’t think so. The NSA can’t possibly move that fast.

Bruce Schneier July 1, 2013 8:41 AM

“Can a backdoor or trapdoor be hidden from a good crypto person who has access to the description and source code?”

The code is not relevant here; the question is whether a back door could be hidden in the mathematics of the cipher, like this.

It’s hard. Basically, the NSA would need to have a cryptanalytic technique that is 1) powerful enough to practically break the algorithm, and 2) unknown to the academic world. And it’s risky. Once the algorithm is out there, there’s a good chance that we in the academic community would figure out the technique. (When the NSA updated SHA to SHA-1, it didn’t take that long for the academic community to figure out why.)

So, maybe, but I don’t think so.

name.withheld.for.obvious.reasons July 1, 2013 8:42 AM

Verification of cryptographic applications, irrespective of the layer or distribution model, is not well established in the commercial realm. Certificates, hashes, and checksums are insufficient methods for matching audited source code to the source-control methods, configuration-management systems, build process, and binary distribution.

Remember the Adobe certificate hijack event? Prior to that was a Tor Project discovery that exposed DigiNotar, which seems to point to a GTE forgery. Whether it was a MITM or some other attack is not clear (some theories turn on a BGP exploit).

Microsoft Root CA, Google, and some other CAs. I don’t know what happened in 2011 that seems to have perturbed the PKI system(s). There also seems to be a related event in 2006.

In short, securing communications or data on the internet may be a real trick, if it can be done at all. As Bruce reiterates over and over again, perfect security is perfectly impossible.

Kevin July 1, 2013 8:43 AM

Releasing new encryption algorithms isn’t going to distract anybody. This story won’t rise above the level of a tech blog.

“Backdoors”? No: the only code in the paper is in an appendix and is very short. Vulnerable to an attack? Possibly, but anyone implementing this without waiting for some independent analysis is foolish.

arfnarf July 1, 2013 8:46 AM

The article explains at some length that the purpose of these ciphers is to provide solutions for lightweight devices (insulin pumps and car brakes are the examples they give).

This is simply your taxpayer dollars at work – no need to invoke conspiracy theories here.

Just because some parts of the NSA are a bit suspect these days doesn’t mean that the NSA Research Directorate is also bad.

Clive Robinson July 1, 2013 9:50 AM

Folks should remember that in theory the NSA has two contradictory missions,

1, Protect the communications of the USA that affect US National Security.

2, Gain access to and understanding of the communications of other nations and nationals (which are assumed to always affect US National Security).

The problem is that “National Security” has a definition as broad as those formulating US policy require, and in the past this has been used to justify spying on any company doing business in the same market as US companies, irrespective of whether the other entities are US-based or not (note this is not unique to the US: the French openly indicated that they did this many years ago, and many other nations have been caught at it in the past).

This “dual role” would make many people and organisations schizophrenic/paranoid, and this has certainly appeared to be the case with the NSA in the past.

One trick they grabbed with both hands in the old days of mechanical cipher machines was to have a large key space where some keys were strong and others weak to very weak in some non-obvious manner. Because the NSA was responsible for the keying material used by the US, it could ensure the use of the strong keys. However, anyone capturing the equipment and re-using the design without the required knowledge would end up using a random cross-section of strong and weak keys. Frequently this meant that 20-25% of comms could be broken quickly, and these provided useful information (known plaintext etc.) for breaking the harder keys.

They refined this idea, and it became apparent with Clipper that they had moved on to the idea of brittle ciphers, where any slight change in the design would weaken security down to around 40 bits’ equivalent.

With AES they in effect fixed the competition rules such that the code on the NIST site was not only freely downloadable and usable by anyone, it was also optimised for speed/efficiency, not security. Thus the code that went into nearly all products and code libraries was full of timing-based side channels etc.

More recently it looks like they are using people’s poor knowledge of random number (and sequence) generators to gain access by way of poorly selected or re-used key material and nonces in protocols and standards.

We also know from Bob Morris senior [1], who was one of the technical/scientific seniors at the NSA, that “known plaintext” is still of major interest to the NSA. And as we see with MS Office products, known plaintext in very large quantities appears in nearly all user-generated files.

There are various other predictors we can use for NSA behaviour, and if people care to read back on this blog, various people have pointed out what the likes of the NSA and GCHQ et al have been up to for a number of years.

As has been seen, the battle over crypto algorithms is largely over, and the NSA kept winning until it became clear they could not keep feeding the equivalent of lies to the politicos. But by then the game had moved on; in that respect nothing really has changed. However, whilst they are still winning the war, the tide is turning against them, and the technical distance between them and the open/academic community is closing, if it has not already closed in some areas.

However, the NSA, whilst not having a monopoly on mathematicians and people with brains, has a number of advantages over the open/academic community. One is funding that, whilst not unlimited, can certainly be used for what is in effect blue-sky research, whilst the academic community has to find its funding in an open and accountable way and thus has to keep its feet closer to the ground.


David Magda July 1, 2013 9:50 AM

I’m curious to know whether the NSA will also be releasing stream ciphers at some point. As Mr. Schneier has mentioned before, there’s a need for them, more so now because of recent attacks on TLS: RC4 is weak-ish, and not much software supports TLSv1.2 (which has AES-GCM).

While it’ll take a while to go through the process, having a few more widely vetted algorithms wouldn’t be a bad idea.

Johnston July 1, 2013 10:12 AM

@David Magda

Salsa20 is a very fast stream cipher introduced in 2005 by djb. It was entered into the eSTREAM competition and made it into the final portfolio. It runs at about 4 cycles per byte in constant time. The best cryptanalysis breaks 8 of 20 rounds at 2^251 work; there are no attacks on the full 12- or 20-round variants.

It’s the cipher used in DNSCurve.
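To illustrate the constant-time claim: Salsa20’s core mixing step uses only 32-bit addition, rotation, and XOR, so a straightforward implementation has no table lookups or data-dependent branches. A minimal Python sketch of the quarter-round as given in the published spec:

```python
MASK = 0xFFFFFFFF  # all arithmetic is on 32-bit words


def rotl32(x, r):
    """Rotate a 32-bit value left by r bits."""
    return ((x << r) | (x >> (32 - r))) & MASK


def quarterround(a, b, c, d):
    """Salsa20 quarter-round: add, rotate, XOR only -- no S-boxes,
    no multiplications, nothing indexed by secret data."""
    b ^= rotl32((a + d) & MASK, 7)
    c ^= rotl32((b + a) & MASK, 9)
    d ^= rotl32((c + b) & MASK, 13)
    a ^= rotl32((d + c) & MASK, 18)
    return a, b, c, d
```

The full cipher builds its 20 rounds out of column and row applications of this one function.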

Tree July 1, 2013 10:29 AM

“A cynical answer to ‘Why is it being made public?’ might be ‘As part of an ongoing effort to distract attention away from PRISM.’”

“I don’t think so. The NSA can’t possibly move that fast.”

Plus, other than also having a five letter acronym, this is possibly the most boring story ever for normal media.

Jeremy July 1, 2013 12:11 PM

Maybe the Prism documents include info about (or leading to) weaknesses in existing ciphers?

That would motivate them to release known-good replacements.

Bruce Schneier July 1, 2013 1:32 PM

“Maybe the Prism documents include info about (or leading to) weaknesses in existing ciphers?”

Possible. I think it’s very likely that the NSA knows what documents Snowden has. Or, at least, a superset of documents that he has.

So when you watch their damage control, assume that they’re already controlling for damage that has not actually occurred yet.

Speck July 1, 2013 1:55 PM

“These and other important differences make SPECK significantly lighter and faster than Threefish in both hardware and software.”

How much faster are we talking?
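The paper doesn’t give a single speed-up figure, but SPECK’s software efficiency is visible in its round function, which is pure ARX (add-rotate-xor). A minimal sketch of one round, assuming 32-bit words and the rotation amounts 8 and 3 that the paper uses for all but the smallest variant:

```python
WORD = 32              # Speck64/xx word size (assumed for this sketch)
MASK = (1 << WORD) - 1


def ror(x, r):
    """Rotate a WORD-bit value right by r bits."""
    return ((x >> r) | (x << (WORD - r))) & MASK


def rol(x, r):
    """Rotate a WORD-bit value left by r bits."""
    return ((x << r) | (x >> (WORD - r))) & MASK


def speck_round(x, y, k):
    """One SPECK round: modular addition supplies the nonlinearity;
    rotation and XOR do the diffusion. A handful of cheap CPU ops."""
    x = ((ror(x, 8) + y) & MASK) ^ k
    y = rol(y, 3) ^ x
    return x, y
```

Each round is invertible given the round key k, so decryption simply runs the three steps backwards.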

Carpe_Noctem July 1, 2013 2:04 PM

@ Clive Robinson

“When the NSA updated SHA to SHA-1, it didn’t take that long for the academic community to figure out why.”

Bruce is focused on the security aspect: the NSA updated SHA, and the community studied it to figure out why. What gets me is that, like your references to AES, almost every spec they release most likely has some attack known to them, yet everyone seems to trust their stuff. You and I have talked about side channels before, but they continue to be heavily overlooked and, more importantly, are hard to find.

One example I like to use for this is the OpenBSD boondoggle. Everyone remembers the allegations, and remembers it coming out that, yeah, they were trying to implement backdoors… but they say they weren’t successful and that it was dropped. What no one really pays attention to is the latest update from Perry, where he says, “I personally believe that the FBI, or at least certain officials within the administration at that time, willingly advocated the relaxation of encryption export regulations only due to their discovery of critical vulnerabilities and weaknesses in the RSA encryption algorithm not exhibited by the predominant public key encryption method used at the time which was Diffie-Hellman.”

My guess is that side channels have become so (relatively) easy for the NSA to obscure that just being involved in the development of a project gives them the ability to implement attack vectors that are not noticeable at all even to very experienced auditors. Once one is found for a certain algo, they push for its adoption as a standard. Which means almost every piece of “secure” infrastructure has been deliberately weakened.

Another good quote on the subject comes from Eben Moglen, talking about the ’90s cryptowars. He says that in ’95 at Harvard, after a debate about the right to encrypt, Stuart Baker (former NSA General Counsel) said, “…public key encryption will become available. We fought a long losing battle against it, but it was just a delaying tactic…”

Very telling, if you ask me.

Mike July 1, 2013 2:32 PM

Any glimpse into the NSA’s technology is a rare and exciting treat for an outside cryptographer. Watching how well it holds up under analysis from the crypto community will be especially interesting.

SIMON’s nonlinearity comes only from that single AND operation, which means that if it holds up to mathematical attacks, it will be especially amenable to power-analysis protection.
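To make that concrete, here is a minimal sketch of one SIMON round (32-bit words assumed, as in Simon64): the single AND of two rotations of the same word is the only nonlinear operation, while XOR and rotation are linear over GF(2).

```python
WORD = 32              # Simon64/xx word size (assumed for this sketch)
MASK = (1 << WORD) - 1


def rotl(x, r):
    """Rotate a WORD-bit value left by r bits."""
    return ((x << r) | (x >> (WORD - r))) & MASK


def simon_round(x, y, k):
    """One Feistel round of SIMON. The AND is the sole source of
    nonlinearity, which keeps the per-round algebraic degree low and
    makes masking countermeasures against power analysis cheap."""
    f = (rotl(x, 1) & rotl(x, 8)) ^ rotl(x, 2)
    return y ^ f ^ k, x
```

Because it is a Feistel round, it inverts for free: swap the halves, recompute f on the unmodified half, and XOR it back out with the round key.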

D. G. July 1, 2013 7:08 PM

There’s mention of SIMON and SPECK in 2012:
“A Do-It-All-Cipher for RFID: Design Requirements (Extended Abstract)”
Markku-Juhani O. Saarinen and Daniel Engel
From ECRYPT II, DIAC (Directions in Authenticated Ciphers), 5–6 July 2012, Stockholm, Sweden

The U.S. National Security Agency has recently published performance and implementation footprint numbers for their in-house developed lightweight block cipher families SIMON and SPECK [4]. …
…However, in order to fully benefit commerce, industry, and the general public, the algorithm details must also be released. This will make the algorithm standardizable as most international bodies are reluctant to blindly trust technology that has its origins within the security apparatus of any one nation. …

4. Beaulieu, R., Shors, D., Smith, J., Treatman-Clark, S., Weeks, B., and Wingers, L. Performance of the SIMON and SPECK Families of Lightweight Block Ciphers. Tech. rep., National Security Agency, May 2012.

Now, there’s a possible (innocent) explanation for the specification release.

Clive Robinson July 1, 2013 10:51 PM

@ Mike,

    SIMON’s nonlinearity comes only from that single AND operation, which means that if it holds up to mathematical attacks…

It’s quite an “if” when you consider that the two inputs to the AND gate are rotations of the same word (a “self rotation”), so its output is biased rather heavily toward zero.

If you look at historic designs of stream ciphers using LFSR-style PRNGs with the nonlinearity provided in either the feedback or the output, it was usually derived from circuits using two independent AND operations feeding an OR operation, or some similar method, to try to balance the output state whilst still maintaining nonlinearity.
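That two-ANDs-into-an-OR shape is essentially the classic Geffe generator’s combiner, where one LFSR bit selects between the other two: the output stays balanced while remaining nonlinear. A minimal sketch (LFSR machinery omitted; the function name is mine), with the caveat that the Geffe combiner is itself famously vulnerable to correlation attacks:

```python
def geffe_combine(a, b, c):
    """Nonlinear combiner over single bits (0/1): b selects between
    a and c, built as two ANDs feeding an OR. Balanced (four 1s over
    the eight possible inputs), but the output is correlated with a
    and with c individually, which correlation attacks exploit."""
    return (a & b) | (c & (b ^ 1))
```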

paul September 14, 2013 3:23 PM

I wonder how Speck compares with Skipjack, another 8-bit-friendly block cipher (64-bit block, 80-bit key) released by the NSA some years back as a remnant of the Clipper chip.

Patriot May 31, 2018 8:53 PM

“And I always like seeing NSA-designed cryptography (particularly its key schedules). It’s like examining alien technology.”

Phone home…

wiley April 8, 2019 4:48 PM

Unfortunately, SIMON is already becoming popular for RFID tags (meaning it will potentially see hundreds of millions of uses) because AES is extremely bloated compared to it. On the upside, the amount of data encrypted in RFID is on the order of a kilobit, which is small enough that any severe weaknesses in the cipher are unlikely to be exploitable. The only reason I’m concerned about its adoption in RFID is that it gives people a reason to trust the NSA, which I don’t think they deserve. If SIMON and SPECK turn out fine, then people will more blindly adopt anything else the NSA releases in the future, even things which turn out to be backdoored or otherwise weak.

My only technical criticisms of SIMON are its use of a single AND operation for non-linearity, and the fact that it doesn’t use pre- and post-whitening, which is pretty important for a Feistel network.

I think it would have been better if PRINTcipher had been used for RFID instead… Sadly, the person who popularized SIMON by publishing tools for verifying it in silicon hadn’t heard of it at the time.

wiley April 8, 2019 4:56 PM

“Unfortunately, SIMON is already becoming popular for RFID tags”

To expand on that: in a chat I had, the person who popularized SIMON in silicon mentioned that a highlight from the IEEE RFID conference was that SIMON will likely be included instead of AES in the GS1 Gen2 v3 standard for RFID.
