Two New Papers on the Encryption Debate

Seems like everyone is writing about encryption and backdoors this season.

I recently blogged about the new National Academies report on the same topic.

Here's a review of the National Academies report, and another of the East West Institute's report.

EDITED TO ADD (3/8): Commentary on the National Academies study by the EFF.

Posted on March 12, 2018 at 6:27 AM • 21 Comments


AlexT March 12, 2018 10:09 AM

Unless I missed something, I don't see any major new idea, concept, or solution in those two contributions.

Jason McNeill March 12, 2018 11:12 AM

My reaction to the first paper, the one written by the R Street Institute:

The government has demonstrated a poor implementation of security principles in its protection of its own online assets. This is probably because it takes extensive security expertise and experience to build an application that offers sufficient deterrence.

But one way the government can implement security is to store a master encryption key on a hardware device that (A) is designed to be resistant to tampering, (B) is not connected to any wired or wireless network, and (C) is physically shielded and guarded against outside attack. This is simply a matter of using a Hardware Security Module (HSM), which can satisfy all of the above requirements.

For the record, I am against measures that mandate encryption back doors in software applications and/or in hardware. But if the government did go down this route, and if the courts declined to intervene, then the best policy the government could implement would be a technological solution that minimizes dependence on human know-how. If they're going to force it on us (and I hope they don't), then at least make the solution technologically sound. I'm convinced that such a solution is best achieved by way of HARDWARE, specifically hardware that is resistant to physical tampering and that is designed to store encryption keys and perform a very limited set of cryptographic operations, all of which occur entirely within the device and thus cannot be observed directly.

The key to storing a master key is hardware.

Wo March 12, 2018 4:38 PM


What? Tamper resistant may as well mean open to all when your adversary is a nation state and the reward is the cryptographic key to the kingdom.

Jason McNeill March 12, 2018 4:51 PM



"Tamper resistant may as well mean open to all when your adversary is a nation state..."

Here's what I wrote:

"(C) is physically shielded and guarded from outside attack"

Imagine a Fort Knox-style room, under military guard, protecting the HSM just as top-secret physical documents are protected. Contained within a complex, and within that complex, contained within a room that is insulated from electronic monitoring (i.e., a Faraday cage). Within that room is a computer with an HSM attached. There is no WiFi, there is no network cable, there is no network connection, period. That's the scenario. In order to successfully decrypt information, you have to send it into the HSM, where it is decrypted, and the plaintext is the response from the HSM.

If something is still vulnerable under even those circumstances, then the vulnerability is not in the HSM itself.

Sancho_P March 12, 2018 6:09 PM

@Jason McNeill

Wait, there may be a fundamental flaw (or two):
Fort Knox: The USA is not the only country on our canoe / spaceship.

- Where would your Fort be? China?
- If in the US, who would have access, Trump or Hillary (party)?

Sancho_P March 12, 2018 6:13 PM

If, as the EFF wrote, these reports are undertaken in good faith,
then the intended outcome is not to make progress.
On the contrary, the intention is to require more studies.

Mutually broadcasting papers doesn’t help.
To make progress, it would be mandatory to have both sides, LE and security advocates, write down and openly discuss their arguments point by point.
Only I’m afraid it’d be difficult to name experts arguing for the LE side.

Jason McNeill March 12, 2018 7:34 PM

The EFF is right to point out that it does a disservice to the security and privacy of citizens to fixate on "how" to require backdoors while sidestepping the issue of "whether" to do so.

I'll say this: if the FBI can simply circumvent a phone's encryption security by physically attacking its TPM (as they appeared to do in the aftermath of the San Bernardino shooting), then they already have their "how" (at least for the iPhone model in question).

Jon (fD) March 13, 2018 1:06 AM

@ Mr. McNeill,

Okay. They already have the how. So what they're asking for is just 'how to make it easier'. Sticking the key in Ft. Knox does not make it easier.

(oh, and without connection, how does the encrypted data get into Ft. Knox and how does the plaintext get out again? Handwritten on paper?)

The easier you make it for them to get the bad guy, the easier you make it for them to get the wrong guy.

And, as Sancho_P already pointed out, what makes you think the cops are always the good guys? Would you like the Saudi Arabian religious police to investigate your every phone message and everyone from your high school?


Jason McNeill March 13, 2018 10:15 AM

@Jon (fD):


"So what they're asking for is just 'how to make it easier'. Sticking the key in Ft. Knox does not make it easier."

As I have already written in my first comment on this post, I oppose the government mandating backdoors into privately made hardware and software. I want it never to happen; the last thing I want is to make it easier, which is the government's motive. But did you read the academic papers that Bruce posted? Did you perceive the argument being made? The authors claim that there are two sides to the debate and that both are somehow locked into rigid, ideological positions. The very act of refusing to engage on the merits -- which is what you are doing -- merely reinforces their narrative. I am only advocating that if national secrets -- e.g., encryption keys -- need to be kept, they should be kept on hardware that is shielded, disconnected, and guarded. If you oppose storing encryption keys on hardware that is specifically designed to protect them -- hardware on which the keys are born and which they never leave during their lifetime -- then what's the alternative? What are you proposing? Public disclosure of national secrets? If not that, then what?


"how does the encrypted data get into Ft. Knox and how does the plaintext get out again? Handwritten on paper?"

I posed the hypothetical Ft. Knox as an example of a fortress where physical access is designed to be difficult, authenticity of one's identity is confirmed, authorization is confirmed, and the entire event is logged and subjected to constraints that are objectively defined by policy. This is the principle.
How the ciphertext to be decrypted gets into the room where the computer + HSM sit is up to policymakers. The ciphertext could consist of a mere symmetric key that has been encrypted, and all that would be needed is to feed the short ciphertext of that key -- encoded as Base64 or hex -- into the HSM. It may only be a short string, perhaps 64 characters long. And yes, it may actually be written on a piece of paper. Or not. The principle still applies, and it is this: if national secrets are to be encrypted, then the encryption keys should be made secure. If you continue to oppose me, then you're opposing this principle, because I'm not arguing anything else besides the principle.
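The wrapped-key flow described above can be sketched in a few lines of Python. `ToyHSM` here is a hypothetical stand-in for real HSM hardware -- a production device would be driven through an interface such as PKCS#11 and would use a standardized key-wrap algorithm such as AES-KW -- and the HMAC-derived keystream is a toy illustration of the principle (the master key is generated inside and never leaves), not production cryptography:

```python
import base64
import hashlib
import hmac
import os

class ToyHSM:
    """Hypothetical stand-in for an HSM: the master key is created inside
    and is never exported; callers only get wrap/unwrap operations."""

    def __init__(self):
        self._master_key = os.urandom(32)  # "born" on the device, never leaves

    def _keystream(self, nonce: bytes) -> bytes:
        # Toy PRF-based stream cipher; a real HSM would use AES key wrap.
        return hmac.new(self._master_key, nonce, hashlib.sha256).digest()

    def wrap_key(self, data_key: bytes) -> str:
        assert len(data_key) == 32
        nonce = os.urandom(16)
        ct = bytes(a ^ b for a, b in zip(data_key, self._keystream(nonce)))
        return base64.b64encode(nonce + ct).decode()  # short Base64 string

    def unwrap_key(self, wrapped: str) -> bytes:
        blob = base64.b64decode(wrapped)
        nonce, ct = blob[:16], blob[16:]
        return bytes(a ^ b for a, b in zip(ct, self._keystream(nonce)))

hsm = ToyHSM()
data_key = os.urandom(32)         # the per-message symmetric key
wrapped = hsm.wrap_key(data_key)  # only this ciphertext leaves the room
assert hsm.unwrap_key(wrapped) == data_key
print(len(wrapped))  # 64 Base64 characters for the 48-byte blob
```

Incidentally, a 16-byte nonce plus a 32-byte wrapped key encodes to exactly 64 Base64 characters -- about the length of the short string described above.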


"what makes you think the cops are always the good guys?"

What makes you think that I think that?

Jon (fD) March 13, 2018 12:21 PM

@ Mr. McNeill,

Thanks for the lengthy commentary. Pardon me if I don't quote it while I respond:

What's on my phone is not a national secret. It's MY secret. What they want to do with their secrets is up to them.

Yes, I am being intransigent because although there are two sides to every debate, sometimes one side is just ridiculous, and should be called out as such. For example, the Earth is not flat, despite some people arguing that it is.

If you can interact with it in any way (even handwritten on paper), there is an attack surface. And anything worth attacking will be attacked.

And finally, I think you think that because you think the government can implement security (with their own backdoor) on MY device. What they do with their own devices is their problem. What they mandate to be done with mine is very much my problem.

And I once again reiterate Sancho_P: Which government?


de la Boetie March 13, 2018 2:32 PM

One thing I didn't see in the analysis of these policy documents was the provenance of the sources. Looking at them, I'd not trust either an inch.

They talk about stakeholders. They don't mean the proles; that doesn't mean democracy. They only mean "influential stakeholders". As for R Street, they're free-market wonks. So the EFF is quite right about the false framing.

As usual with this kind of representation, input to policy misses out on many of the aspects mentioned above, namely a grossly biased view of the total costs and benefits of any particular action, which spookily removes risk from some parties and rewards them, while transferring risks and costs onto -- yes, you guessed it -- the proles.

As far as the "debate" is concerned, I don't expect any. It'll just be lip service to basically unacceptable stuff served up by the Beloved Leaders, whilst furiously ignoring some realities, both technical and economic.

Jonathan Wilson March 13, 2018 7:43 PM

My personal view is that if the only way to catch or convict a particular suspect, or to solve a particular crime (or to gather evidence about a suspect's motives, affiliations, possible accomplices, other crimes they may have committed in the past or might commit in the future, or potential crimes that might be committed by other people), is via a general backdoor that affects a whole population of users, then I would rather that crime go unsolved, that suspect go uncaught, and law enforcement go without the information about accomplices, affiliations or related crimes, than have the backdoor putting everyone's security at risk.

And yes I would still hold that view even if the crime in question involved the potential death of one or more people.

Sancho_P March 14, 2018 8:12 AM

@Jonathan Wilson

Me too, but:
What at first sounds brave is actually old school thinking (in dubio pro reo).
Be aware that nowadays all we need for war is a tweet, not guilt.

I guess the reason is in our genes, in nature:
We somehow “know” that limiting the aggressor (= mankind) is mandatory.

echo March 14, 2018 6:03 PM

@Jonathan Wilson, Sancho_P

I'm very likely repeating what both of you said just from a different perspective.

The mistake the advocates of backdoors keep making is that they forget the overwhelming majority of investigation leads come from the general public. Criminals can in part be created or nurtured by society, but society is also what moderates the worst excesses of human behaviour.

I have noticed the habit of some people within the bureaucratic hierarchy to both shape discussions and only listen to higher-ranking job titles or institutions. They don't seem much able to deal with real people.

Sancho_P March 15, 2018 1:59 PM


Not sure if I understood everything you wrote, but your last paragraph is on the mark:
Authoritarian states run on a simple military hierarchy: Yes, Sir.
Compliance is rewarded with money, pomp and glory.
Critical thinking is suppressed by all means, of course silenced [1], at the very least never discussed.
In a democracy each individual is welcome not only to have their own opinion but also to vent it, in public and in private (plus in the second family, the business). An honest, open discussion would inform, educate and shape generally accepted judgement.

Regarding “going dark” we only hear from the monkeys in suits; the real people there are not allowed to speak. That would be suicidal for them, but now it is for society.

[1] That’s what they actually want the providers to do on the Internet: hide the “wrong” speech. This is ill-considered on two points: first, what is hidden can’t be discussed, and second, we have law and LE to deal with dangerous content and people. There is a reason why justice is (was) not privatized.

Herman March 18, 2018 5:59 AM

The problem is that if the government has a backdoor, then the government can plant false evidence on the device.

In practice, it is much worse. A backdoor allows anyone with sufficient knowledge or money to plant false evidence on the device.

Therefore the presence of a backdoor destroys the usefulness of the device both to its owner and to the courts.

Mark March 18, 2018 6:44 AM


That argument has been valid for decades with unknown or known-but-unpatched exploits, remote or local. Just because something is encrypted (a disk or a file) does not mean you did it.

Sancho_P March 18, 2018 2:09 PM


Right, but “they” would never acknowledge that.
Because, taken a step further, even without a backdoor, for anything found on our devices there is no evidence of who put it there.

Our devices are not secure.
This is at the very core of the “going dark” issue.

Funny thing is, neither side of the encryption debate stresses that fact.

Clive Robinson March 18, 2018 4:18 PM

@ Sancho_P, Herman,

"Because, taken a step further, even without a backdoor, for anything found on our devices, there is no evidence who has put it there."

Yup, and it's one of those problems people keep skating over. Let's be blunt about this: even back in the days of MS-DOS 3 and earlier, when a computer could be run off a single floppy disk --and frequently was-- even though Microsoft provided "", it was way beyond all but a very, very tiny fraction of computer users to say what was on every track and sector of that floppy. Worse, they mostly could not tell whether a .com or .exe file was original or not. Nor had they a clue about the early stages of the boot-up process, where MBR malware could put code into memory below the DOS and BIOS interfaces such that it was in effect invisible...

Now we have multi-terabyte drives, some of which can supposedly hold the entire life experience of anyone less than thirty years old. Thus there is no way an individual can check a modern hard drive.

And forensic tools are not exactly infallible; in fact the opposite is true. Thus you are at the point where you can in no way trust the computer in front of you, even if it has never been connected to a network and the user/owner has never put in any removable media other than that supplied by the OS, application and device vendors.

We have seen CD-ROMs issued by Microsoft carry a Word macro virus, we have seen Lenovo portable computers ship with malware factory-installed by the vendor, and Apple ship audio devices with an MS-OS virus on them, "allegedly" picked up in the supply chain. As for Adobe and Oracle, their products are regarded by many as so buggy that they are either malware or backdoors by default... The list is long, and now, with inbuilt wireless technology, there is without doubt no way a user can know what is on their computer unless they know how to energy-gap it and then fully sanitize it. Due to the way modern tech is designed, not even the design engineers at the vendor who designed the computer can fully sanitize it.

Which means there is not really any such thing as "reliable" technical evidence. You have to be able to demonstrate not just HumInt evidence but HumInt with contact evidence, otherwise it does not constitute legally presentable evidence.

In a proper investigation there should be, as in all reliable science, two avenues of enquiry. The first is to collect information that might be evidence and try to make connections that unambiguously tie it to an individual. The second is to collect all information and try to disprove any ties to an individual.

One of the biggest problems in the UK criminal investigation system is making assumptions and building cases around those assumptions, whilst totally failing to look for evidence to disprove a case, or, worse, hiding any such evidence found, even to the point of police officers perjuring themselves in court... Even when caught withholding evidence that would clear a defendant, there is little or no punishment handed out for having totally destroyed an innocent person's life...

I'm quite certain this is not just a UK issue. In theory the French system, where an independent magistrate goes through ALL the evidence as the case progresses from the report of the crime to the trial, is not perfect. But politicians do not like it as it lowers the conviction rate. It's why politicos want "plea bargaining" with significant barbs: you will be found guilty of something once the finger is pointed. If you do opt to go to trial, they tell you "you will never be free" as an opening gambit, then they will bankrupt you and your family such that you cannot get competent counsel to defend you. Then, if you are found innocent, they will fight tooth and claw to stop you claiming back your costs.

I don't know about other people on this blog, but that in no way sounds like justice to me...

Sancho_P March 18, 2018 7:13 PM

@Clive Robinson

No, it’s not justice, but business ("plea bargaining").
I’d only want to add one point:
In our legal system, developed over decades, although the prosecutor was already obliged to search for exculpatory evidence as well, it was a fundamental principle that the defense counsel had access to the full body of evidence, pro and contra, and to the methods used to find it.
Everything had to be traceable, point by point.
We must not abandon this principle.

justinacolmena March 21, 2018 7:04 PM


"defense counsel"

I've had a few of those. They don't have the time of day to look at evidence or weigh pros and cons or any of that. All they ever do is take a deal and plead their client guilty for a reduced sentence, where "reduced" means anything less than the maximum that could possibly be imposed by whatever statute(s) and charge(s) might be brought against you. And it's all statutory law. There is no Constitution when you have a "defense counsel." You find that out mighty fast.

Lose your rights for life over something that was never more than a verbal argument to begin with, and come to find out, you never had rights to begin with in fascist America.


Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.