New Paper on Encryption Workarounds

I have written a paper with Orin Kerr on encryption workarounds. Our goal wasn’t to make any policy recommendations. (That was a good thing, since we probably don’t agree on any.) Our goal was to present a taxonomy of different workarounds, and discuss their technical and legal characteristics and complications.

Abstract: The widespread use of encryption has triggered a new step in many criminal investigations: the encryption workaround. We define an encryption workaround as any lawful government effort to reveal an unencrypted version of a target’s data that has been concealed by encryption. This essay provides an overview of encryption workarounds. It begins with a taxonomy of the different ways investigators might try to bypass encryption schemes. We classify six kinds of workarounds: find the key, guess the key, compel the key, exploit a flaw in the encryption software, access plaintext while the device is in use, and locate another plaintext copy. For each approach, we consider the practical, technological, and legal hurdles raised by its use.

The remainder of the essay develops lessons about encryption workarounds and the broader public debate about encryption in criminal investigations. First, encryption workarounds are inherently probabilistic. None work every time, and none can be categorically ruled out every time. Second, the different resources required for different workarounds will have significant distributional effects on law enforcement. Some techniques are inexpensive and can be used often by many law enforcement agencies; some are sophisticated or expensive and likely to be used rarely and only by a few. Third, the scope of legal authority to compel third-party assistance will be a continuing challenge. And fourth, the law governing encryption workarounds remains uncertain and underdeveloped. Whether encryption will be a game-changer or a speed bump depends on both technological change and the resolution of important legal questions that currently remain unanswered.

The paper is finished, but we’ll be revising it once more before final publication. Comments are appreciated.

Posted on March 22, 2017 at 6:23 AM • 42 Comments

Comments

mark hutchinson March 22, 2017 7:40 AM

Part I:
1. “Any computer data can be encrypted” -> “Any data can be encrypted”

  2. “2^256 possible keys, a number twice as long as the previous number.”
    Twice as “long” doesn’t really convey how much larger, since 2^129 is twice as large as 2^128.
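To make the magnitude concrete, here is a quick Python sketch of that point (mine, not from the paper):

```python
# Comparing keyspace sizes: a key "twice as long" squares the keyspace;
# adding a single bit merely doubles it.
small = 2**128   # 128-bit keyspace
large = 2**256   # 256-bit keyspace

print(large // small == 2**128)   # True: 2^256 is 2^128 times larger
print(2**129 // 2**128)           # 2: one extra bit only doubles it
```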

Rufo Guerreschi March 22, 2017 8:03 AM

Scalable encryption workarounds, helped by endpoint-hacking automation tools (NSA TURBINE, NSA FOXACID, HT RCS), are most likely accessible to large numbers of criminal and state actors.

Creating endpoints that are radically more secure could be done by radically increasing oversight, accountability, transparency (as Bruce said in Brussels 2015) and security review relative to complexity.

But then those endpoints should have ways by which LEAs can access specific info remotely and undetectably, following due legal and “constitutional” process, all the while not causing unacceptable additional privacy risk in comparison to current alternatives.

We think we can deliver on the challenge; join us:

http://www.trustless.ai

mark hutchinson March 22, 2017 8:24 AM

Part II
1. “Find the Key”
You seem to repeat yourself in your examples. Although I understand what you’re writing in this section, the wording seems awkward. I suggest a rewrite. I like the use of Scarfo in your example.

  2. “Guess the Key”
    I suggest you add a “Try the Key” section to break up this material.

  3. “Exploit A Flaw in the Encryption Scheme”
    I think it would be helpful to clearly define the term “encryption scheme”.

Clive Robinson March 22, 2017 8:27 AM

@ Bruce,

I think the “smash the car window” analogy on page 23 for breaking the encryption is incorrect; using a slide hammer or drill on the lock would have been better. Smashing the window would be more appropriate for an end-run attack via another weak application or the OS.

Also, in the last two paragraphs on page 30, the first says four things are to be considered but only covers three; the fourth is covered in the second paragraph. It just reads very awkwardly and jolts the reader’s train of thought.

Also, unsurprisingly, it is US-centric with regard to the legislative aspects. Nor do you go into using multiple jurisdictions as a way of limiting the effects of judicial or other overreach, in a way that would highlight the wrong turn US legislators and the judiciary are currently taking.

After all, just because a US magistrate signs off on a warrant for the FBI, it does not make it legal for the FBI to break the law of other jurisdictions. Especially when in the past the US has talked of using a kinetic response to other countries hacking computers in the US…

Thus a follow up paper with someone who has expertise in treaties etc would be a logical next step.

Daniel March 22, 2017 9:33 AM

@bruce Our goal wasn’t to make any policy recommendations.

Fair enough, but I don’t think you can dodge the issue entirely; the paper needs to at least acknowledge that HOW debates on encryption are resolved trickles up to larger public policy recommendations. In the ultimate analysis, law enforcement is just a tool to achieve larger social purposes (such as ridding the country of substances perceived to be harmful to human health). Yet law enforcement is not the only tool to achieve that larger goal; one alternative is market-based solutions. So if debates over the 5A are resolved in favor of defendants, that has a negative impact on low-resource law enforcement; any negative impact on low-resource law enforcement has a negative impact on the war on drugs; and any negative impact on the war on drugs is just another reason to question whether law enforcement is the right tool to address drug issues.

So while I concur that you don’t need to make policy recommendations I think the paper is poorer for not acknowledging how these issues interconnect.

Thomas March 22, 2017 10:09 AM

I feel like 1–3 are basically the same, or should at least be under the same chapter.
Whether you find, guess, or compel the key, you use the key/passcode to get into the “car,” thereby saving the time of performing the more complex methods (4–6) to get to the unencrypted data.

My Info March 22, 2017 10:15 AM

@Bruce Schneier

Our goal wasn’t to make any policy recommendations. (That was a good thing, since we probably don’t agree on any.)

I really like that approach, and certainly not because I think you shouldn’t make policy recommendations or that you aren’t qualified to make them. No, not at all!

It’s just that those in a political or business position to make policy decisions these days do not seem to be informed, or, if they are informed, even willing to acknowledge the basic facts of how encryption works, how it is being worked around, and so forth.

When we as a society lack a basic agreement and honest acceptance of the technical facts of how encryption works and how it is attacked, we simply have no basis to generate any kind of “consensus” on any kind of policy related to encryption.

It’s like the “unfiltered” Internet access at the local library: a children’s reading room downstairs, men looking up pr0nography on the computers and masturbating upstairs, and women at the café slipping liquor into their coffee, trying to get into their pocketbooks and into their pants. We don’t have any policy for that, either, but our taxes are certainly paying for it.

OverQuantum March 22, 2017 10:42 AM

80-bit keys can be brute-forced by large national-intelligence agencies [21]

[21] links via [13] to “Cryptography Engineering… 2010”, but page is not specified.

Could anyone please clarify page or section in the book or link to this brute-force result or estimation?
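For a sense of scale in the meantime, here is a back-of-envelope sketch; the guess rate below is purely my assumption, not a figure from the book:

```python
# Rough brute-force estimate for an 80-bit key. The keys-per-second rate
# is an assumed figure for a large, well-funded attacker, not a citation.
keys = 2**80                              # size of the 80-bit keyspace
rate = 10**12                             # assumed keys tested per second
years = keys / rate / (365 * 24 * 3600)
print(f"~{years:,.0f} years to exhaust the keyspace")  # tens of thousands of years
```

Even at an assumed trillion guesses per second, exhausting the space takes tens of millennia, which is presumably why such brute force is attributed only to large national-intelligence agencies: massive parallelism is what shrinks the wall-clock time.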

Arthur March 22, 2017 10:51 AM

Very good read (although indeed a bit US-centric).

Page 14 contains “The guesser can remove the
encrypted file from the suspect’s computer…” where “copy” should probably be used instead of “remove”.

Another thing that I was not sure about was the distinction between “Compel the Key” and “Compel the Plaintext” (the second seems to be grouped under the first). There may be some relevant distinctions, especially when using a Steganographic file system.

Something that was maybe a bit missing was more details on the pros and cons of certain approaches. For example, key escrow seems simple, but how do you use it cross-jurisdiction, etc.

Maybe in line with that: what was not mentioned was the responsibility of the government to detail how evidence was obtained when exploiting a flaw (the “hoarding” of vulnerabilities discussion).

ab praeceptis March 22, 2017 11:35 AM

I like the paper, especially the approach to also include the legal perspective.

The most important take aways for me are:

  • (Particularly the US of A) government and agencies are extremely criminal. Laws – and even constitutional rights – are for them merely formalities that must somehow be bent or circumvented.
  • The question of equality and balance does not even enter the game. Example: the “judge” sending a witness to jail for not revealing a password the witness really doesn’t know.

Assume that fact could later be proven. So what? Neither the criminal LEAs nor the judge who didn’t care a rat’s a** about the witness’s constitutional rights have anything to fear.

  • Do not design code based on your expectation of “reasonably expected input”!
    Rather always inspect and handle the complete domain of possible inputs!

If you expect input e.g. to never be more than 100 characters (say, for a password) then expressly check for that and handle it properly.

  • Do not assume anything. Neither on the legal side nor on the technical one! Do not rely on any rights you might think to have; act as if you had none and were an agent in adversarial territory. And for the tech side, again: Know your domains and check and handle input properly. Expect and be prepared for evil input and for any opportunity of input to effectively be but an attack surface.

  • Do not rely on biometric checks.

Daniel March 22, 2017 1:09 PM

@My Info

It’s just that those in a political or business position to make policy decisions these days do not seem to be informed, or, if they are informed, even willing to acknowledge the basic facts of how encryption works, how it is being worked around, and so forth.

Exactly. I argue that the reason this is true is that, to a large degree, these policy makers do not understand the connection between how encryption debates are resolved and the policy outcomes they desire. As I said, they don’t understand how resolutions of debates regarding the details of encryption “trickle up” to larger policy debates. There is a section in the paper where Bruce and Orin note that one resolution of the debate about the 5A is likely to result in the federalization of certain crimes. But that conclusion is not policy-neutral, because it ties into a whole plethora of debates regarding federalism, states’ rights, and the administrative state. If one is opposed to the administrative state, one should support a weak 5A interpretation, but if one supports a strong administrative state, one should support a strong 5A.

I don’t think the focus of the paper is to discuss let alone resolve such debates but I do think the paper is remiss in not pointing out these connections. These connections are there by implication in the discussion regarding federalization but in my view the paper would be much stronger if such implications were pulled out and confronted directly in a paragraph or two.

George March 22, 2017 1:47 PM

Good paper, thanks.

The only real question I had was about how encryption might be used by different kinds of bad guys, and therefore how different types of workarounds might be used by different kinds of governments and agencies.

For example, in-house expertise might be available to large criminal and government organizations, whereas smaller organizations, of both kinds, would have greater reliance on packaged implementations that might have known weaknesses. (Or at least, that’s one assumption.)

Nick P March 22, 2017 1:55 PM

@ Bruce Schneier

Attempt at review and revision

I first want to say the first pages are great in terms of laypeople likely understanding them. Now let’s see if I can help improve it. I wrote a bunch of stuff, then returned to my intro to note a recurring problem. The authors seem to love using commas to create compound sentences tying together multiple topics. I’ve seen research, or at least claims, that this creates some mental juggling for readers. Best to break them up into straightforward sentences ending with periods that create a smooth mental flow for the reader from one point to the next. Many of my fixes are like that.

First, on p3 I suggest this fix to make sentence less redundant:

“which is useless unless the ciphertext can be decrypted into the unencrypted readable form known as plaintext”

Take out the word “unencrypted” as it just adds technical overload. “readable form known as plaintext” is fine by itself. Then overall paragraph is a great, simple explanation.

You can probably drop “we label them as follows” when introducing the workarounds. That they’re categories of workarounds is obvious to reader given they follow a colon and sound general. Then they each have their own section as a section heading. Notice you don’t have a loss of information:

“This section identifies six categories of encryption workarounds. We label them as follows: find the key, guess the key, compel the key, exploit a flaw in the encryption scheme, access plaintext when the device is in use, and locate a plaintext copy.”

“This section identifies six categories of encryption workarounds: find the key, guess the key, compel the key, exploit a flaw in the encryption scheme, access plaintext when the device is in use, and locate a plaintext copy.”

I notice you defined a term, plaintext, for what the government wants. Then you inconsistently use either “unencrypted form” or “plaintext.” It might be better to consistently use one. I’d lean toward plaintext since you already defined it above as “a readable form.” However, it’s probably best to look at the various media and LEO reports that politicians probably saw to figure out which terms are common. Then, use the same terms they do to reinforce whatever understanding they already have. And expand it, of course. I’m not sure what terms they’re currently using the most, though.

“For the purposes of this section, we can treat all passwords, passcodes, and passphrases as keys.”

This is actually a good example of what I’m talking about. The better term for laypeople to abstract over the concepts is the word “secrets.” We could say the schemes rely on secrets that come in many forms. However, the debates keep talking about keys and key escrows. To be consistent with what politicians have already seen, extending the usage of the word key is probably the better option. Expand what’s already in their heads. So, I’d keep saying keys even if secrets is more intuitive.

On p 11, there’s a sentence that might be technical overload but might be a necessary evil. I just know it’s worth reconsidering:

“Keys can themselves be encrypted, such that a second key is needed to decrypt the key needed to decrypt the original messages.”

I’ve seen eyes glaze over seeing several pieces of jargon compounded when they just learned the terms. You could get rid of it entirely and go straight to mentioning the common scenario of password managers. If you want to keep it, maybe reword it with less jargon. Just a quick attempt:

“Keys themselves can be encrypted as an extra layer of security. That means investigators must find more than one key to get the data. That might be done with a different method or the key hidden in a different place. This strategy can also be used to have a master key that encrypts files containing many other keys. The most common example is password managers, which help users keep up with all their usernames and passwords (keys) by encrypting them with one master key. Investigators trying to access a key to one of those accounts would first need the master key that encrypted it.”

That’s longer since it’s a weakness of mine. It might be shortened. The idea is it has more context laypeople understand than jargon we understand.
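If a concrete illustration would help that paragraph, the layering can be sketched with a toy example. XOR stands in for real encryption here purely to show the structure; it is not a secure cipher:

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy 'cipher': XOR each byte of data with the key (illustration only)."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"attack at dawn!!"              # 16 bytes of plaintext
data_key = secrets.token_bytes(16)         # key that encrypts the message
master_key = secrets.token_bytes(16)       # key that encrypts the data key

ciphertext = xor(message, data_key)        # what is stored on disk
wrapped_key = xor(data_key, master_key)    # the encrypted (wrapped) key

# An investigator must recover the master key first, then the data key:
data_key_again = xor(wrapped_key, master_key)
print(xor(ciphertext, data_key_again))     # b'attack at dawn!!'
```

The point for readers: possessing the ciphertext and even the wrapped key gets an investigator nothing without the master key.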

On p12, the paper introduces keylogger and then uses it without an explanation. That might be fine given its usage will be obvious to many people. The more technically illiterate ones might need an explanation such as “That’s a tool that records every keystroke of a suspect as they type.”

On pg 12, there’s a pile of text starting with “And since the password…” that uses lots of commas. It’s the kind of sentence that’s easier to say in person than read. Maybe separate sentences like this:

“Since the password unlocks the encryption key, investigators who guess the password can retrieve the plaintext from the target system.”

On p 13, there’s another explanation at the top that might be shortened or cleaned:

“The most secure systems let users enter passwords as long as they want to increase the difficulty of the attacker. This works by increasing the possible combinations of numbers and letters the user can come up with. However, systems limiting the number of characters reduce the number of possible passwords. A password cracker can do less work when attempting to guess all combinations. Therefore, systems that limit size of passwords make guessing passwords easier for both attackers and investigators.”
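The combinatorics behind that paragraph are easy to show: each additional character multiplies the number of possible passwords by the size of the character set.

```python
# Possible passwords for an alphanumeric character set (a-z, A-Z, 0-9)
charset = 62
for length in (4, 8, 12):
    print(f"{length} chars: {charset**length:,} possible passwords")
```

A system that caps passwords at, say, 8 alphanumeric characters caps the cracker’s search at roughly 2×10^14 guesses; an uncapped passphrase can push that far beyond any feasible search.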

On p 13, the paragraph on “other factors” could use comma elimination & integration:

“Passwords generally need to be remembered by their users, which means they are often memorable numbers or phrases.” to…

“Users often use memorable numbers or phrases to help them remember their passwords.”

The Lopez example is great. A sentence such as the above combined with it is a one-two punch for giving understanding of that concept. I also love the follow-up about dumb password combinations. LEOs want to convince people that encryption is an impenetrable shield for crime. Good to always counter that by showing just how foolishly many people use encryption. By implication, it might actually be easy to deal with for many targets.

On p 14, you explain what a password guesser can do without explaining why. That might be a problem. I suggest reversing it where you describe those same techniques of modifying passwords with numbers, etc as stuff users do to help remember passwords. Then, follow-up with something such as:

“Modern password-guessing tools are designed to use the same tricks described above when guessing users’ passwords.”

I don’t know how to word it. This paragraph has a lot of info. Tricky, haha. You had a great finish where the comma is appropriate to get the mental impact of “few people use them” as the last thing they read.
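A hypothetical sketch of that reversal: describe the users’ habits first, then show that the tools mechanically reproduce them. The mutation rules below are illustrative, not taken from any particular cracker:

```python
def mutations(word):
    """Generate common user-style variants of a dictionary word."""
    leet = str.maketrans("aeio", "4310")      # a->4, e->3, i->1, o->0
    yield word
    yield word.capitalize()                   # "password" -> "Password"
    yield word.translate(leet)                # "password" -> "p4ssw0rd"
    for suffix in ("1", "123", "!", "2017"):  # appended digits/symbols
        yield word + suffix

print(list(mutations("password")))
```

Every “trick” a user applies to make a memorable password look strong is also a rule a guessing tool can enumerate in milliseconds.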

On p 15, you use a term they don’t understand. Change “parallel processing systems” to “incredibly-powerful computers.” Maybe modify that with the word expensive. Keep powerful and computers regardless. Interestingly, the Hollywood bullshit readers have seen might make that more understandable since they always just throw a bigger computer at passwords or hacking in the movies.

On p 15, “When users turn it on, however,” to “If users turn it on,”

On p 16, take out the sentence about “rubber hose cryptanalysis.” It’s irrelevant to the audience since they may never hear it again in their professional lives debating laws on legal compulsion. Better to reinforce the latter phrase with few distractions. The rest of the paragraph is good.

On p 17, otherwise great, maybe comma elimination with “The Fourth Amendment limits on compelling keys are fairly modest due to the limited or nonexistent Fourth Amendment limits on government compelling of testimony and documents.”

On p 18 at the bottom, change “have begun requesting” to “are requesting” for active voice. Likewise, “has issued” to “issued,” since it’s already past tense. We’re getting into my weak area of grammar. However, my English and persuasive writing classes long ago hammered into my brain how active voice is more effective. Shorter in this case, too.

I’m avoiding pushing rewrites of some compound sentences on p 20–21 since the issues themselves are compounded. I’m not sure it can be better. On p 21, I did spot a possible fix from passive to active: “to meet in practice, as evidence…” to “…in practice since evidence”

Same page, “The case law is not clear on which standard is correct.”

Same page, maybe break up last sentence w/ some jargon removal. End at “officer’s lawful order.” Then, something like “The punishment for non-compliance must be greater than expected punishment for access to the plaintext on the device.”

On page 22, half of the large sentence on bottom maybe redundant since it was just stated in the paper before that. Maybe drop it starting after “contempt.”

Wow, this is time consuming! I have to stop to go visit family over a corrupt, local government being assholes about infractions that shouldn’t exist to squeeze money out of them. Seems like some jurisdiction attempts this on one of us at least once a year. Gonna go cheer them up. I’ll review the rest of the document later tonight or this week.

George March 22, 2017 2:00 PM

In the “Basic Principles of Encryption” section where you speak of “dual use,” it seems you might be able to emphasize the use of encryption to keep an individual’s private information hidden from criminals, to avoid things like credit fraud, identity theft, etc. Along with this is the idea that well-implemented strong encryption likely makes us all safer, and that these workarounds are also available to criminals to use against individuals (although, admittedly, compelling a key might be radically different). This concept seems to be missing in government arguments against encryption: any weakness that the government would like to exploit is also available to criminals to use against individuals. This isn’t just a tension between individual privacy and government access for criminal investigation, but has much wider implications.

Jeff March 22, 2017 2:28 PM

I suggest that “It is important to understand that encryption and encryption workarounds are ‘dual use’ technologies.” be changed to “Encryption and encryption workarounds are “dual use” technologies.”

It is important to note that the throwaway phrases “it is important to [note|understand]” are pet peeves of mine. 🙂

John Beattie March 22, 2017 3:17 PM

I have only skimmed the paper very lightly, so the following may well be already in place.

  • There are lots of different pairs of principals, where one is the attacker and one the defender. You are sticking to a specific case, which makes quite a lot of difference to the moral and legal aspects, so I think you should bring out that this is just one case among very many.

  • I’ve always held that securing data was a matter of first deciding on the attacker and the value of the data. Further, that any sufficiently well-resourced attacker would be able to get anything of mine. Specifically, it is pointless to defend against an attacker with indefinitely large powers. I think this is part of the explanation for the probabilistic nature of the workarounds.

John Doe March 22, 2017 6:42 PM

I disagree with this statement from page 23:

When exploits become known, companies and software writers will try to quickly correct the flaw.

It cites bug bounty programs, which many companies do not participate in. A major reason there are so many security problems is that vendors do not try to quickly correct the flaw. Some sue people for notifying them of flaws (not because of how they discovered the flaw, but simply for reporting it privately!). Android exploits are everywhere because most North American telecom companies roll their own flavor of Android, and it takes them YEARS in many cases to patch known flaws. I think most companies behave very unethically with how lax they are in addressing known problems with their software.

mark hutchinson March 22, 2017 7:42 PM

pg 34, last paragraph
“the least resources are required to compel the key”
I think you meant this in economic terms of bang-for-your-buck or path-of-least-resistance. If so, the “are required” should probably read “are used” (or similar).

Dirk Praet March 23, 2017 6:28 AM

Well-balanced paper, albeit with the legal aspects covered only for a US context, as others have already pointed out. Those aspects are covered quite well, but at a bare minimum the paper should contain some references to international law, or the lack thereof.

A seventh technique that in my opinion is prominently missing is exploitation of user ignorance and lack of proficiency with encryption tools as a step-up to retrieving either the key or the plaintext. For better or for worse, the user generally remains the weakest link, whether because he is using the tools the wrong way or because he has been tricked into a false sense of security that certain tools somehow offer full protection. It’s a recurring point of discussion in almost every thread of this forum.

An absolute worst-case scenario for any encryption workaround would be people massively adopting the techniques and methodologies described by @Clive, @Nick P, @Wael, @Thoth, @RobertT et al. It is therefore in the best interest of any government and IC either to trick people into using known-vulnerable tools or to spread a story that nothing is beyond their reach, to the point that you may just as well not use any sort of encryption at all.

Dirk Praet March 23, 2017 7:41 AM

@ ab praeceptis

Do not assume anything. Neither on the legal side nor on the technical one!

Thoroughly inform yourself instead, especially on the legal side. It makes a world of difference knowing in which jurisdictions you can be legally forced to unlock your devices or – in a US context – that you don’t have to consent to any warrantless searches or talk to the police, that police can lie to you as much as they want but can’t hold you forever unless you’re formally arrested, and that you have the right to an attorney.

Martin Russ March 23, 2017 8:56 AM

Page 21, final paragraph: ‘The encryption workarounds discussed so far have all been key base.’ should read: ‘The encryption workarounds discussed so far have all been key based.’

ab praeceptis March 23, 2017 11:02 AM

Dirk Praet

It makes a world of difference knowing in which jurisdictions you can be legally forced to unlock your devices … right to an attorney

Of course, I know the fairy tales, I’ve learned them like everyone else.

What we’ve learned esp. in the last years, however, strongly suggests that they are exactly that, fairy tales.

I stick to what I said: Do not assume anything. Neither on the legal side nor on the technical one!

Nick Timkovich March 23, 2017 11:55 AM

There is a miscitation on page 14 that gets a figure wrong by 2 orders of magnitude:

It would take an iPhone 22 hours to run through the 10,000 possible keys under its default four-digit configuration.

The citation is #55: Micah Lee, Upgrade Your Iphone Passcode To Defeat The Fbi’s Backdoor Strategy, THE INTERCEPT (Feb. 18 2016), https://theintercept.com/2016/02/18/passcodes-that-can-defeat-fbi-ios-backdoor/. From that page:

iPhones intentionally encrypt data in such a way that they must spend about 80 milliseconds doing the math needed to test a passcode, according to Apple. That limits them to testing 12.5 passcode guesses per second, which means that guessing a six-digit passcode would take, at most, just over 22 hours.
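The arithmetic is easy to check against the quoted 12.5 guesses per second:

```python
# At ~80 ms per guess (12.5 guesses/sec), compare 4-digit vs 6-digit passcodes.
rate = 12.5                           # guesses per second (from the article)
four_digit = 10**4 / rate / 3600      # hours to try all 10,000 codes
six_digit = 10**6 / rate / 3600       # hours to try all 1,000,000 codes
print(f"4-digit: {four_digit:.2f} h")   # 0.22 h -- about 13 minutes
print(f"6-digit: {six_digit:.1f} h")    # 22.2 h -- matching The Intercept
```

So the paper’s 22 hours belongs to the six-digit case; the four-digit default is a factor of 100 faster, which is the two-orders-of-magnitude discrepancy.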

Clive Robinson March 23, 2017 12:52 PM

@ ab praeceptis, Dirk Praet,

I stick to what I said: Do not assume anything. Neither on the legal side nor on the technical one!

Actually, I do make assumptions: “If the laws of nature allow, they will do it sooner rather than later” and “Ethics and guard labour never go hand in hand when money, power or influence are up for grabs.”

Cynical and pessimistic perhaps, but at least I get a “warm feeling” when I’m wrong, rather than the cold dread optimists feel when they are wrong.

At least I found something to smile at this week: the US and UK have a partial “laptop ban.” Basically you cannot bring electronics bigger than a mobile phone from sixteen countries, which gives you the excuse not to carry any electronics with you, which should increase the average Joe corporate traveler’s security.

However, the cynic in me does wonder if the idea is to drive users to use their smartphones instead, thus significantly decreasing their security…

John Doe March 23, 2017 7:38 PM

On page 39, this:

Given the position of today’s technology industry today, ….

might be better written as

Given the position of today’s technology industry, ….
or
Given the position of the technology industry today, …

Also, this on page 39:

In effect, it is a meta-strategy designed regulate products directly
to ensure that there can always be a successful encryption workaround.

seems to be missing a “to”:

In effect, it is a meta-strategy designed to regulate products directly
to ensure that there can always be a successful encryption workaround.

Ash P March 24, 2017 11:07 AM

Very interesting article

What about:

For extra security on my fully encrypted laptops, I wipe the metadata blocks that hold the master decryption key. Basically the laptop is sort of bricked, because the master key does not exist. I boot into a live Linux USB and run a script to wipe/restore the blocks. It takes about 40 seconds. Works for both BitLocker (3 blocks) and macOS FileVault (4 blocks).

“Officer, here is my password, but lightning struck my laptop and bricked it.”

Nick P March 24, 2017 6:27 PM

Alright, back for some more. I should’ve noted that my page numbers were from the box at the top of my PDF reader. I just noticed there are actual page numbers that are different, as is often the case in PDFs. The PDF reader is one page ahead of the print: print page 23 = PDF page 24. I’m sticking with the same numbering scheme for consistency.

On p 24, I agree with the other commenter that it’s false that companies and software writers will quickly correct the flaw. Most evidence I’ve seen shows that they don’t, outside some image-conscious companies that want to be seen as secure by actually doing something. Same with the tiny few FOSS projects that put work into security. There have even been cases where problems remain unpatched, described as a feature with inherent security risk or a legacy holdover to work around. The time between knowing an exploit exists and a patch being available varies considerably.

Supporting that same point, you say later “or has not yet been corrected for that particular device.” Most published attacks on companies or individuals don’t depend on 0-days. They depend on attacks that have had published patches for a while. In some cases, those patches existed for around a year; the company or person just never got around to applying them.

On p 25, in section E, maybe modify the sentence for consistent use of word plaintext if you followed a previous suggestion about that. I’m less certain here but might reduce jargon impact if you say, “encrypted data must be decrypted into plaintext to be read by them.” Then, it’s still passive. Second mod is, “they must decrypt the encrypted data into plaintext to read it.” First version said “convert the” but maybe they do need to see these terms encrypt and decrypt again and again until they get used to it. They at least see all four… ciphertext, plaintext, encrypt, and decrypt… in one paragraph comprehensibly again.

On p 25, maybe just say what you’re doing. “This approach is similar to the exploit a flaw approach discussed above except that it exploits a vulnerability in the computer or phone itself instead of the encryption software.” Optionally, maybe mention something to reiterate that they have two sources of vulnerabilities to draw on. Maybe along the lines of even perfect encryption software can be defeated if running on a vulnerable system.

On p 26, the opening paragraph is great. The second might be modified in a few ways. First, it might point out that “disk encryption only protects data on a system that’s turned off. If the user turns it on and enters the key, the key is stored in memory until the machine is turned off. It stays in memory while the machine is on because it’s used to retrieve the user’s files. Taking control of the computer while it’s on allows access to that key or those files.” Part of my motivation for such a modification is that many policymakers might still think that any encryption makes data unreadable. There’s a distinction they should know between the general effect of encryption and what it does for disks.
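That “key stays in memory” behavior can be shown with a toy sketch (my own hypothetical model, Python standard library only; the XOR masking stands in for a real cipher and is in no way secure):

```python
import hashlib

class ToyEncryptedDisk:
    """Toy model of full-disk encryption (hypothetical, insecure)."""

    def __init__(self, passphrase: str, plaintext: bytes):
        key = hashlib.sha256(passphrase.encode()).digest()
        # "Disk" contents are XOR-masked with the derived key.
        self._ciphertext = bytes(
            b ^ key[i % len(key)] for i, b in enumerate(plaintext)
        )
        self._key = None  # powered off: no key in memory

    def unlock(self, passphrase: str) -> None:
        # Entering the passphrase leaves the key resident in memory,
        # because every later read needs it for decryption.
        self._key = hashlib.sha256(passphrase.encode()).digest()

    def read(self) -> bytes:
        if self._key is None:
            raise PermissionError("disk is locked")
        return bytes(
            b ^ self._key[i % len(self._key)]
            for i, b in enumerate(self._ciphertext)
        )

disk = ToyEncryptedDisk("hunter2", b"secret files")
disk.unlock("hunter2")
assert disk.read() == b"secret files"
# While unlocked, anyone controlling the running system can grab
# disk._key (or just call disk.read()); only the powered-off state
# is protected.
```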

On p 28, you say that the NIT is malware. Then you keep saying NIT when describing malware behaviors. I wonder if you were intentionally doing that to keep the impressively neutral tone of the paper. If not, you might want to consider just saying malware or government malware so people know what it is, or exploit, or hack. A new term distances readers a bit from the fact that it’s a straight-up hack.

The rest seemed mostly fine, just redundant at times. That might be good, too. The one thing that was factually incorrect was where the paper said there was essentially no empirical evidence on the effect of encryption. That contradicts our mutual position, expressed in the past, that the FBI rarely encounters encryption but usually works around it. Per their own reports, it’s not a threat at all except in rare cases. So there’s already evidence that could be included as a sentence or two letting readers know that workarounds are currently working for the FBI.

Clive Robinson March 24, 2017 8:14 PM

@ Nick P,

They depend on attacks whose patches have been published for a while, in some cases for around a year; the company or person just never got around to applying them.

In some cases it hasn’t been a case of “never got around to” but a case of “could not” or “could never,” for significant reasons.

One of the reasons I keep my personal machines completely off the internet is that I have to run older OSes that are not just known to be vulnerable, but have not had support or patches for a decade or more. I do this to support people who have no choice but to run such OSes due to device-driver and hardware issues (think industrial control and even phone systems). Migrating these customers often involves designing and producing entirely new hardware using once-common communications standards that are not supported in COTS equipment any more, but that we hope will still be around in another ten or twenty years (think RS232, RS434, IEEE, CanBus and 10M ethernet).

For instance, back in the 1980s I designed equipment for the petro-chem industry, especially offshore remote unmanned systems. Those platforms are still there and still in service, but getting replacement parts is now bordering on impossible. However, designing replacements is for many reasons fraught, if not impossible, as well; yet those platforms will probably still be in production after I’ve kicked off this mortal coil, so it’s a catch-22 situation.

There is without doubt a “generational issue”: industrial systems have a twenty-five to fifty year life span, while COTS PC hardware generations are measured in months, sometimes in as few as can easily be counted on two hands. Thus the “dog years” problem of forty to sixty COTS PC generations to just one industrial generation, with maybe a five-year OS generation. None of which is helped by the demented IntProp behaviour of the entertainment industry with DRM 2.5 or whatever you want to call it, which results in hardware you don’t own even though you’ve paid for it… Something those “Smart Infrastructure” political drives have not even considered, let alone addressed…

Nick P March 24, 2017 8:51 PM

@ Clive Robinson

I’ll add to your post these two:

  1. Can’t for legacy reasons. The system is too old to understand, extend, or whatever, and it’s also critical. Its vulnerabilities, in the main code or the encryption, will likely remain for some time. This happens a lot with companies that started with mainframes, Windows Server, or C apps on proprietary UNIXen.

  2. Can’t for regulatory reasons. This is where regulators mandate something insecure, or the company doesn’t want to recertify to ship a fix. These users will remain vulnerable, too. This happens a lot with medical equipment.

Wael March 24, 2017 8:53 PM

In this paper, we will call such efforts “encryption workarounds.”

The problem is not the encryption; it’s the decryption. “Decryption workarounds” would be more descriptive: they are trying to decrypt the data with a workaround as opposed to using the proper decryption algorithm. That could be another way of looking at it.

But after reading this

The first three are strategies to obtain an existing key to unlock encrypted data. The latter three are ways of accessing the data in plaintext form without obtaining the key.

It might be called “Cryptography workarounds” or “Defeating Encryption.” Not important, I guess… just semantics.

Anyone, criminals and law enforcement alike, can employ any of these methods to access encrypted data.

It would seem that “access” implicitly means obtaining the encrypted data, followed by the ability to decrypt it. Does the meaning of “access” depend on which of the six strategies above is used?

FBI Director James Comey has expressed fears that the government is “going dark” because encryption blocks access to communications.

He should have said: “The government is going blind because the criminals are going dark.”

Modern encryption algorithms use the same principle but rely on complex mathematics.

Perhaps one could say something about confusion and diffusion, which are implemented using substitution (your example) and transposition.
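If it helps, here is a toy illustration of the two classical operations (purely pedagogical choices of mine, not a secure cipher): substitution supplies confusion, transposition supplies diffusion.

```python
def substitute(text: str, shift: int = 3) -> str:
    """Substitution (confusion): replace each lowercase letter with
    the letter `shift` places later in the alphabet."""
    return "".join(
        chr((ord(c) - ord("a") + shift) % 26 + ord("a")) if c.islower() else c
        for c in text
    )

def transpose(text: str) -> str:
    """Transposition (diffusion): reorder characters, here by swapping
    adjacent pairs."""
    chars = list(text)
    for i in range(0, len(chars) - 1, 2):
        chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)

assert substitute("attack") == "dwwdfn"  # letters replaced, positions kept
assert transpose("attack") == "taatkc"   # letters kept, positions changed
```

Real ciphers interleave many rounds of both operations, which is where the strength comes from.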

Fortunately, this is easy. A 128-bit key has 2^128

There’s an implicit distinction between key length and key space; the relationship between the two is only implied. Depending on the audience you’re targeting, it may be something to expand upon.
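For instance, the relationship is one line of arithmetic: a key length of n bits gives a key space of 2^n distinct keys. A sketch (the trillion-guesses-per-second rate is just an assumed figure for illustration):

```python
def key_space(bits: int) -> int:
    """Key length (bits) -> key space (number of distinct keys)."""
    return 2 ** bits

# A 128-bit key length means a key space of 2**128 keys.
assert key_space(128) == 340282366920938463463374607431768211456

# Assumed rate: 10**12 guesses/second. Exhaustive search finds the
# key after trying half the key space on average.
seconds = key_space(128) // 2 // 10**12
years = seconds // (365 * 24 * 3600)
print(f"expected brute-force time: ~{years:.1e} years")
```

Each extra bit of key length doubles the key space, which is why the jump from 56-bit to 128-bit keys matters so much.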

That’s all the time I could spare today for this paper. Probably all insignificant comments…

vas pup March 25, 2017 10:39 AM

Read 3/4 of the article – looks good – will recommend it to all interested parties as a “need to read.”
Related to ‘compel the key’ section.
My opinion is that in a civilized country (one that positions itself as a rule-of-law country), the burden of proof at every step of the criminal investigation/process is on LEAs and prosecutors, not on the suspect or defendant. The judge is not part of evidence collection, but by issuing particular court orders could help them collect evidence while checking that they follow the constitutional protections guaranteed to citizens.
Refusal of a person to cooperate in the collection of any possibly incriminating evidence (on a broad understanding of the 5th) can NOT be punished by any type of penalty (monetary, imprisonment, etc.). Otherwise we are back in the times of Middle Ages Inquisition criminal procedure. The difference in evidence extraction is then only one of type on a spectrum (torture versus contempt-of-court jail time), not a difference in paradigm, whatever POTUS is saying on that.
Let’s say somebody possesses a password but is not aware of it. You know those IC research programs on hypnotizing people: “insert” a password into their mind without their awareness, to be provided/disclosed only when a trigger sensory input is given. So a person has an electronic device, and uses it, with encrypted information on it which he can’t decrypt because he is not aware of the password he possesses. That is more of an IC scenario, but I guess still relevant to the subject to consider.

John Galt March 26, 2017 3:23 PM

Why don’t you just outlaw encryption like Communist China and assign a death penalty for its use?

Seems to me that politicians’ use of cryptography represents a danger to the survival of the world. Certain politicians will be happy to launch the entire US stockpile at anyone who knows how to use the “ping” command.

Government decryption of flirtatious comments with your wife while you are at work is nothing short of criminal voyeurism. Multiply that times 400,000,000.

Why don’t we just outlaw locks on front doors and curtains over bedroom windows —
and post politico/military sentries INSIDE every abode and hotel room? Reinstitute Prima Nocta, too?

Seems to be the “fad” these days. The stuff we hear on the ‘news’ is nothing more than street theater conducted by psychopaths in office.

I mean, c’mon(!), do you realize that a pattern has shown itself: governments are crippled by sharp instruments these days?

Butter knives and nail clippers are next on the list of contraband, people. In fact, I was even harassed by the TSA for having nail clippers on my keyring!

Michael Corleone (Al Pacino), in the movie Godfather III, said: “Politics and crime – it’s the same thing.” He also said, “All my life I was trying to get up in society…where everything is legal, but the higher I go the more crooked it becomes.”

The motive behind government access is not to protect the public from criminals. The real motive is to surveil targets of opportunity — whether that means a political adversary or commercial espionage to figure out where to place your next big bet on Wall Street.

Mandatory decryption doesn’t stop the car thief who just stole your car or the military sentry who just finished raping your wife and kids.

My Info March 26, 2017 4:34 PM

@John Galt

Why don’t we just outlaw locks on front doors and curtains over bedroom windows —
and post politico/military sentries INSIDE every abode and hotel room? Reinstitute Prima Nocta, too?

That’s a valid analogy. How many hotel rooms, etc. within 100 miles of Washington, D.C. are not under surveillance by multiple foreign and domestic powers?

Butter knives and nail clippers are next on the list of contraband, people. In fact, I was even harassed by the TSA for having nail clippers on my keyring!

So was I. And the stupid part is that there were shops within the secured area selling nail clippers identical to the ones they confiscated at the security checkpoint.

Mandatory decryption doesn’t stop the car thief who just stole your car or the military sentry who just finished raping your wife and kids.

They don’t understand. Encryption only hides one thing: speech, data, talk. And as we all know, talk is cheap. Encryption can’t hide illegal drugs, it can’t hide children, it can’t hide guns or knives.

Their dicks are bulging in their pants when there are no restraints on their ability to invade the privacy of ordinary folks.

… and if he delivered righteous Lot, deeply distressed by the conduct of those unrestrained men in their dissolute living, for that righteous man, dwelling among them, tormented his righteous soul day after day over what he saw and heard of their lawless deeds; the Lord knows how to deliver the godly out of trial, and to reserve the unjust to be punished on the day of judgment, …

John Galt March 26, 2017 5:50 PM

@My Info…

What do Adolf Hitler, Chairman Mao, Pol Pot, Joseph Stalin, the Queen of England, and the Ruling Committee of the United States Govt all have in common?

FACEBOOK and GOOGLE and the NSA. Dream Come True. Today, Anne Frank wouldn’t survive 24 hours.

[[[ Their dicks are bulging in their pants when there are no restraints on their ability to invade the privacy of ordinary folks. ]]]

WATCH THE FINAL BEDROOM SCENE…after the public executions.

https://www.google.com/search?q=the+forbin+project+youtube&oq=the+forbin+project+youtube&aqs=chrome..69i57.5452j0j7&sourceid=chrome&ie=UTF-8

[[[ So was I. And the stupid part is that there were shops within the secured area selling nail clippers identical to the ones they confiscated at the security checkpoint. ]]]

The TSA was stealing those clippers from you… so they could sell them back to the store… to sell to the next victim… so they could do it again. Meanwhile, the TSA uses the nail clippers as probable cause to keep the money in your wallet.

This is evidence of a piracy and racketeering system.

Fly the friendly skies, right?

They don’t understand. Encryption only hides one thing: speech, data, talk. And as we all know, talk is cheap. Encryption can’t hide illegal drugs, it can’t hide children, it can’t hide guns or knives.

YES THEY DO UNDERSTAND!

All criminals know how to be criminals. They change the subject from the stolen money and racketeering — to “Why do you have nail clippers? Are you a terrorist?”

Even Gilligan (Gilligan’s Island) is more intelligent than the psychopaths who think they are “in charge.”

https://www.youtube.com/watch?v=yZarLN94VZg

Enjoy.

John Galt March 26, 2017 7:51 PM

@ Wael

[[[ The problem is not the Encryption. It’s the “Decryption”. “Decryption workarounds” is more descriptive. They are trying to decrypt the data with a workaround as opposed to using a proper decryption algorithm. It could be another way of looking at it. ]]]

Wanna know how to beat the computer spygrid without encryption?

Igpay AtinLay. Just intermittently alter the last letter (y) or (ay) with random characters.

Works every time.

If Google/NSA actually figures out how to reprogram for pig latin, just add a number/symbol or two between syllables.
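For what it’s worth, the scheme as described is a few lines of code; the name `igpay` and the 50% mangling rate are my own hypothetical choices:

```python
import random
import string

def igpay(word: str) -> str:
    """Pig Latin with the trailing "ay" intermittently replaced by
    random characters, as described above."""
    latin = word[1:] + word[0] + "ay"
    if random.random() < 0.5:  # intermittently mangle the suffix
        latin = (latin[:-2]
                 + random.choice(string.ascii_lowercase)
                 + random.choice(string.digits))
    return latin

random.seed(0)          # deterministic for the example
print(igpay("secret"))  # "ecretsay" with this seed (suffix kept)
```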

After a few years, when they finally figure all that out, I’ll be reading/speaking/writing ENCRYPTED ANAGRAMIC Navaho, Aramaic, and Anasazi.

Maybe someone wants to give me a large grant of federal dole money to write my own report on how to encrypt/decrypt without using mathematics and block ciphers.

Maybe outlaw Pig Latin, Navaho, Anasazi, Aramaic, and Louisiana Bayou Gibberish, too? Let’s just outlaw every dictionary in the world. Remove the brains of every child born.

Wael March 26, 2017 9:53 PM

@John Galt,

Remove the brains of every child born.

I’m afraid they did this to me. They were nice, though… they left me three neurons, both of which don’t work (both three.)

John Galt March 28, 2017 1:14 PM

@vas pup…

[[[ You can’t change human nature, unfortunately:
https://www.sciencedaily.com/releases/2017/03/170314081558.htm
A replication of one of the most widely known obedience studies, the Stanley Milgram experiment, shows that even today, people are still willing to harm others in pursuit of obeying authority.
The banality of the evil is always around…]]]

If you’ll note my other posts… you’ll see the word “psychopath.”

The Milgram Experiment is proof of my point. The world is ruled by psychopaths… and their preferred professions are: law enforcement, lawyers, professional politicians/politics, school teachers, … etc.

Statistically, they represent 2% of the total population — and, they are typically found anywhere they can receive an “unfair advantage” over their fellow man.

The Milgram Experiment… I was beginning to think people had forgotten about that one. One of the hallmark characteristics of a psychopath is the “enjoyment” they experience when torturing your cat, dog, or you. Pavlov was one of them, too. These are also the same people who actually like the hands-on experience of “waterboarding” (aka Chinese Water Torture).

Psychopaths are in charge of the NSA and Google, too. Unfair advantage… knowing your thoughts and minds.

When the world finally realizes that the real social problem is psychopaths in high places… and that virtually all career politicians and policemen (under any label) are actually psychopaths… the world will be a much safer and more peaceful place.

You can also identify psychopaths by the products they create or “sell.” How?
PAPER PUSHERS. (If they offer to sell you a piece of paper under any “title” …)

The world we know is a world created by psychopaths. And, “they” know it, too.
The Milgram Experiment was one of those mind-control MK Ultra/Monarch slave experiments.

Hannibal (the Cannibal) Lecter kind of stuff.
