Attorney General Barr and Encryption

Last month, Attorney General William Barr gave a major speech on encryption policy—what is commonly known as “going dark.” Speaking at Fordham University in New York, he admitted that adding backdoors decreases security but argued that it is worth it.

Some hold this view dogmatically, claiming that it is technologically impossible to provide lawful access without weakening security against unlawful access. But, in the world of cybersecurity, we do not deal in absolute guarantees but in relative risks. All systems fall short of optimality and have some residual risk of vulnerability—a point which the tech community acknowledges when they propose that law enforcement can satisfy its requirements by exploiting vulnerabilities in their products. The real question is whether the residual risk of vulnerability resulting from incorporating a lawful access mechanism is materially greater than those already in the unmodified product. The Department does not believe this can be demonstrated.

Moreover, even if there was, in theory, a slight risk differential, its significance should not be judged solely by the extent to which it falls short of theoretical optimality. Particularly with respect to encryption marketed to consumers, the significance of the risk should be assessed based on its practical effect on consumer cybersecurity, as well as its relation to the net risks that offering the product poses for society. After all, we are not talking about protecting the Nation’s nuclear launch codes. Nor are we necessarily talking about the customized encryption used by large business enterprises to protect their operations. We are talking about consumer products and services such as messaging, smart phones, e-mail, and voice and data applications. If one already has an effective level of security—say, by way of illustration, one that protects against 99 percent of foreseeable threats—is it reasonable to incur massive further costs to move slightly closer to optimality and attain a 99.5 percent level of protection? A company would not make that expenditure; nor should society. Here, some argue that, to achieve at best a slight incremental improvement in security, it is worth imposing a massive cost on society in the form of degraded safety. This is untenable. If the choice is between a world where we can achieve a 99 percent assurance against cyber threats to consumers, while still providing law enforcement 80 percent of the access it might seek; or a world, on the other hand, where we have boosted our cybersecurity to 99.5 percent but at a cost reducing law enforcements [sic] access to zero percent—the choice for society is clear.

I think this is a major change in government position. Previously, the FBI, the Justice Department and so on had claimed that backdoors for law enforcement could be added without any loss of security. They maintained that technologists just need to figure out how—an approach we have derisively named “nerd harder.”

With this change, we can finally have a sensible policy conversation. Yes, adding a backdoor increases our collective security because it allows law enforcement to eavesdrop on the bad guys. But adding that backdoor also decreases our collective security because the bad guys can eavesdrop on everyone. This is exactly the policy debate we should be having—not the fake one about whether or not we can have both security and surveillance.

Barr makes the point that this is about “consumer cybersecurity” and not “nuclear launch codes.” This is true, but it ignores the huge amount of national security-related communications between those two poles. The same consumer communications and computing devices are used by our legislators, CEOs, law enforcement officers, nuclear power plant operators, election officials and so on. There’s no longer a difference between consumer tech and government tech—it’s all the same tech.

Barr also says:

Further, the burden is not as onerous as some make it out to be. I served for many years as the general counsel of a large telecommunications concern. During my tenure, we dealt with these issues and lived through the passage and implementation of CALEA—the Communications Assistance for Law Enforcement Act. CALEA imposes a statutory duty on telecommunications carriers to maintain the capability to provide lawful access to communications over their facilities. Companies bear the cost of compliance but have some flexibility in how they achieve it, and the system has by and large worked. I therefore reserve a heavy dose of skepticism for those who claim that maintaining a mechanism for lawful access would impose an unreasonable burden on tech firms—especially the big ones. It is absurd to think that we would preserve lawful access by mandating that physical telecommunications facilities be accessible to law enforcement for the purpose of obtaining content, while allowing tech providers to block law enforcement from obtaining that very content.

That telecommunications company was GTE—which became Verizon. Barr conveniently ignores that CALEA-enabled phone switches were used to spy on government officials in Greece in 2004 and 2005—which seems to have been a National Security Agency operation—and on a variety of people in Italy in 2006. Moreover, in 2012 every CALEA-enabled switch sold to the Defense Department had security vulnerabilities. (I wrote about all this, and more, in 2013.)

The final thing I noticed about the speech is that it is not about iPhones and data at rest. It is about communications—data in transit. The “going dark” debate has bounced back and forth between those two aspects for decades. It seems to be bouncing once again.

I hope that Barr’s latest speech signals that we can finally move on from the fake security vs. privacy debate, and to the real security vs. security debate. I know where I stand on that: As computers continue to permeate every aspect of our lives, society, and critical infrastructure, it is much more important to ensure that they are secure from everybody—even at the cost of law enforcement access—than it is to allow access at the cost of security. Barr is wrong; it kind of is like these systems are protecting nuclear launch codes.

This essay previously appeared on Lawfare.com.

Posted on August 14, 2019 at 6:18 AM

Comments

Alex August 14, 2019 7:03 AM

One could indeed argue that Khürt Williams above is right from an American point of view. (Stop and Frisk doesn’t require probable cause in most of Europe). And that should stop Barr dead in his tracks.

Or we can end the debate by telling Barr he can’t actually ban math. David Cameron wanted police to be able to request private keys on any encryption above a certain key size (can’t remember the exact number, but he thereby confirmed GCHQ’s computing power) for this same reason. He said it would be either that or banning encryption completely. You can’t ban math.

Luke August 14, 2019 7:33 AM

You can absolutely ban math, Alex.

Governments obviously don’t have the power to wish away technology but they do have the power to criminalize the possession and/or use of it.

At that point there is no need for law enforcement to break encryption. They just have to identify you as the person who sent the ciphertext.

Alejandro August 14, 2019 7:48 AM

I sense all the recent noise about a mandatory backdoor signals a big change coming. It seems like a multinational concerted and organized attack, to me. All they need right now is the right event to provide cover to change the law(s). It seems the technology is ready.

Frankly, I suspect a lot of stuff (devices/apps) is back-doored already, but the legal mechanisms to use it to full effect aren’t in place. As evidence, all the recent stories about FB, Google, MS, etc. literally listening and recording audio.

Bruce focuses on government access, but personally, I think corporate access is worse for the common man on a day-to-day basis. The data corporations collect allows them to act as a non-representative form of government, and we have no control over it at all. For example, censorship and promotion of news events; controlling which products are promoted for sale while others are pushed to the background; all manner of political speech and activity is managed by the corporations now.

Where does it stop?

What about my security?

Mike Acker August 14, 2019 8:08 AM

Barr misses the key point: it is government surveillance that needs to be defeated: read the 4th Amendment.

Barr Jr August 14, 2019 8:09 AM

I think that my freedom is more important than my security.

I prefer to live in a free but insecure world than in a perfectly safe but not free world.

Clive Robinson August 14, 2019 8:12 AM

@ Luke, Alex,

They just have to identify you as the person who sent the ciphertext.

Which is where maths can help you again.

There are ways to use plain text to hide a few bits of secret data. For instance,

    We should meet up for a coffee?

Has 1 bit for should/could, which could be expanded to 2 or more bits with “We should” / “We could” / “Shall we” / “Can we”, etc.

Likewise the beverage could be tea / coffee / beer / drink / sandwich / etc., or you could expand to “a coffee” / “lunch” / “dinner” / etc.

Whilst only being a few bits the resulting number can be used as an index into a code book where each number relates to a specific action.

The point is that even if the authorities suspect it is a code, encrypting the numbers with the equivalent of a one-time pad means that the authorities have no way to link the code to any actions the recipient might take, especially if one of those actions is a “NULL” or “No action to be taken”.

Therefore the message can be sent in plain text, such as an SMS.

The simple fact is the authorities are handicapped by not being able to ban all communication, and even watching all communications does not give them anything even remotely close to proof. Thus if both parties maintain good OpSec and don’t talk, the authorities end up with no court-usable evidence from the communications and, importantly, little or nothing for a prosecutor to “spin up” in front of a jury, unlike the random string of an encrypted message, which is also easy for machines to spot.
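
For the curious, here is a rough Python sketch of the general idea: a few bits hidden in word choices, masked with a pre-shared pad, and mapped through a codebook. The word lists, the mask and the tiny codebook are all invented for illustration; this shows only the shape of the trick, not anyone's actual scheme.

    # Illustrative only: hide a 4-bit codebook index in word choices,
    # after XOR-masking it with a pre-shared one-time value.
    OPENERS = ["We should", "We could", "Shall we", "Can we"]   # 2 bits
    CLOSERS = ["a coffee", "tea", "lunch", "dinner"]            # 2 bits

    OTP_MASK = 0b1011   # agreed offline, used once, then discarded
    CODEBOOK = {3: "proceed", 8: "abort", 12: "new location"}   # everything else: no action

    def encode(action_index: int) -> str:
        """Turn a 4-bit codebook index into an innocuous sentence."""
        masked = action_index ^ OTP_MASK          # hide the index with the pad
        opener = OPENERS[(masked >> 2) & 0b11]    # top two bits pick the opener
        closer = CLOSERS[masked & 0b11]           # bottom two bits pick the rest
        return f"{opener} meet up for {closer}?"

    def decode(sentence: str) -> str:
        """Recover the action using the shared word lists, mask and codebook."""
        opener = next(i for i, o in enumerate(OPENERS) if sentence.startswith(o))
        closer = next(i for i, c in enumerate(CLOSERS) if c in sentence)
        return CODEBOOK.get(((opener << 2) | closer) ^ OTP_MASK, "no action")

    msg = encode(3)      # index 3 = "proceed"
    print(msg)           # "Shall we meet up for a coffee?"
    print(decode(msg))   # "proceed"

Because the index is masked before it is spread across the word choices, an observer who even suspects the word lists learns nothing about which codebook entry was selected, which is the point Clive makes about the pad.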

Tatütata August 14, 2019 8:15 AM

That telecommunications company was GTE — which became Verizon.

The wording is slightly unfortunate.

Verizon was a BOC (Bell Operating Company) called Bell Atlantic. That baby Bell later bought GTE, the leading independent (i.e., non-Bell System) telephone conglomerate.

From the GTE perspective, allowing “The Man” to snoop in your equipment wouldn’t hurt the bottom line, and would create a potential for a quid pro quo.

But from the customer’s perspective…

Alejandro August 14, 2019 8:16 AM

So, the argument might become security v. freedom, rather than security v. security.

Also, the security argument is camouflage for the fact that governments and corporations are using data access as a way to dominate and control the population, regardless of any legitimate security concerns.

In short it’s about power.

JacklynR96 August 14, 2019 8:35 AM

I am not at all interested in the argument surrounding the greater good or the impact of lawful access on the overall security offered by encryption.

I am the sole arbiter of who I choose to talk to and what information I choose to convey to that person. If I want law enforcement or the state intelligence community to be able to listen in on my conversations I would invite them to do so. If I wish the Govt. to be able to examine the personal files kept on my phone or computer I will give them the password when they ask for it.

The Govt has no business in the bedrooms of the country and they have no business surveilling me as a matter of course. I do not trust them not to when they shouldn’t, and I do not trust software providers to resist the considerable pressure that the Govt brings to bear when they request backdoor access to software products that are specifically designed to exclude others from doing this very thing.

If someone contends that I am engaging in unlawful activity, the burden of proof is on them and I am not compelled to aid in collecting that proof to my own detriment. I have every reasonable expectation that data I deem to be private, personal or potentially embarrassing is mine to safeguard with all due diligence, and those efforts should not be hamstrung from the outset by a surveillance monster foreign or domestic. Absolute power corrupts absolutely – how many times do we need to be slapped across the face before we accept that the interests of Govt (often acting in our name) are often contradictory to what really is in our own best interest? We have forgotten how to rein in Government excesses and we have abdicated our democratic responsibility to do so.

Every wayward child learns early on that it is easier to ask for forgiveness than to seek permission. Governments are this wayward child. A child that has also learned to lie effectively by the same early age.

Humdee August 14, 2019 9:52 AM

@Bruce

“This essay previously appeared on Lawfare.com.”

It also previously appeared on this blog.

Drew Cooper August 14, 2019 9:52 AM

“Some hold this view dogmatically, claiming that it is technologically impossible to provide lawful access without weakening security against unlawful access.”

I don’t think that’s how sensible policy discussions begin. I admire your hopefulness, though.

Sancho_P August 14, 2019 9:55 AM

“security vs. security”

There is no debate because they don’t want a debate.

And we don’t need a debate, because the world is round and not in the USA.
They have the metadata, that’s more than enough.

But:
Susan Landau hints at “increasing [the] capabilities of law enforcement” and “vastly increased funding”.
While this is noble, it’s the wrong end to start with.

Solving this deficit will increase the real problem: We have too many “criminals”.

We have a bunch of suspects on the watch, for years.
We have a bunch of even dangerous people on (secret) lists.
We have a bunch of people going in and out of jail their whole life.
Last but not least, our correctional facilities are full (that’s a business in the US).

While this has a lot to do with bribery, plea deals and corruption (informants), the problem is not LE but our justice system.
We as a society and those who should form it are quick to deny and to change the discussion to big business issues (“never ending growth”).

@WaPo: Democracy Dies in Darkness, thanks.

Alejandro August 14, 2019 10:15 AM

What do they want?

From Barr’s speech:

“Our colleagues from GCHQ have proposed “Virtual Alligator Clips” which allow a provider to respond to a warrant by adding a silent law enforcement recipient to an otherwise secure chat. Ray Ozzie has tabled a proposal for “Exceptional Access Keys” for locked, encrypted phones so they can be unlocked pursuant to a warrant. Matt Tait has proposed Layered Cryptographic Envelopes to allow lawful access to encrypted data-at-rest on disks or other storage devices…”

Tait’s vision is especially concerning. From what I can tell it would involve placing a plaintext copy of ALL encrypted internet data on an encrypted corporate storage drive, which then can be decrypted by the corp opening the drive with its private key, and the data decrypted with the police private key (would every cop in the world have a key?).

What a gd mess. Who would NOT have a key?

I have a bad feeling about this.
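
For readers wondering what a “layered envelope” might look like mechanically, here is a rough Python sketch (assuming the PyNaCl library) of the general nested key-wrapping idea: the data key is wrapped under one party’s public key, and the result is wrapped again under a second party’s, so recovery needs both private keys. The party names are invented, and this is only an illustration of the concept, not Tait’s actual proposal.

    # Illustrative nested key wrapping ("layered envelopes"), using PyNaCl.
    # The two key holders (a provider and an agency) are hypothetical.
    from nacl.public import PrivateKey, SealedBox
    from nacl.secret import SecretBox
    from nacl.utils import random as nacl_random

    provider_sk = PrivateKey.generate()   # provider's escrow keypair
    agency_sk = PrivateKey.generate()     # law-enforcement escrow keypair

    # Encrypt the actual data under a fresh symmetric key.
    data_key = nacl_random(SecretBox.KEY_SIZE)
    ciphertext = SecretBox(data_key).encrypt(b"message under escrow")

    # Inner layer: wrap the data key for the agency.
    # Outer layer: wrap that blob for the provider.
    inner = SealedBox(agency_sk.public_key).encrypt(data_key)
    outer = SealedBox(provider_sk.public_key).encrypt(inner)

    # Recovery needs BOTH private keys, peeled in order.
    peeled = SealedBox(provider_sk).decrypt(outer)
    recovered_key = SealedBox(agency_sk).decrypt(peeled)
    assert SecretBox(recovered_key).decrypt(ciphertext) == b"message under escrow"

Even in this toy form, Alejandro’s question is visible: each party that is supposed to have access needs its own long-lived escrow key, and deciding who holds one, and in which country, is the whole problem.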

J. Toman August 14, 2019 10:16 AM

From the article: ‘Barr makes the point that this is about “consumer cybersecurity” and not “nuclear launch codes.”‘

That’s a very dangerous statement. In effect it’s saying that there are two classes: the royalty in government and us peons. I would argue that prioritizing “consumer cybersecurity” is actually more in keeping with American principles of government. If any sort of secure communication should have a backdoor in it, it should be government communication in order to maintain transparency.

Jim A August 14, 2019 11:25 AM

“Frankly, I suspect a lot of stuff (devices/apps) is back-doored already, but the legal mechanisms to use it to full effect aren’t in place.”

I suspect, though, that only the big boys (NSA, FBI, counterintelligence, etc.) have access to those back doors. Legally mandating backdoors inevitably means that those will filter down to local police and probably be used in civil court cases. Which means that the security of those backdoors will be greatly diminished.

Dave Crocker August 14, 2019 11:47 AM

Agreeing with everything you say… except the implicit minimization of concern for consumer protection of what really is “just” consumer communication. Enabling more and worse bad actor intrusions on average consumers — and likely at scale — could arguably be viewed as its own national security threat. /d

Humdee August 14, 2019 12:41 PM

@J. Toman

Yes, this is correct. The leaders are responsible to the people, the people are not responsible to the leaders. Transparency for all is OK. Secrecy for all is OK. But what is not ok is secrecy for the elites and openness for the peons. That is one of the surest and quickest routes to totalitarianism I can think of.

Ollie Jones August 14, 2019 1:58 PM

Implementing crypto backdoors probably implies that some entity will have to hold a cache of secrets: the backdoor keys.

Recent history has taught us that all secrets eventually leak. No entity can build perimeter security strong enough to prevent such leaks. Not even state actors.

So, we need defense in depth for these secrets.

  • The perimeter security needs to be strong, avoiding foolishness like security by obscurity.
  • The secrets should be spread into separate caches, each with its own perimeter defense. Each cache needs to be as small as possible, so any given perimeter breach does not leak all the backdoor keys.
  • Each secret needs to have limited utility: For example, it should unlock a tiny subset of crypto.
  • Each secret needs to have limited useful lifetime. These keys should regularly expire, unlike personal biometric information for example.
  • A canary scheme is needed, to alert the public and the custodians of the secrets to leaks.

Without at least some of these safeguards, leaks of backdoor keys could prove catastrophic.

Do we trust state actors to build and manage these safeguards?

Do we trust private entities to build them and manage them? Can private entities be fairly paid for this work without creating perverse incentives?
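
As a concrete illustration of two of the bullets above (limited utility and limited lifetime), here is a minimal standard-library Python sketch in which each escrow key is derived from a cache’s master secret, bound to one device and one calendar month. The scheme and names are invented for illustration, not taken from any real proposal.

    # Illustrative only: scope each derived escrow key to one device and one
    # month, so a leaked derived key does not unlock everything forever.
    import hmac
    import hashlib
    from datetime import datetime, timezone

    def escrow_key(cache_master: bytes, device_id: str, when: datetime) -> bytes:
        """Derive a per-device, per-month key from one cache's master secret."""
        scope = f"{device_id}|{when:%Y-%m}".encode()
        return hmac.new(cache_master, scope, hashlib.sha256).digest()

    # One small cache's master secret (in practice it would live in an HSM).
    master = bytes.fromhex("00" * 32)
    jan = escrow_key(master, "device-1234", datetime(2019, 1, 15, tzinfo=timezone.utc))
    feb = escrow_key(master, "device-1234", datetime(2019, 2, 15, tzinfo=timezone.utc))
    assert jan != feb   # keys rotate with the time window

Of course, the derivation is the easy part; the questions above about who holds the master secrets, and whether they can be trusted to run the canary and breach-detection machinery, are where any such scheme stands or falls.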

Bruce H August 14, 2019 2:05 PM

While true that encryption backdoors would allow law enforcement to break the encryption “bad guys” use, this elides the fact that it would also break the encryption that “good guys” use. I’m far more concerned about that. We all know that police departments and other government agencies routinely violate the rights of ordinary people if it is seen to benefit them.

nobody August 14, 2019 3:00 PM

Unfortunately elided from conversations about mandatory surveillance is that the “bad guys” people like Barr want to spy on are far, far more likely to be ordinary people civilly protesting the construction of oil pipelines, and members of minority ethnoreligious groups, rather than the kind of groups who incite mass shootings against disfavored demographics, much less the kinds of people who collaborate with foreign enemies to rig elections.

Discussions over the (lack of merits) of mandatory surveillance cannot be value neutral with respect to who mandatory surveillance powers will be used to target.

Anon Y. Mouse August 14, 2019 3:26 PM

As before, my response is “you first, Mr. Barr.”

Let the government adopt encryption systems with backdoors for a year and see how it goes. They should eat their own dog food for a while, as the saying goes, and demonstrate that we have nothing to worry about.

Drive-By Idealogue August 14, 2019 3:29 PM

With this change, we can finally have a sensible policy conversation. Yes, adding a backdoor increases our collective security because it allows law enforcement to eavesdrop on the bad guys. But adding that backdoor also decreases our collective security because the bad guys can eavesdrop on everyone. This is exactly the policy debate we should be having — not the fake one about whether or not we can have both security and surveillance.

I hate to call out hypocrisy here but… “adding that backdoor also decreases our collective security because the bad guys can eavesdrop on everyone” parses logically to me as still being “fake”. In order to not be “fake” you need to add a few important words, i.e. “adding that backdoor also decreases our collective security because some of the bad guys can eavesdrop on those who utilize the products with the backdoor”. Those added words are inconvenient perhaps as far as getting the important message to the masses. But that’s the tradeoff if you really want to have a “non fake” policy discussion and instead a sensible one. And of course that “some of” needs to get added to the prior sentence as well.

Seriously, framing it as “because the bad guys can eavesdrop on everyone”, is 100% hypocritical in my opinion. Just going ‘fake’ in the opposite direction.

This trump/fake-news phenomenon is worth keeping in mind. He scores points when he points to headlines and short summaries which omit the important qualifiers just as Mr. Schneier did here. And just because he’s evil doesn’t change the nature of the bigger picture problem.

Clive Robinson August 14, 2019 4:23 PM

@ Ollie Jones,

So, we need defense in depth for these secrets.

Logically the best defence is “not having anything to defend”.

If there are no “backdoors” they need no secrets, therefore there is nothing to be taken or copied that will hurt or harm anyone. Thus the conversation should stop at that point.

However, continuing to talk about “defense in depth for these secrets” is a tacit admission that there is,

1, A need for backdoors,
2, That is actually valid.

And the truthful answers are “no and no”. No ifs, buts or maybes.

As you might have realised, all justifications for backdoors are complete and utter nonsense based on stirred-up emotion, and have less credibility than “think of the children”.

When pushed, proponents come up with “corner cases” that in reality don’t even make it into the deranged “Security Theater” imaginings of a con artist trying to sell “Magic Pixie Dust Detectors” to a CIA-funded Venture Capitalist.

Perhaps the US citizens should start a “War on Tyranny” with the same motto as the war on drugs,

    Just Say No

To all things that take away rights from citizens or give power to the Executive downwards into the “Fat blue line” that is the “Guard Labour”.

As for the corporations that are manipulating the legislators because they think there is money to be made. Perhaps they should review history and be mindful of what often happens to known collaborators and profiteers…

UDU August 14, 2019 5:52 PM

You don’t understand. The NSA (and similar nation-state orgs) have a toolbox of zero-day vulnerabilities and similar things, that are not available to the top experts outside of the agency. Eventually, they will have enough of these tools that they won’t need a backdoor. They will be able to break encryption that no one else knows they can break.

When quantum computers become practical, it will be very easy for the NSA to develop, very quickly, a new toolbox of code-breaking that is beyond what anyone realizes even a quantum computer can do. Over time, the most secure encryption will not be secure; quantum-secure encryption will not really be quantum secure.

And the whole push to have backdoors will suddenly stop. For no apparent reason. “Oh, okay, you guys are right. We’ve changed our minds. We agree that backdoors are a bad idea. You win. Seriously. Would we lie to you?”

gordo August 14, 2019 7:38 PM

Attorney General William Barr:

The key point is that the individual’s right to privacy and the public’s right of access are two sides of the same coin. The reason we are able, as part of our basic social compact, to guarantee individuals a certain zone of privacy is precisely because the public has reserved the right to access that zone when public safety requires. If the public’s right of access is blocked, then these zones of personal privacy are converted into “law-free zones” insulated from legitimate scrutiny.

https://www.lawfareblog.com/attorney-general-delivers-address-encryption-cybersecurity-conference

From earlier this year:

Jennifer Stisa Granick, Surveillance and Cybersecurity Counsel, ACLU Speech, Privacy, and Technology Project:

Governments’ goal is to ensure that records of whatever we do and whatever we say will be available in case investigators can meet whatever justification their country requires for looking at it. This is an astounding and novel premise.

Before the internet was in widespread use, anyone making these assertions would have been laughed at. No one had the audacity to suggest that, even with a warrant, people could be required to record their conversations, talk only in a place someone could overhear, keep a travel journal, or log their reading and research. No one thought that having a confidential conversation was evidence of a guilty intent. No one thought that having a private conversation created a “zone of lawlessness,” as the Justice Department ominously put it.

Now that technology has unintentionally created exactly these kinds of surveillance windfalls, governments want to keep it that way, arguing that if they are acting lawfully, they are entitled to our private data. This is wrong. Complying with privacy laws may give our government the authority to search, but we are not obligated to ensure our private matters are there for the taking.

https://www.aclu.org/blog/privacy-technology/internet-privacy/if-government-had-its-way-everything-could-be-wiretapped

David Vandervort August 14, 2019 8:03 PM

I disagree that this is the right debate to be having. This is not security vs privacy or good security vs bad security.

What the AG is saying, basically, is, “You can have all the free speech you want, as long as we can monitor it.” This makes the debate real freedom vs fake freedom. Free speech vs approved speech.

I oppose “Monitored freedom.” I think it’s important to debate it on those terms.

Lawrence D’Oliveiro August 14, 2019 8:31 PM

@Mike Acker re the US 4th Amendment

Meanwhile, those of us in the rest of the world are concerned about collection of our private information by any other parties who might exploit that information, whether they are Governments or private companies.

In the US, there seems to be this fixation on what the Government does, while large, faceless, amoral megacorporations can abuse your individual human rights and get off scot-free. Particularly, it seems, if they are friends of those in Government.

Starous August 14, 2019 11:35 PM

I think he should be Barred from making any more hate speech on encryption and think harder about what he is doing. If you backdoor encryption in most cases and only use the pure version for nuclear codes and such, it will not be tested correctly and will be riddled with bugs. Breaking encryption should be hard, so it is not abused for everyday spying on citizens. There are enough bugs in the software to do that already. The big companies are investing massive amounts of cash into encryption and security, and governments don’t want to follow suit and spend money on tools and software to break it. They want to fix it at some point in time, but security measures need to go up, because the risks are constantly increasing.

Thomas August 15, 2019 1:00 AM

The real question is whether the residual risk of vulnerability resulting from incorporating a lawful access mechanism is materially greater than those already in the unmodified product.
The Department does not believe this can be demonstrated.

Shifting the burden of proof. Nice.

Particularly with respect to encryption marketed to consumers,

Replace “consumers” with “citizens” and read that again.
Sounds very different, doesn’t it?

As for the 99 vs 99.5 security, let’s just say I like made-up numbers as much as the next guy…

A few points to note:
– new/better attacks keep pushing these numbers down
– patches and updates keep pushing these numbers up
– a designed-in weakness cannot be ‘patched or updated’, so once it’s attacked and vulnerable it stays vulnerable.

“consumer cybersecurity” protects information that can be used to identify and influence people with access to “nuclear launch codes.”
Remember OPM?

Jon August 15, 2019 1:19 AM

@ Jim A
Also noted by Ollie Jones, et al.

“I suspect, though, that only the big boys (NSA, FBI, counterintelligence, etc.) have access to those back doors.”

Diminished? Howzabout obliterated, because they’re also accessible to the next Edward Snowden, as well as the next (or the same) Booz Allen Hamilton. Recall that Mr. Snowden was NOT employed by the NSA at the time he was lifting documents; he was employed by a corporate subcontractor thereof – which as a company had access to the same items (at least).

@ Clive Robinson

The problem of cyphers is a lot more serious than a few bits here or there. Assume that a Person of Interest (POI) sends an encrypted message to another POI. The message, when decrypted, reads, “The dog barks at midnight.”

Are they just complaining about their neighbor’s dogs? (My neighbors’ dogs have gotten better recently.) Or does that mean ‘attack at dawn’ or ‘all is well, continue as planned’ or ‘get out as fast as you can, burn everything’? Invading everyone’s cryptography does absolutely nothing for problems like that (as you noted).

Point here being that it’s not just a bit here or there (“The cat barks at midnight” would not make much sense) but entire meanings that can be transferred. In a way, an agreed-upon cypher is a one-time pad.

Don’t re-use it… 😉

J.

Warren August 15, 2019 2:24 AM

The elephant in the room, as ever in this debate, is how do we account for and neutralize bad actors in government who have access, but aren’t acting in everyone’s best interest?

Greg Lorriman August 15, 2019 3:27 AM

It’s not like “stop and frisk” which is a limited power. Rather it’s bugging the inside of private residences.

It’s Ceaușescu’s Romania.

And what will they eventually use it for when the authorities have just banned so-called ‘sexist’ adverts in the UK? That’s a narrow ideology of Equality being forced on to the private sphere.

Re-education camps next? “We clearly heard you say to your wife in bed at 10:45pm that men and women are different, Comrade Lorriman”.

You can already lose your job by just pointing to the science of psychological sex differences and brainscans.

Western Infidels August 15, 2019 11:55 AM

If one already has an effective level of security — say, by way of illustration, one that protects against 99 percent of foreseeable threats — is it reasonable to incur massive further costs to move slightly closer to optimality and attain a 99.5 percent level of protection? A company would not make that expenditure…

I see some sloppy thinking here.

First and most obviously, especially in this age of ransomware, yes indeed, many companies would be happy to spend more to cut their vulnerabilities in half, which is what a decrease from 1% to 0.5% is. Nitpick the numbers but the principle still stands: Whatever you mean by “99% secure,” is that really good enough for a huge target like a big bank or an international credit card company? No.
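
To make the arithmetic explicit, Barr’s own illustrative numbers describe a halving of residual exposure, not a half-point nicety:

    # Barr's illustrative figures: what changes is the residual risk.
    residual_99 = 1 - 0.99     # 1 in 100 foreseeable threats gets through
    residual_995 = 1 - 0.995   # 1 in 200 gets through
    print(residual_99 / residual_995)   # ~2.0: exposure is cut in half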

I know he dissembles by saying he’s targeting consumer-trash stuff, but consumer-trash stuff like instant messaging is already thoroughly entwined in everyone’s lives; if hackers can use IMs as part of a phishing scheme to get access to other systems, then the security of those other systems is only as good as the security of the IMs. That’s something even a biz-worshiping corporations-are-people lizard-person should agree with.

Second, very effective encryption exists and is widely deployed right now. If there are costs associated with that (but it’s cheap!), then they’re sunk costs. It’s weakening that security that would cost money at this point. If he really thinks that companies in general don’t like to spend money on security, how does he think they’ll react to the idea of spending money on weakening it?

Mansour Ganji August 15, 2019 2:07 PM

Well said, Bruce. Just to add to the part where he compared this to telecommunication lawful intercept: this is very, very different. In a telecom environment, you give the government in each country legal access to monitor a specific target’s traffic. But that doesn’t apply to services like Gmail, Telegram, WhatsApp, etc. Which government in this case is going to have access? The US government only? What about the UK? Germany? Is this going to be per-country? How do we identify which government will have access to which portion of WhatsApp traffic? And then what about the Russian government? What about the Iranian government? Are they in this game with us, or are they pariahs?

Bayard August 15, 2019 2:56 PM

Nuclear launch codes are not more valuable than consumer/citizen information. Barr condescendingly, and misleadingly, compares the safety of a nation (launch codes) to an individual’s daily mutterings. This obscures the fact that he really means EVERY citizen’s EVERY muttering – if you can spy on one you can spy on all, at any time. He thus destroys the nation through slavery while protecting the government’s right to destroy us through violence. According to the Constitution, government rights are limited in scope, and the government is not more important than the governed. As part of the Executive branch, it’s his job to enforce the Constitution; violating it should lead to impeachment.

John P August 15, 2019 3:48 PM

Mansour Ganji, check out Project Bullrun. All encrypted communications are breachable.
This conversation is nearly pointless, as FIPS 140-2 has the backdoor directly in it.
All versions of SSL implement FIPS 140-2, to my knowledge, unless you have some libraries from the Chinese or Russians I am unaware of. Here is a good article on how they break encryption:
https://projectbullrun.org/dual-ec/back-door.html
And this contains a decent diagram explaining how they calculate private keys and store ALL of them to attempt deciphering; if the key is not present they request the key from their service (i.e. the back door).

https://projectbullrun.org/dual-ec/drbg.html

Chris August 15, 2019 4:06 PM

@Bayard:

You are correct. Cast this debate any way you want (e.g. “safety vs. security”, “privacy v.s security”, “security vs. security”, ad nauseam). Barr just admitted here that only governments (well, maybe just one in particular) will be allowed to truly have secrets. That’s how this debate needs to be framed: No information possessed by or about private individuals will truly be private whereas the government will be allowed true secrecy through encryption. Free human beings should not accept this.

Dave August 15, 2019 4:51 PM

There is a whole lot of ugliness here, and it is hard to know what to say about it. CALEA came from another era, with limited technologies readily available and less security awareness, while today it is very reasonable for a teenager anywhere in the world to produce an end-to-end encryption solution, and the world is full of sophisticated government and private data aggregation and analysis operations for mundane to nefarious purposes. The distinction between “consumer services” and more critical systems such as business operations or “nuclear launch codes” is not so distinct, as information leaked by those consumer services can be aggregated and analyzed to get more critical data (e.g. “kompromat”)…

People can create open source VPNs and other solutions that will be a challenge to obstruct. To get around this, I think you’re going to have to intercept data at the mic/speaker on the phone and route it through a completely separate communication channel, outside of control via Android/iOS. Suddenly “Made in the USA” or “Made by a company with offices in the USA” becomes a hard sell. Apple might produce a separate iPhone for sale outside of the U.S., but they will have a hard time convincing customers in other countries that it is free of such back doors. Devices unlikely to have these back doors will be readily available to purchase while overseas.

Other countries will also demand access to this backdoor. How will that be managed? Most of our phones are manufactured overseas; will there be any way to verify that hostile governments do not have access to phones sold in the U.S.? Who will those governments or bribed/blackmailed officials in those governments give that access to?

If we implement the back door today with policies that are deemed acceptable and we become accustomed to it, how will those policies change in the coming years?

Overall, it seems that the risk some random hacker will find a way to exploit it is just a drop in the ocean of the mess we will find ourselves in.

65535 August 15, 2019 8:11 PM

“… Barr’s latest speech signals that we can finally move on from the fake security vs. privacy debate, and to the real security vs. security debate. I know where I stand on that: As computers continue to permeate every aspect of our lives, society, and critical infrastructure, it is much more important to ensure that they are secure from everybody — even at the cost of law enforcement access — than it is to allow access at the cost of security. Barr is wrong…” Bruce S.

I agree.

It’s highly invasive to backdoor data in transit or data in flight. It’s possible that doing so will destroy huge sectors of the US economy, including data centers, and a huge portion of people’s privacy.

I say don’t listen to Barr’s comments. He is a person with different motives than he states. He wants push-button law enforcement. Call your duly elected officials and let them know you do not want any more spying on citizens.

Next, the subject of breaking data in flight and your very recent post entitled:

Evaluating the NSA’s Telephony Metadata Program…Interesting analysis [of the PATRIOT Act]

htt ps://www.schneier.com/blog/archives/2019/08/evaluating_the_1.html

[Link broken; reassembly required]

The two subjects of using the PATRIOT Act and breaking encryption in flight are closely related. The above link mentions the review and possible sunsetting [removal] of the PATRIOT Act.

To be clear, I am for the sunsetting [removal] of the PATRIOT Act. It is a classic example of using a false title while having the text indicate a completely different meaning. It basically allows spying on citizens under false pretenses. Let the Patriot Act expire.

I will return to this site soon after I complete the cleanup of the M$ “Patch Tuesday” mess for certain clients.

Matt August 16, 2019 12:49 PM

I sometimes like to draw analogy lines to physical security (although this isn’t a perfect way to compare).

Would you support legislation requiring all safe and vault locks to have a universal master combination? It’s OK; only the government will possess the master combination.

Of course not!

Marcus August 16, 2019 1:50 PM

All systems fall short of optimality and have some residual risk of vulnerability — a point which the tech community acknowledges when they propose that law enforcement can satisfy its requirements by exploiting vulnerabilities in their products. The real question is whether the residual risk of vulnerability resulting from incorporating a lawful access mechanism is materially greater than those already in the unmodified product.

The fundamental difference between exploiting the unintended vulnerabilities in a channel designed to be secure and building in backdoor access is one of scale. In the former case, breaking the security on ten phones or planting ten shoulder-surfing hidden cameras or pulling off ten feats of social-engineering trickery is ten times more work than doing it once. In the latter case, building the system to access ten, or a thousand, or all possible surveillance targets from the comfort of an agent’s office is technically trivial.

David August 19, 2019 3:50 AM

@Marcus,

Script kiddies and bot herders would attest to that statement about vulnerabilities. The prime advantage of backdoors IMHO is persistence, because they will not be patched by manufacturers. Script kiddies and bot herders have a bigger lifecycle challenge to deal with.

@YADfiles August 26, 2019 3:17 AM

Over the past year I have made a stunning observation. I connected to several people using Signal, and within only a few days, weeks or months, about 60% were no longer reachable on Signal. When asked what happened, I got very strange replies, ranging from "iOS deletes apps that are not used often" to "no space." What the individuals don’t know – of course – is how much I am on the radar myself (read: Telegram @YADfiles), and they don’t know what happened to other Signal users.
Further, I noticed (I think it started somewhere in early 2019) that messages are sent at various speeds, an example being the length of the message I’m sending. Does this mean governments found a way to catch the message written before it is encrypted, or is the delay because of low encryption speed? I don’t know.
That said, if there were a backdoor, in my case it would definitely make my life much, much harder to manage.
Thanks for this post.
