Cellebrite Can Break Signal

Cellebrite announced that it can break Signal. (Note that the company has heavily edited its blog post, but the original—with lots of technical details—was saved by the Wayback Machine.)

News article. Slashdot post.

The whole story is puzzling. Cellebrite’s details will make it easier for the Signal developers to patch the vulnerability. So either Cellebrite believes it is so good that it can break whatever Signal does, or the original blog post was a mistake.

EDITED TO ADD (12/22): Signal’s Moxie Marlinspike takes serious issue with Cellebrite’s announcement. I have urged him to write it up, and will link to it when he does.

EDITED TO ADD (12/23): I need to apologize for this post. I finally got the chance to read all of this more carefully, and it seems that all Cellebrite is doing is reading the texts off of a phone they can already access. So this has nothing to do with Signal at all. So: never mind. False alarm. Apologies, again.

Posted on December 21, 2020 at 6:06 AM • 48 Comments


Jon Jones December 21, 2020 6:40 AM

All they have done is decrypt the locally stored data. Surely if you have access to the device running Signal, then all you need to do is start the Signal client – which is much easier! I’d be more impressed if they’d broken the transport security.

Juergen December 21, 2020 6:44 AM

No they cannot BREAK Signal – they can export locally stored messages from an unlocked (or hacked) phone.

That’s not really the same – it just saves time for the person stealing the messages, but he/she could simply open the app and forward the messages at that point, anyway.

That’s why it’s a good idea to always delete messages after X days locally (which the Android version of Signal allows).

chuck December 21, 2020 7:14 AM

I sometimes wonder if Bruce even reads the articles 🙂

It clearly was an embarrassing attempt at cheap PR. But the number of details they provided made the whole thing laughable. That’s why they deleted the original post.

P.S. And yes, everyone who’s reasonably paranoid uses auto-destruct in Signal. I have a default setting for one week for everyone. For really sensitive stuff it’s 1 day.

Cerebus December 21, 2020 8:49 AM

Shocking that the SqlCipher master key isn’t in freaking StrongBox-backed Keystore. Yes, yes, Android 9 or later, but at least this does something and is forward compatible, and it’s literally one line of code. That would at least interfere with “snag the key from storage and decrypt,” which is what Cellebrite is describing.

A StrongBox KeyStore would at least require them to run code on-handset and require user authentication and authorization to the key.

Clive Robinson December 21, 2020 9:35 AM

@ #1 Schneier Fan,

Using RAM clocked at WiFi frequencies as a very, very local air-gap crossing technology, from Ben Gurion University.

I’ve already commented on it over on the Squid page,


It’s not actually anything new discovery-wise; in fact it’s about as old as it gets. It uses the RAM busses on the motherboard for Carrier Wave (CW) transmission at a ~2.4GHz signal. In a way not much different from a “spark gap generator” used to send “Morse code” back before the Titanic took its swan song on its maiden voyage…

Due to a number of issues its average realisable signal range is going to be less than three or four meters (9-13ft) in a near-ideal arrangement, with a fairly significant inverse relationship between data rate and range.

Anyway, the best place to talk about it is over on the Squid thread for now.

TimH December 21, 2020 9:46 AM

So the reason for the heavy editing was to hide the inadvertent disclosure that the fantastic Cellebrite product wasn’t that amazing after all.

broomstick December 21, 2020 10:14 AM

I think this whole post is a troll, which is why our good friend @Pool LED light was allowed to post (note, post refers to “solar poo” on several occasions).

The Haaretz article’s subtitle refers to Signal as “the most encrypted app”, which should be an early flag about the technical quality of the rest of the article.

Moxie made @Jon Jones’ point 10 days ago:
“This (was!) an article about ‘advanced techniques’ Cellebrite uses to decode a Signal message db… on an unlocked Android device! They could have also just opened the app to look at the messages. The whole article read like amateur hour, which is I assume why they removed it.”


xcv December 21, 2020 12:10 PM


Cellebrite’s details will make it easier for the Signal developers to patch the vulnerability.

It’s gone already. Zero-days, CVEs, and Patch Tuesday.

@ Haaretz article

[Signal (?)] commonly used by journalists to communicate with sources

For the bosses (Michael Bloomberg, Rupert Murdoch, Warren Buffett etc.) to keep the Mainstream Media in line.

Meanwhile, a U.S. report revealed Friday that American school districts have also bought the firm’s [Cellebrite’s (?)] technology.

The kids have phones at school, but they are not the real target. This is adult stuff. School district community leaders are using this technology to keep the parents in line with their local government health mandates and other dictates.

xcv December 21, 2020 7:03 PM


The Viterbi algorithm is a dynamic programming algorithm for finding the most likely sequence of hidden states—called the Viterbi path—that results in a sequence of observed events, especially in the context of Markov information sources and hidden Markov models (HMM).
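For readers unfamiliar with it, the recurrence that definition describes fits in a few lines. A toy sketch in Python, with invented states and probabilities (this illustrates the algorithm only; it implies nothing about breaking ciphers):

```python
# Minimal Viterbi decoder: most likely hidden-state path for an
# observation sequence under a toy HMM (all numbers are illustrative).
def viterbi(obs, states, start_p, trans_p, emit_p):
    # best[t][s] = probability of the best path ending in state s at time t
    best = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        best.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (best[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            best[t][s] = prob
            back[t][s] = prev
    # Trace back from the most probable final state.
    state = max(states, key=lambda s: best[-1][s])
    path = [state]
    for t in range(len(obs) - 1, 0, -1):
        state = back[t][state]
        path.insert(0, state)
    return path

states = ("Rainy", "Sunny")
start = {"Rainy": 0.6, "Sunny": 0.4}
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

print(viterbi(("walk", "shop", "clean"), states, start, trans, emit))
# -> ['Sunny', 'Rainy', 'Rainy']
```

It is a shortest-path search over state sequences, nothing more; its usefulness depends entirely on having a meaningful statistical model of the hidden process.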

Not only SHA1 but AES has undoubtedly been cracked with a hidden Markov model and an appropriate “annealing” algorithm such as that of D-Wave’s quantum algorithms invented by Dr. Geordie Rose among others, Sanctuary AI, etc.

Cell phone industry techs and execs have been doing this for a long time now. It is not public, and it is not open source. Ma Bell is back, and the Sherman Antitrust Act lies in the dustbin of history along with our freedom.

ResearcherZero December 22, 2020 12:15 AM

Cellebrite – please buy our product, schools, police, anyone. Damn you NSO with your fancy pants zero click exploits.

I can’t see anything wrong with handing spyware tech over to schools, for a nominal fee, then strong-arming kids into handing over their passwords or their biometrics. Nothing could possibly go wrong here.

(Don’t go and read that click-bait story about the IT teacher abusing his trust by fixing children’s phones, then stealing the nude photos they wisely kept on their phones, which obviously I read.)

The problem with this new technology is that it’s increasingly difficult to honeypot and turn on the perpetrator, but you can learn something by intercepting the communications it makes.

Erik Prince’s ideas of a DIY CIA have developed a life of their own.
Soon everyone will be able to build off-the-shelf intelligence side-stepping capabilities, including that dodgy guy at the local school who no one did a background check on. Once kids just had to be careful near the seminary, but now those foundations are gone.


in case you get mobbed
Modmobmap is a tool aimed at retrieving information on cellular networks.


In order to maintain an uninterrupted connection to a target’s phone, the Harris software also offers the option of intentionally degrading (or “redirecting”) someone’s phone onto an inferior network, for example, knocking a connection from LTE to 2G:

build your own piece of crap

Your local high school may have something far more useful depending on what level of police state your jurisdiction has matured to, but don’t worry, you can buy your own hunk of crap off a semi-reputable web vendor if you’re willing to get yourself into a whole bunch of trouble. And if you think stealing one from a school is smart, you obviously didn’t read that these devices can be tracked.

If I was a hippy, I’d say there is some kind of planetary conjunction going on, but I’m a hick so I’m going to put my trust in those people that made up all that crap about the Mayans.

ResearcherZero December 22, 2020 1:23 AM


It’s probably highly unethical, but we could test some of these “world order 2020 666.” chips on a couple of Russian spies at one of our black sites?

haraldm December 22, 2020 4:06 AM

Copied from slashdot:

+++ BREAKING NEWS +++ Researchers have found that physical access to a device can break application security and also breach end-to-end encryption. Smartphone users and encryption experts were left stunned.

David Rudling December 22, 2020 5:31 AM

“… AES has undoubtedly been cracked …”

Really? Would you care to share your evidence for this claim?
Undoubtedly the major players like the USA, Russia, and China are working tirelessly towards this goal.
AES is still approved by the DoD for material up to Top Secret, so an indication of imminent success by any major player would mean much headless-chicken running around there and elsewhere.
Does anyone have evidence of this?
Outside of the military (and probably increasingly even there) most AES protected communications once decrypted end up immediately on a computer which is comparatively easily hacked so why bother to reach above the low hanging fruit?

Clive Robinson December 22, 2020 6:48 AM

@ David Rudling, xcv,

… most AES protected communications once decrypted end up immediately on a computer which is comparatively easily hacked so why bother to reach above the low hanging fruit?

AES implementations have been, and in many cases still are, fairly easily cracked when in use. It’s why you have to be careful when you say,

AES is still approved by the DOD for material up to Top secret so an indication of imminent success by any major player will be much headless chicken running around

The NSA has certified the “AES algorithm” for material up to Secret for “data at rest”.

Which in reality means AES is not certified “in use”, thus extra precautions for EmSec have to be taken to ensure that leakage of data, key, or functional state do not leak via timing, power, and other unspecified “side channels”.

So before anyone gets into a dust-up, can we at least agree what we are arguing about,

1, Breaks against the Algorithm.
2, Breaks against implementations.

As I’ve indicated in the past, I believe that the NSA set NIST up with the AES competition rules. That is, they very knowingly excluded any side-channel considerations from the competition, and insisted on having “made for best speed” implementations available for anyone to download. Knowing full well that,

1, Everyone would copy the code.
2, That “best speed” implementations suffer from the “Security-v-Efficiency” problem.

Thus they would, more than likely, when “in use”, haemorrhage data, key, or functional state, due to the use of “loop unrolling”, “branch timing”, etc, etc. Which is exactly what happened. Now, two decades later, you can still find very flakey implementations of AES that leak sufficient information for compromise in new products.
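A tiny illustration of the class of leak being described (this is a generic example, not AES itself and not Cellebrite’s method): an early-exit comparison reveals, through its running time, how many leading bytes of a guess are correct, which is why careful crypto code pays the efficiency cost of constant-time operations.

```python
import hmac

# The classic shape of a timing leak: an early-exit comparison whose
# running time depends on how many leading bytes of the guess match.
def leaky_equals(secret: bytes, guess: bytes) -> bool:
    if len(secret) != len(guess):
        return False
    for a, b in zip(secret, guess):
        if a != b:          # bails out at the first mismatch: a timing channel
            return False
    return True

# The "Security-v-Efficiency" fix: do the same work no matter where
# (or whether) the bytes differ.
def constant_time_equals(secret: bytes, guess: bytes) -> bool:
    return hmac.compare_digest(secret, guess)
```

The two functions return identical results; only their timing behaviour differs, and that difference is exactly what a side-channel attacker measures.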

This is made worse by the way some people use the AES algorithm, by either inappropriate “mode usage” or home-made modes or algorithms for the likes of “key generation” from user input. That is, the designers of such systems are fully paid-up members of the “Magic Pixie Dust” sprinklers club.

Key derivation from “master secrets” is hard at the best of times. Protecting the “master secret” under human agency is even harder, if not impossible, once access to the storage device is obtained.

Thus one way of breaking an entire crypto system is to get the “master key”. So knowing how the system keys are derived, and how human memory is fairly useless at storing high-entropy information… it’s not difficult to realise where a successful attack against an AES-protected system could be fruitfully applied.
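To make the key-derivation point concrete, here is one standard construction for stretching a human passphrase into a symmetric key. This is a generic sketch (the passphrase and parameters are made up, and it is not Signal’s or SQLCipher’s actual scheme). Note what it cannot do: a slow, salted KDF raises the attacker’s per-guess cost, but it cannot add entropy the passphrase never had.

```python
import hashlib
import os

# Derive a 256-bit key from a human-memorable secret.
salt = os.urandom(16)                     # stored alongside the ciphertext
key = hashlib.pbkdf2_hmac(
    "sha256",
    b"correct horse battery staple",      # illustrative passphrase
    salt,
    iterations=600_000,                   # deliberately slow per guess
    dklen=32,
)
print(len(key))                           # 32 bytes = 256 bits
```

A guessable passphrase still yields a guessable key, which is exactly why “get the master key” attacks target the human-input end of the system rather than the cipher.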

Rj December 22, 2020 7:01 AM

Has anyone considered that our “pool boy” friend (and others like him making similarly off-topic spammy posts) could be posting steganographically contaminated noise to this blog to see if anyone can extract the hidden message?

Or can we prove that it is just a brain dead bot doing its thing?

electrolytic capacitor December 22, 2020 8:55 AM

@Clive Robinson

It’s worse.
During weeks of lockdown, I forced myself to wade through several hundred pounds of magazine articles, white papers and sales literature, anything to do with information security, accumulated over a period of about ten years. It had to go, but some was still worth keeping. At one point I intended to write a book.
All this gave me a unique perspective on the thinking in security, and how it has evolved. Some was horribly wrong, even embarrassingly wrong in hindsight. Back then all you really needed was a longer, stronger password. Long before “zero trust” was ever heard of, and before microsegmentation was discussed, encryption was seen as the end-all to security problems. All you had to do was encrypt everything.
But then I began to see something else, something that was never actually put into words by anyone. Encryption has caused a lot of problems.
Encryption is seen as a mysterious and impenetrable topic by many, and those who throw around terms the way investors do, don’t really understand it. They just use the words.
The result is that no one wants to touch it: if it seems to work, don’t ever change anything. And this thinking spread like mold into all of security. It is perpetuated by many things, such as citing various instances in the past where someone rolled their own and it really was bad. So a fixation on bad encryption has actually caused decision makers to think “don’t change anything, ever.”
I also saw the thinking that compliance is the end-all. Don’t really care what happens, as long as I don’t get blamed for it.
Encryption is really, really hard. The big reason is that the information is still in the ciphertext. All the plaintext is still in the ciphertext. Think about how hard it is to jumble it up, key or no key, so that you can’t derive anything useful from the ciphertext. It’s hard. And so the attitude becomes: if it ain’t broke, don’t fix it; make everything revolve around encryption.
And this doesn’t even touch the subject of an industry-wide obsession with monitoring, and how encryption is a hair ball.

electrolytic capacitor December 22, 2020 9:21 AM

See techrepublic article “Cybersecurity pros: Are humans really the weakest link?”
Link https://www.techrepublic.com/article/cybersecurity-pros-are-humans-really-the-weakest-link/

“People often represent the weakest link in the security chain and are chronically responsible for the failure of security systems.” This quote is from the book Secrets and Lies: Digital Security in a Networked World, written by well-known cybersecurity expert Bruce Schneier and first published in 2000.

The article makes some good points, but attempts to refute the quote as quaint and not true.

What they should have said is that poor design and engineering is what MAKES humans the weakest link. Others have used the same examples. What if auto manufacturers turned out cars that, if you tuned the radio to a particular station and were driving at a certain speed, and then just happened to turn to the left all at the same time, the car would blow up? The security industry would say “those stupid idiots, we told them not to do that.” Who would put up with that?

MikeA December 22, 2020 10:54 AM


I thought the problem of those pesky kids having discovered your evil plot that you accidentally posted on your site had been solved. Just fiddle with robots.txt and the file never existed, right? (Any purported copies were of course Fake News)

Have things changed?

florar December 23, 2020 11:10 AM

it seems that all Cellebrite is doing is reading the texts off of a phone they can already access. So this has nothing to do with Signal at all.

It’s not necessarily “nothing”. On this blog, it’s been said that “data is a toxic asset”, and it’s my experience that a lot of programs store data unnecessarily. For example, every web browser for some reason wants to store a full list of sites I’ve visited (WHY!?—for 25 years I’ve been turning that off and never missed it). Shells want to store every command I’ve ever typed. I don’t know about Signal specifically. Is it storing messages by default? That’s a risk, and shouldn’t be done unless explicitly requested by the user.

lurker December 23, 2020 12:51 PM


every web browser for some reason wants to store a full list of sites I’ve visited (WHY!?….)

I’ve found the “History” list to be handy at times, wanting to go back to a tab already closed. But for years I’ve had a logout script that purged history, cookies, and Flash storage back in the day, plus an ongoing game of tag finding where the new version of the browser hides “local storage”.

Shells want to store every command I’ve ever typed.

I still don’t understand why this persists beyond the end of each Shell session…

Sick December 23, 2020 4:45 PM

There is no need for an apology.

Signal IS insecure, and the fact you can grab keys from the device it runs on deserved the highlight it got. Marlinspike’s ego can take a lot more reminders that he thrives on gullible people and orgs funding a product that never delivers what it is meant to.

To elaborate, the fact is that too many people believe a mobile device is inviolable during runtime. None are designed to be, while Signal pretends and sells fake security to the gullible. What else are donations and organizations doing but funding ego-driven code monkeys pretending their code delivers good security?

Bypassing the so-called secure messaging of Signal is, by Signal’s design, inherently easy.

The fact is, neither Android nor Apple want or offer secure systems. Modern cell phones aren’t designed to be secure to begin with; offering apps with purported end-to-end encryption which don’t actually make any difference to those who really want to know borders on criminal deception.

I suppose the TLAs of any and all nations out there have a big laugh at Signal’s expense these days; all of them can easily read anything written through or with the Signal app, and will keep doing so at any time.

maqp December 23, 2020 6:18 PM


“Signal IS insecure and the fact you can grab keys from the device it runs on deserved the highlight it got.”

Signal is not insecure just because the endpoint can be exploited. Signal is working perfectly fine within the limitations of the hardware. Signal developers aren’t lying or making excessive claims about what the SW is capable of. If you need more protection from remote exploitation, you need something like TFC, and it’s obvious not everyone’s going to invest ~$500 worth of hardware just so they can carry around two extra laptops etc. in case they need to send NSA-proof messages.

Happy Holidays,

The guy who wrote TFC

no.name December 24, 2020 1:16 PM

Of course C/brite can break Signal. Same goes for Threema & Telegram. It’s just like smartphones. Seems everybody here forgot the “Hacking Team” incident. Same applies to Protonmail with the meme e2e encryption. And when I post the John McAfee Twitter link, Mr Know It All aka “teacher please notice me – I always write the FIRST comment Chen Weihua” writes a cynical comment.

Here again McAfee’s Tweet: https://mobile.twitter.com/officialmcafee/status/1218492518003810304.

Bruce & Mr Drivelman really should know better. Wake up, crybabies.

no.name December 24, 2020 1:35 PM

It would have been more purposeful to write about the new paper “Data Security on Mobile Devices: Current State of the Art, Open Problems, and Proposed Solutions” https://archive.is/n7Xfo rather than mentioning stuff we have all known.

Avoid smartphones. And for your e-mail you can use cock.li plus an old  PGP version. It’s free & still werks on Win 7.

But yeah, I know. Our “engineer” probably knows again better.

no.name December 24, 2020 2:07 PM

That archive URL to inside-it.ch should work. archive.is/n7Xfo

And here Matthew Green on that topic. archive.is/BbOAe

Me December 25, 2020 3:59 PM

Thanks for apologising, but please update the completely misleading headline.

I absolutely second this.

Goat December 26, 2020 9:48 AM

@Bruce, the incident of this blog post worries me: we are being inclined to say more than we read, to act more than we think. I saw this happen with me myself when I replied to some comments without gaining all the facts (due to skimming).

This feels like an information epidemic to me and the only solution that I can look into is reading books and forcing a read. 🙁

diOde January 9, 2021 4:28 PM

@ALL, @ResearcherZero

some great blog work here regardless, thanks.

“it seems that all Cellebrite is doing is reading the texts off of a phone they can already access. So this has nothing to do with Signal at all”.

But that is a principal flaw, considering the huge governmental and corporate investment currently being poured out against the client device, highly incentivized and too big to stop.

Thus, the only remaining fig leaf is crypto added on an air-gap, and transferred via some one-way process: thereby bypassing endpoint surveillance, device spyware, “trusted” key negotiations, and tomorrow’s brute force.

Rachel January 20, 2021 10:02 PM

Moxie Marlinspike was interviewed by Joe Rogan quite recently.
Perhaps inspired by Snowden’s two recent appearances.

Serious question. Is the grassroots Bay Area Black Flag-listening cypherpunk facade just incredibly useful marketing?

Can Moxie Marlinspike be trusted?

I’d really like to know what you think. Can he?

Matthew Green has written recently on Signal.


Elephant in the room: Signal has enabled a backup feature without informing users. Creating a PIN to protect your account actually sets into motion a backup process!

quote from the end of the article:

More concretely: a few weeks ago Signal began nagging users to create a PIN code. The app itself didn’t really explain well that setting this PIN would start SVR backups. Many people I spoke to said they believed that the PIN was for protecting local storage, or to protect their account from hijacking.

And Signal didn’t just ask for this PIN. It followed a common “dark pattern” born in Silicon Valley of basically forcing users to add the PIN, first by nagging them repeatedly and then ultimately by blocking access to the entire app with a giant modal dialog.

Love to all, and a hug for Wael and Clive particularly xoxox

Clive Robinson January 21, 2021 3:02 AM

@ Rachel,

Can Moxie Marlinspike be trusted?

Wrong question… Try,

“Why should Moxie be trusted?”

To which the answer, in a security setting, is effectively “NO”.

This is not an assessment of “Moxie” in human terms, but of “any entity in a security system”.

The same applies from the LED on the front panel of a box[1] or even “apparently passive” components[2] upwards, and that includes the box itself[4].

So you quickly realise components / entities in a system are not secure; it’s how they work together to mitigate their inherent insecurities that makes a system secure. To do this requires knowledge, thoughtfulness, insight and above all enforceable control.

So the next question you should ask is,

“If an entity can betray you, how do you prevent or mitigate the betrayal?”

And the answer to that boils down to,

“What is the root of trust?”

Where “trust” is in “security terms” not “human terms”.

And in the case of “Signal”, its “developers”, and the “business model”, you have no input, no control, and other people such as US legislators and judiciary have it all. So the penultimate question is,

“What effective control do you have on US Government entities?”

To which the sensible answer even for a President is “none”.

Oh and the ultimate question is,

“What can I do about it?”

And that opens a whole new discussion…

But at the end of the day, when people are involved as Benjamin Franklin noted,

“Three can keep a secret, if two of them are dead.”

The point most miss about that quote is “the three” are what we now call,

Alice, Bob and Eve…

And though Alice and Bob may come and go with time, there is always an Eve hanging around, watching, listening, noting, and nodding hiding behind lace curtains with a glass up against the wall or the more modern technological equivalents.

Does knowing this make people feel any better? Nope, will they take heed of it? In general nope, the very opposite in fact.

The reason is few see reality for what it is; thus they take comfort in the cognitive bias or illusion of thinking it gives them control, thus power. So “Snake Oil Sells”, and as I’ve repeatedly noted,

“None of the current secure message apps are even close to being secure.”

And that’s not just on commodity hardware, using commodity OSs. They are fundamentally flawed by design.

[1] Yes, the LED on the front panel of a box can betray you, because somebody wants to minimise components or physical size or a whole bunch of other design-time issues. In short, they want to show some system functional state to the user/operator, so they put an LED on the signal line via a current-limit resistor, job done. Only that LED has a very fast response time, and the resistor does not store energy so does not integrate, or “low pass filter”, the signal; thus the LED transmits optically everything going on on that signal line… Not good ever, but really bad as it was with a NATO-level signal encryptor where it was in effect connected to the output of the KeyStream generator…

[2] When you get taught electronics, as in physics, you get taught a series of more accurate lies as you progress. We call them things like “first approximations”, but the reality is they are simplifications to some ideal where the principle or mathematics is easy to get over to the student, so they can learn something useful in a bite-sized lump and move forward[3]. So passive components like resistors, capacitors, and inductors get taught as being “ideal”, so you learn how capacitors and inductors store energy by the movement of charge and how they are “lossless”. Which unfortunately is not true: inductors are a coil of wire, often around some ferromagnetic or similar core, and if you put a pulse of energy in them that coil, being in a magnetic field, will behave like a tiny, tiny motor and will move. We call it magnetostriction, which means in short it mechanically vibrates in relation to the charge moving through the coil, and it thus also acts as a very small speaker, thus radiates out sound waves… Thus the operation of the circuit is radiated out acoustically, oh and as an EM field as well, as coils are also antennas of a form[4].

[3] This is the “grab the bull by the horns” teaching method. Or to put it another way: if I toss a pound of beef mince at you gently, you can fairly easily learn how to make burgers out of it… But if I instead chuck you in the bull ring with an angry, frightened, thus enraged bull, learning how to make a burger well is going to be a tad harder.

[4] What is an antenna? Well, in all honesty they can be anything, including a stack of plastic disks of differing dielectrics. What they do, simplistically, is take a movement of charge and turn it into an EM signal that radiates away. Surprisingly to many, you do not need a conductor to move charge, as the belt on a “van de Graaff generator” rather graphically shows. The traditional notion of an antenna is a length of wire, but due to what is half-jokingly called “the law of inverses or mirrors”, if you cut a slot in a sheet of metal it too will act as an antenna. Thus all the joints at box edges, all the ventilation and other holes for displays, indicators, switches, cables etc can act as antennas. Oh, and as the size of the hole goes down, the bandwidth it has goes up… So not exactly intuitive, or something you want when you are trying to keep things “inside the box”.

Rachel January 22, 2021 4:59 PM

Hi Clive

sincere thanks
it was clearly, only, a United States-North American who settled on the name ‘Eve’ to describe that particular party! Bob and Alice on the other hand are innocuous white bread.

while, as usual, your arguments are convincing, I am unclear about what appears to me as a couple of leaps [over the pond]

I refer to the leap from Signal to ‘US legislators’.
Of course, you mean the comms channel, as we are not on a Signal LAN.

I’m also unclear on how the business model is a factor. Signal has virtually no business model!

Your most salient and lucid point for me is ‘the user requires input and control’. This is insightful and actionable.

so much is discussed in absolute terms on this blog.

it immediately becomes a zero-sum game, at least by perception, and many surely give up in frustration and resignation, decide to ‘let them eat my cake’, and just carry on communicating insecurely as ever, because, it is taught, it’s too difficult to achieve anything else. So, give up. The more you learn, the more you realise a safer level of comms that is actually practical really cannot be achieved.

for the layperson, mitigating and minimising broad, overarching electronic surveillance is going to have a degree of usefulness.

Yet the absolute terms as described on this blog generally discuss insecurity in terms of a targeted individual. Which we know is Goliath vs a lame mouse. Different story. Very few people find themselves in that ‘targeted’ situation.

So for mitigating universal surveillance (deliberately avoiding the offensive, legalese euphemisms), there are surely ‘less insecure’ means, leading us to the question of, for example, Signal instead of SMS or a commodity app.

How many of my dinner dates, colleagues, clients, students are going to use a one-time pad with me? I’ll give you a hint. The Indians and the Arabs like to claim first dibs, but the score has been settled.

(I liked your idea about discussing sports as a code. It does still require a minimum of two people who know about the same sport.)

Clive Robinson January 22, 2021 6:42 PM

@ Rachel,

I am unclear about what appears to me as a couple of leaps

I think you are refering to this,

“And in the case of “Signal”, its “developers”, and the “business model”, you have no input, no control, and other people such as US legislators and judiciary have it all.”

As far as Moxie and friends are concerned, they want to eat and have a roof over their heads, as well as a place to work, and to pay for the server(s) Signal uses. Thus they have to have some kind of business model to do the above, even if it’s only “pass the hat around” at the bar on a Friday night.

All of which is done or run from the US, which means US legislators and Law Enforcement have a degree of control you and I will never have over Moxie, Signal, and its effective employees. Being stuck in jail under Special Administrative Measures, fully incommunicado, because the head of the US DoJ (the AG) decides that’s what is going to happen to you is a fairly powerful incentive to do what the AG asks. After all, can you see Moxie or any of the others taking an extended out-of-the-country sojourn like Ed Snowden, or facing the treatment Julian Assange went through? Then there was Lavabit and its owner Ladar Levison, if you remember back to 2013 (though they appear to be back in business with a much more P2P, E2EE system, so there is less of a target in the middle).

With regards,

How many of my dinner dates, colleagues, clients, students, is going to use a one time pad with me?

Oh, somewhere between zero and none I should think, unless they are crypto geeks. But then that’s just the start of the issues involved with ensuring end-point security. In turn that’s before the very thorny subject of KeyMan in all its many and messy parts.

Some of the reasons I tend to talk about OTPs is,

1, They require no technology.
2, The security model is easy to understand, or quickly explainable, including the proof.
3, Conceptually it is easy to see how the proof of security transfers across into other activities, such as reliable authentication to the 2nd party whilst being deniable to third parties.
4, It’s also easy to see how they can be used for making secure covert channels in plaintext, thus having deniability to third parties.
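The “no technology” point is real: the whole scheme is a single XOR, as this minimal sketch shows. Every caveat in the comments is a property the security proof depends on.

```python
import secrets

# One-time pad: XOR the message with a pad that is truly random,
# kept secret, exactly message-length, and NEVER reused.
# Decryption is the same operation, because XOR is its own inverse.
def otp(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data), "pad must cover the whole message"
    return bytes(d ^ p for d, p in zip(data, pad))

msg = b"MEET AT DAWN"
pad = secrets.token_bytes(len(msg))   # one fresh random byte per message byte
ct = otp(msg, pad)
assert otp(ct, pad) == msg            # round-trips back to the plaintext
```

The hard part, as the surrounding discussion notes, is not this code but key management: generating, distributing, and destroying pad material, which is exactly the “KeyMan” problem.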

Would I use them in “real life”? I guess only if I had to, or if I could use them instead of PK certs to transfer a crypto key and IV securely, so a Crypto Secure Key Stream Generator (CS-PRNG / CS-SKSG) could be primed/seeded etc.

For “usability” what we really need is a “token” that uses a secure smart card, to make life easier both in design and adaptability. However, I’ve talked about the issues with tokens since the mid 1990’s, some years after being the idiot who suggested not just the idea of using SMS messaging as a usable side channel, but how to get around some of the “SMS is a secondary service” issues… Not, as they say, “one of my finer moments”, though it seemed a good idea at the time.

