Signal Will Leave the UK Rather Than Add a Backdoor

Totally expected, but still good to hear:

Onstage at TechCrunch Disrupt 2023, Meredith Whittaker, the president of the Signal Foundation, which maintains the nonprofit Signal messaging app, reaffirmed that Signal would leave the U.K. if the country’s recently passed Online Safety Bill forced Signal to build “backdoors” into its end-to-end encryption.

“We would leave the U.K. or any jurisdiction if it came down to the choice between backdooring our encryption and betraying the people who count on us for privacy, or leaving,” Whittaker said. “And that’s never not true.”

Posted on September 26, 2023 at 7:15 AM • 25 Comments

Comments

JonKnowsNothing September 26, 2023 8:16 AM

The front door is working pretty well…

===

HAIL Warning

ht tps://arstechnica.c o m/tech-policy/2023/09/sec-obtains-wall-street-firms-private-chats-in-probe-of-whatsapp-signal-use/

  • SEC obtains Wall Street firms’ private chats in probe of WhatsApp, Signal use
  • The US Securities and Exchange Commission has “collected thousands of staff messages from more than a dozen major investment companies” as it expands a probe into how employees and executives at Wall Street firms use private messaging platforms such as WhatsApp and Signal
  • The executives gave their personal phones and other devices to their employers or lawyers to be copied, and messages discussing business have been handed to the SEC

(url fractured)

Winter September 26, 2023 8:57 AM

@JonKnowsNothing

The executives gave their personal phones and other devices to their employers or lawyers to be copied, and messages discussing business have been handed to the SEC

Another example of $5 wrench cryptography
‘https://xkcd.com/538

Clive Robinson September 26, 2023 10:05 AM

@ Bruce, ALL,

It needs to be recognised by all who value the few freedoms we currently have left, and more so those freedoms we have lost this past quarter century and are steadily losing to those of ill intent masquerading as those who would protect society by destroying it: we either prevail against them or perish under them, and the choice is ours.

Two things have to be recognised by everyone who wants even a shred of privacy,

1, If the encryption end points are not fully in your control, then you have no privacy.

2, As long as the encryption process acts as a distinguisher, you can always be betrayed by the other party.

There are solutions to these two problems, some of which have been known for millennia, others for “three score years and fifteen”, whilst others for less than a lifetime.

Unless we fight back whilst we still can, in ways that cannot be stopped, we will lose and perish.

It’s all very well for the president of the Signal Foundation Meredith Whittaker to stand up and say,

“We would leave the U.K. or any jurisdiction if it came down to the choice between backdooring our encryption and betraying the people who count on us for privacy, or leaving”

Those that wish to take the freedoms of those in the UK must be crowing with delight at this because Meredith Whittaker has promised them what they want, for next to nothing.

For Signal to quit the UK might sound grandiose, but it’s actually saying,

“If you threaten us we will retreat”

Meredith Whittaker might as well stand at the foot of the stairs to an aircraft at Croydon Airport waving a piece of paper in the air saying “peace in our time”…

That is not the way to peace but interminable war as history repeatedly shows.

The only way to prevent such bullying behaviour is to fight back with an overwhelming response, such that those who choose to destroy society by oppression find out they are not “strong men commanding” but in reality “craven cowards trying and failing to terrorise” others.

The way to defeat such cowards is by education, teach people how to defend the privacy of their own communications beyond the reach of such cowards.

Such education does not start by turning around and walking away, as Meredith Whittaker proposes to do.

Winter September 26, 2023 10:51 AM

@Clive

Those that wish to take the freedoms of those in the UK must be crowing with delight at this because Meredith Whittaker has promised them what they want, for next to nothing.

What can Signal do? Break the law? Signal is not a British company. Ms Whittaker cannot single-handedly undo the folly of the British people, neither should she be able to do that. The British voted for this government, and did so quite decisively.

In the end, a people get the government they deserve. That might be seen as a very harsh punishment, but no way has been found to prevent it.

Winter September 26, 2023 11:16 AM

@Perry

“Don’t Trust Those ‘Secure’ Messaging Apps,”

The first thing the devil will tell you is not to trust other people. Every TLA will tell you not to protect yourself.

If end-to-end encryption were worthless, why are governments all over the world working so hard to outlaw it? Why does the UK government go to all the trouble and bad press of outlawing E2E encryption if it does not matter?

Every lock can be broken, but not using locks is still stupid.

Josh Z. Tillman September 26, 2023 12:03 PM

I don’t really see how Signal leaving the U.K. helps anyone in its government. They wanted a backdoor, and it seems they won’t be getting what they want. The citizens will still get privacy using the now-foreign app. If the U.K. bans it from app stores, people can build it from source and install it themselves, unless they were foolish enough to buy a phone that withholds that ability from them.

With regard to people getting access to messages, an important and often overlooked part of privacy is to avoid keeping records that could harm you—in other words, avoid keeping any records unless there’s very good reason to save them. I don’t know what Signal’s defaults are here, but a lot of software wants to save everything and makes it hard to avoid that. Programs will pollute your registry and/or filesystem with most-recently-used-file lists, histories of everything you’ve sent or received, maybe every command you’ve entered (cf. Unix shells and text-editors); often, there’s no easy way to disable this stuff, and usually there’s no way to advise your counterparty’s software to auto-delete an exchange (unless the person specifically saves it). It doesn’t matter how good your encryption is if you’ll give up the key to avoid prison.

Signal was originally based on the “Off-the-Record Messaging (OTR)” protocol. It’s rather pointless to pretend to be “off the record” if each party is saving their own records. It’s impossible to cryptographically prove those records are legitimate, but has this interesting mathematical property ever saved anyone’s ass in court? It doesn’t seem that the courts care at all. It used to be that when people communicated, nobody could ever prove they talked, let alone what they said. If they’d exchanged paper letters, burning was an easy and intuitive way to ensure nobody’d be able to read them (postal offices didn’t keep “envelope records” till recently). If they’d used a local phone call, it produced no records whatsoever till maybe the 1990s, unless the authorities had already gotten a warrant. But now people brag that they have like 25 years of e-mail history. I feel nervous when communicating with people digitally, because I know that anything I say could live on for decades. Other people, though, seem to have no concerns with talking about (traditionally) extremely private stuff.

Clive Robinson September 26, 2023 12:47 PM

@ Winter, ALL,

“What can Signal do? Break the law?”

First you should work out,

1, Who’s law?
2, What is the real jurisdiction?

Let me see “British Law” limited to “British jurisdiction”. Hmmm…

Not US law in US jurisdiction, or any other sovereign nation/state or federation of states.

As you note in your very next sentence,

“Signal is not a British company.”

It does not have to be in Britain for people in Britain to use its products…

That has been accepted international law for a couple of centuries now.

If the British Government wants to stop people in Britain using Signal’s products, that duty does not fall on Signal, but on the British Government and the British Government’s resources.

The problem is the British Government cannot stop people using Signal even if they wanted to. They can try… but they will always fail, provided the people are sufficiently educated / knowledgeable.

The more people that succeed, the less likely the UK Gov can do anything about it legally. After all, “The Great British Firewall” is already failing miserably.

But the thing people should really wake up to is that all those “think of the children” etc. “dog whistles” are pure nonsense.

The law will not stop those sorts of “OMG people”; they might lose a few, but the others will “get smart”. As the older saying has it,

“If you criminalize technology, then only criminals will have it.”

Look on it as an evolutionary process that does not favour the more catholic or retrograde in nature.

But a few Prime Ministers ago the British Government signed up to a partnership deal, amongst which is an “Interstate Dispute Resolution” process… The UK Government could get taken to an international court. Judging by what happened with Australia and tobacco a few years back[1], the UK Government could be making a world of hurt for themselves.

But on a side note, what worries me is that over the past few months or so your outlook has become increasingly parochial… which means you are giving increasingly bad information to people outside of your parish, as though you think your parish is lord and master of all. I’ve bad news for you: even the USA does not have that power, even though it might claim it does.

[1] Whilst Australia finally won the “plain packaging” case against big tobacco, it took a decade to get there,

https://theconversation.com/big-tobacco-v-australia-taking-the-battle-to-the-global-stage-2027

The reason the Australian Government won was that big tobacco were using the wrong arguments. That is, big tobacco were not being stopped from making their product available, nor was the Australian Government stopping competition amongst tobacco companies.

https://www.afr.com/world/europe/australia-wins-tobacco-case-at-world-trade-organisation-20200610-p5511o

However, if the UK Gov “stops the use” of Signal or Signal advertising into the UK, then that is open to being found in breach of the trade treaty. Something tells me, with the precarious state of UK trade since Brexit, they might not want the publicity an Investor State dispute would raise.

Winter September 26, 2023 1:20 PM

@Clive

It does not have to be in Britain for people in Britain to use its products…

But they cannot have a UK presence while supplying goods that are illegal in the UK. Nor can they do business in the UK, or with UK citizens, while breaking UK law.

Signal also would like to prevent employees from receiving international arrest warrants.

It is one thing to “break” Russian, Iranian, or Chinese law when you can stay out of reach. It is another to do so when you live in the neighborhood and might end up in a UK airport on a layover.

And why should Signal personnel take the risks? If UK citizens want Signal, they can organize it themselves. It is open source.

unixjunk1e September 26, 2023 1:53 PM

I read once that Signal is really popular with law enforcement personnel, so it’d be interesting to see how the rhetoric shakes out from their end on backdooring law enforcement’s own private comms? I wonder whether it makes a difference when their work day’s side-channel banter is the content that becomes open to inspection, “for their protection”.

Bob Paddock September 26, 2023 4:27 PM

@Josh Z. Tillman

“With regard to people getting access to messages, an important and underlooked part of privacy is to avoid keeping records that could harm you—in other words, avoid keeping any records unless there’s very good reason to save them.”

That leads us directly to “2028 – A Dystopian Story” by Jack Ganssle.

‘http://www.ganssle.com/articles/2028adystopianstory.htm

Where the US Federal Rule 26 about disclosure of evidence comes up, which I have detailed in the past:

‘https://www.schneier.com/blog/archives/2022/08/a-taxonomy-of-access-control.html/#comment-408943

To summarize that: when nothing is written down, we end up in a world where no one remembers how anything works.

iAPX September 27, 2023 6:12 AM

Signal initially wanted to please those who want to spy on us, and did so by requiring your phone number, thus linking that number to your Signal exchanges, effectively removing anonymity for many people using it.

Yes, there are ways to get access to burner numbers, but that’s not how the masses register their accounts.

Secondly, Signal chose to use regular HTTPS for communication, effectively enabling state-level agencies to have access to these phone numbers and link them to the public IP address used.

Only message contents are end-to-end encrypted…

That’s a lot of metadata exposed to some agencies, in fact exactly what they use to identify networks of people.

There are a lot of blind spots in Signal, or deliberate choices, that make it law-enforcement friendly instead of a danger to them, and also mass-surveillance friendly.

Josh Z. Tillman September 27, 2023 1:02 PM

@ Bob Paddock,

To summarize that: when nothing is written down, we end up in a world where no one remembers how anything works.

I didn’t say “write nothing down”, though. I was talking mostly about communication records stored “by default” being a bad idea. In my experience, such records are just about the worst way to find information. People pretend like they can pull useful information from saved e-mails, chat logs, forum posts, etc.; sometimes they get lucky, but mostly it’s only useful to lawyers (billing by the hour for this service).

At companies, remember that new employees don’t generally have access to the “important e-mails” stored by other employees. They may have access to chat logs via something like Discord, but digging stuff from those is rarely a pleasant experience. It’s almost unheard of to automatically store or transcribe voice calls, with the exception of customer calls, unless legally required to. In-person communication is almost invariably a major loophole (perhaps the last remaining one) to even the most strict retention requirements.

Wikis are the best system I’ve found for managing information, especially in a corporate environment. Because, typically, nobody wants to or knows how to get the corporate bureaucracy to issue an official document. I’ve always tried to get new hires to put any useful “word-of-mouth” information they learn into the wiki, if it’s not already there (the wiki, not a bunch of separate ones for different departments or projects). The same goes for any other employee who asks or answers a question that’s likely to come up in the future, and isn’t yet covered: add it. Sometimes that’s as simple as pasting an e-mail into there. Things like “where do we keep the pens?”, “how can I make a git checkout run faster?” (which don’t really need official policy documents); or “can you help me understand this complex FooBar() function?” (which probably should be in a design document, or means the function should be re-written, but let’s be realistic). Also spend some time linking and organizing stuff.

The key is that one should store information intentionally, rather than relying on “accidental storage”. Those auto-generated records have legal risks, sure, but an even bigger risk is that it just doesn’t work well. People clean up their mailboxes or filesystems, or drives fail, or they leave a company. Different people have different sets of information; poor organization means they waste a lot of time trying to find stuff, and often can’t find information they do have. The company switches to a shiny new cloud system and few people remember how to access information from 2 systems ago.

One could argue that maybe programs like Signal should, by default, ask for consent to save logs for some short period of time—perhaps 24 hours or a week. Mostly so that if someone wakes up tomorrow and thinks “that would be a good thing to save”, they can do it. And to keep track of on-going conversations. But, storing records forever, without prompting, is not a good default.

Frank September 27, 2023 4:35 PM

Could Google (Android) and Apple (iOS) be forced to ‘backdoor’ devices through ‘client-side scanning’ embedded at the OS level? Thereby, in the UK Gov’s view, the messaging apps’ E2EE would not be ‘compromised’? Would Signal choose not to operate on any device, in any jurisdiction, with Android or iOS ‘client-side scanning’ implemented?

Clive Robinson September 27, 2023 5:38 PM

@ Josh Z. Tillman, Bob Paddock, ALL,

Re : It’s not the writing but the meaning in others hands that causes harm.

“I didn’t say “write nothing down”, though. I was talking mostly about communication records stored “by default” being a bad idea. In my experience, such records are just about the worst way to find information.”

Worse, they are often poorly contextualised or not contextualised at all, which gives those with ill intent toward you great opportunity.

Apart from making communications unobtainable, either by not storing them or by automatic deletion or a similar (unreliable) “self destruction” mechanism[1], there are two basic options.

1, Ensure “safe context and language” are stored inseparably.
2, Ensure all information is stored in a fully deniable form.

The first is at best very difficult, bordering on impossible; thus the second is perhaps the easier option, using a variation of Claude Shannon’s “Perfect Secrecy”.

Most of the readers here are aware of the “One Time Pad”: it takes plaintext and, via a simple additive process with truly random keytext, turns it into ciphertext where all messages shorter than the ciphertext are equiprobable.

What few immediately realise is that the additive nature means you can have not just multiple keytexts giving multiple messages, but, with care, you can build multiple keytexts into progressive keytexts such that the addition of another keytext makes an equally valid-looking plaintext. Thus it’s not possible for an adversary to know when they have the correct ciphertexts to give the correct plaintext.

Look on it as a variation on the more usually described “M of N key sharing” systems.

The fun thing is that the “index” is an encrypted file of integers that index the keytexts required for each plaintext. If done correctly, whatever decryption key is used on the index, it produces valid keytexts to decode the ciphertexts into valid-looking plaintexts. Thus there is effectively no distinguisher for an attacker to know if the resulting plaintexts are valid or not…

Yes there is a bit more to it but you can see from the above, enough for the averagely bright to sit down and work out their own version.
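The core trick follows directly from the additive (XOR) nature of the pad. A minimal sketch, with made-up example messages: given any ciphertext, one can always work backwards to a second keytext that “decrypts” it to a decoy of the same length, and the two keytexts are indistinguishable in principle.

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

real  = b"meet at the old bridge at nine "
decoy = b"grandma's scone recipe, page 12"
assert len(real) == len(decoy)

# Encrypt the real message with a truly random keytext.
k1 = secrets.token_bytes(len(real))
ciphertext = xor(real, k1)

# Work backwards: derive a second keytext that "decrypts"
# the very same ciphertext to the innocent decoy message.
k2 = xor(ciphertext, decoy)

assert xor(ciphertext, k1) == real
assert xor(ciphertext, k2) == decoy
```

Both keytexts are valid pads for the same ciphertext, so nothing about the ciphertext itself tells an adversary which plaintext was intended.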

[1] Destroying stored data was once relatively simple: you simply burned the paper it was written on. Later you could still burn film stock such as negatives, because they were made on nitrocellulose plastics. Even with magnetic media, burning was still possible, as were degaussing and random overwriting; some did all three, along with “ball milling” the ash-like remains in petrochemicals to dissolve, distribute and reconflagrate again. Even writable CD/DVD/Blu-ray polycarbonate and metal based disks could be made useless in a microwave or sufficiently high-temperature combustion.

But modern semiconductor memory is made and used in such a way that destroying data is sufficiently difficult that it is unlikely to be effective. Which is why it is used in aircraft “black boxes” and certain types of smart munitions. For instance, with the Flash ROM used in USB thumb drives and SSDs, deleting data does not happen, and neither does overwriting, due to “wear leveling” algorithms. Obviously there is no degaussing, and burning without a very high temperature sufficient to melt silicon is unlikely to work reliably; theory indicates that high-intensity EM fields might have some effect, but generating them… There are certain chemicals like “Chlorine trifluoride” (ClF3), already used in the semiconductor industry, that can make sand burn, but they are so dangerous to store that juggling with “blasting oil” / nitroglycerin would be many times safer.

Clive Robinson September 27, 2023 6:59 PM

@ Frank, ALL,

Re : On device rather than crypto back door.

“Could Google (Android) and Apple (iOS) be forced to ‘backdoor’ devices through use of ‘client side scanning’ embedded on the OS level?”

It’s not a “back-door” but a “front-door”; but yes, it could be mandated, though it would be easy to avoid by those who wish to.

There are three aspects to consider.

1, Technical.
2, Political.
3, Societal.

On the technical front, not all phones made in the world would have a front-door built in. Thus using one gets around the ability of the UK Gov to look at sent and received messages.

On the political front, the “undesirables” that are currently being used as “dog whistles” by the UK Government would almost immediately obtain and use such no-front-door phones. Also, as we know, there is a quite lucrative market for “secure phones” that criminals and other “undesirables” –like defence lawyers, journalists and financial market players– would pay significant sums “for the convenience of”, rather than do sensible “field-craft” OpSec. If a legal case came up that attracted sufficient public attention, and from the MSM perspective those on trial “get away with it” because they had such phones… the UK Gov loses its “arms length excuse”.

But worse is that such a front-door would be a very significant invasion of privacy, not just into communications but into everything you use your phone for. As Apple has already found out, society will not currently put up with such a level of intrusion. Worse, as governments already install front-doors on “persons of interest” phones, they get a lot of valuable information because the users do not realise (think Pegasus and the like). If every phone had a front-door, then phones would by default become untrustworthy and people would go another way. This would give rise to a call for Smart Devices that do not have any kind of wireless connectivity, or the use of old fashioned paper and pencil, and thus “easy access” by the UK Government to the “person of interest’s” thoughts etc. would be lost.

My own view is the old,

“Never eat what you don’t cook, but do cook for others”

Of basic OpSec / field-craft, but applied to technology, not comestibles.

That is, being an electronics design engineer specialising in communications and, slightly more recently, secure systems for industrial and up control systems, I “roll my own” with probably-safe parts, in ways that make them safe[1], as I’ve mentioned a few times in the past on this blog (not quite “monthly” as @Winter implied earlier today 😉).

If more people started doing this, rather than being led by a “judas goat” down the road to the slaughterhouse, then the UK and other Govs would not have anywhere near as easy a time “banishing citizens’ privacy”.

[1] Most people used to make the mistake of trusting what appears to be not just “Open” but “officially stamped” with approval in standards. As a rule,

“I don’t trust I mitigate”

I realised early on that whilst the NIST-approved AES was “probably theoretically secure” as an algorithm, I knew darn well it was “not implementation secure”, as it was known to be too easy to put side channels in. And that is exactly what happened: the fast software implementations, using amongst other things “loop unrolling” techniques, hemorrhaged information through side channels, so it was only secure for “data at rest”, thanks to the NSA acting as NIST’s technical advisor. Then there was the Dual Elliptic Curve random bit generator. It was supposedly “Crypto-Secure” (CS), but it was overly complicated and slow. Two things that raised warning signals, or “alerted my hinky feeling”, so I did not use it in my designs; I specifically rejected it. As it turns out, it had activated others’ “hinky feelings”, and they went on to show it had the potential for a back-door built in. Which, although denied by the NSA, gained sufficient suspicion that NIST had no choice but to remove it from the standard. I, however, do not trust any deterministic random bit generator that I do not fully understand. Even then I use multiple generators designed to mitigate any potential back-doors.

Josh Z. Tillman September 27, 2023 10:07 PM

@ Clive Robinson,

modern semiconductor memory is made and used in such a way that destroying data is sufficiently difficult that it is unlikely to be effective.

Speaking of Butlerian Jihads (cf. the most recent squid thread), maybe we need to open up some old Legend of Zelda cartridges to harvest their SRAM chips; 8192 bytes each! I understand this not-so-modern memory is readily erased when the power’s cut (in theory; my 33-year-old battery is somehow still working).

We could easily stick several hundred crypto keys into such a chip. And, more importantly, delete them to render certain data inaccessible. I think deletion remains within our reach, if our systems are designed for it.

Actually, I suspect such an SRAM chip could be embedded, using modern processes, into a “secure enclave” chip, and still work acceptably. And I’m pretty sure even modern flash chips can delete data, at the transistor level: every bit can be individually zeroed (but bringing them back to “1” has to be done a block at a time). We might need to expose that at the device-level interfaces, and make sure the error-correction layer doesn’t screw us. Maybe some kind of “burn-in” would allow someone to recover the data, but I doubt it’d be easy; error rates are pretty high at current densities.
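The asymmetry described above, where bits can be cleared individually but set back to “1” only by a whole-block erase, can be illustrated with a toy model (the class, sizes and values here are invented purely for illustration):

```python
# Toy model of NAND flash program/erase semantics (illustrative only):
# a program operation can only clear bits (1 -> 0); restoring 1s
# requires erasing the whole block back to all-ones.

class FlashBlock:
    def __init__(self, size: int = 16):
        self.cells = [0xFF] * size  # erased state: all bits set

    def program(self, offset: int, value: int) -> None:
        # AND models the physics: programming can only clear bits.
        self.cells[offset] &= value

    def erase(self) -> None:
        self.cells = [0xFF] * len(self.cells)

blk = FlashBlock()
blk.program(0, 0xA5)        # write a byte
assert blk.cells[0] == 0xA5
blk.program(0, 0x00)        # zero it in place: always possible
assert blk.cells[0] == 0x00
blk.program(0, 0xFF)        # try to set bits back: has no effect
assert blk.cells[0] == 0x00
blk.erase()                 # only a block erase restores the 1s
assert blk.cells[0] == 0xFF
```

This is why in-place zeroing of a key is physically plausible even where general rewriting is not, though as noted, the device would have to expose that capability through its interface.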

Clive Robinson September 27, 2023 10:56 PM

@ Josh Z. Tillman,

“We could easily stick several hundred crypto keys into such a chip.”

Depends on the crypto you use; some PubKeys are 8 kbits these days…

RAM chips should lose their memory if you use them correctly, but they can suffer a form of “burn in” if you don’t.

As for modern chips they are mostly DRAM these days as even the top SRAM speeds are mostly not needed.

The downside of DRAM has of course always been the “refresh cycles”. They draw current thus the chips munch power and so battery backup can be a bit of an issue.

“And I’m pretty sure even modern flash chips can delete data, at the transistor level: every bit can be individually zeroed (but bringing them back to “1” has to be done a block at a time).”

The problem with Flash is that it comes in two varieties, and whilst reads can be fast, writes can be, comparatively speaking, “slower than a dog with only two legs”.

The trick in many systems is to use multiple chips and interleave the addressing such that writing appears faster than it actually is.

But the basic point remains: “commercial off-the-shelf” (COTS) components used in SSDs etc. are hidden from the user by interface software which is rarely optimal for speed.

Some Flash chip designs have major block erase but many do not… Which is why it’s probably best to use encryption and just “zero the key”.
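The “zero the key” approach can be sketched as follows. This is only an illustrative stand-in: the keystream is built from SHAKE-256 for self-containment, whereas a real design would use a vetted cipher such as AES-CTR or ChaCha20 with proper nonce handling.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Illustrative stream cipher: XOR the data with a SHAKE-256
    # keystream derived from the key. NOT for production use.
    ks = hashlib.shake_256(key).digest(len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = bytearray(secrets.token_bytes(32))
stored = keystream_xor(bytes(key), b"old Signal chat log")

# While the key exists, the data is recoverable.
assert keystream_xor(bytes(key), stored) == b"old Signal chat log"

# "Delete" the data by destroying the only copy of the key.
for i in range(len(key)):
    key[i] = 0

# The ciphertext may linger on the medium forever (wear leveling,
# stale blocks), but without the key it is just random-looking bytes.
```

The point is that only the small key needs a reliably erasable home; the bulk data can sit on media that never truly forgets.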

Josh Z. Tillman September 28, 2023 10:49 AM

@ Clive Robinson,

Depends on the crypto you use; some PubKeys are 8 kbits these days…

For the purposes of deleting data by erasing its encryption key, public-key crypto is not important. 128-bit to 256-bit symmetric keys should be fine. The use of more than one would allow sub-sections of a storage device to be “deleted”, rather than just the whole thing.

Managing this as part of the flash translation layer could make it a pretty powerful feature—if we could trust firmware writers, which history shows we shouldn’t. So it’d probably need to be part of a filesystem. I’m a bit disappointed that even the “next-generation” filesystems, some of which have built-in encryption, don’t consider reliable deletion (in the face of practically adversarial block devices) to be a necessary feature. The general assumption tends to be that nobody will be able to recover deleted data because they won’t have the encryption key(s), which doesn’t really consider the legal environment.
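A filesystem-level version of the idea might look roughly like this sketch: each file gets its own key, so “secure deletion” of one file only requires destroying that file’s key. All names are hypothetical, and the XOR keystream is a stand-in for a real cipher.

```python
import hashlib
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Illustrative stand-in for a real cipher (e.g. AES-CTR).
    ks = hashlib.shake_256(key).digest(len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

class CryptoEraseFS:
    """Toy per-file crypto-erase: deleting a file destroys only its
    key, so the file's blocks become unreadable even though the FTL
    may keep stale copies of the ciphertext forever."""
    def __init__(self):
        self.keys = {}    # path -> per-file key (kept in erasable store)
        self.blocks = {}  # path -> ciphertext (never reliably erasable)

    def write(self, path: str, data: bytes) -> None:
        key = secrets.token_bytes(32)
        self.keys[path] = key
        self.blocks[path] = xor_cipher(key, data)

    def read(self, path: str) -> bytes:
        return xor_cipher(self.keys[path], self.blocks[path])

    def secure_delete(self, path: str) -> None:
        del self.keys[path]   # key gone; ciphertext is now junk

fs = CryptoEraseFS()
fs.write("/chats/2023-09-26.log", b"hello")
assert fs.read("/chats/2023-09-26.log") == b"hello"
fs.secure_delete("/chats/2023-09-26.log")
assert "/chats/2023-09-26.log" not in fs.keys
assert "/chats/2023-09-26.log" in fs.blocks  # stale ciphertext remains
```

The hard part a real implementation faces is the one the sketch glosses over: the key store itself must live on something that can actually be erased.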

SRAM is not as popular as it once was, but people still know how to make it with modern processes. I see online references to AMD’s Zen 4 processors having SRAM caches, for example. So you’re right about this being a matter of the “commercial off-the-shelf” market. It’s not that we couldn’t do these things, with enough effort.

Sean September 28, 2023 2:04 PM

Well, SRAM will retain state for a long time as voltage decays, and can generally also be persuaded to hold it using cryogenic freezing, then cutting the power pins and moving it to a new board to read it. That is the reason crypto processors, like those used in older card readers, have as part of their security a network of thin wires embedded into the potted module: made from a thin multilayer flexible PCB, using long zigzag traces that alternate power and ground, with multiple layers wrapped around, and then soldered to the inside board. This then has a battery backup, and on the inside a small low-power microcontroller that continuously monitors the state of the mesh protection and will erase the SRAM immediately on detecting tampering, either by traces being broken or shorted to each other, say by somebody attempting to dissolve the epoxy fill. Those secure enclaves typically have 64k or more of memory, not the fastest, as power consumption of SRAM is proportional to speed and also size. So the most common 6116LP3 SRAM, a whole 2k, is a power miser compared to the 6116-15, which is the fastest one. Same size, same layout, but slightly differing processing on the wafer, and a sort in production to get the lowest-power ones for the low-power market.

SRAM is expensive, but if you are using an FPGA it is doable, at the expense of using a few of the thousands of gates for each cell. If you go for a full-custom ASIC, you can put a pretty big chunk on as a single IP block on the die and work around it. Not going to be low power, but it can easily be static with no clock if needed, and blazing fast there, as all paths are on-die, and you can choose bit width easily: if you need 64 bits you simply make it 64 bits wide internally. It will still be possible to read by decapping the chip while powered; an attacker just needs a few samples to set up the process, then reads the layers using electron-beam probing while powered, via the beam-current changes. The best defence against that is to use modern stacked dies and have the SRAM in the middle of a 5-layer sandwich, and also make the outer dies, top and bottom, a large-area photodiode, integrating a massive silicon capacitor using DRAM processing to get the deep wells needed to store extra charge. That will at least provide a brief moment of on-stack power to scramble the RAM contents when exposed to light, though it is hard to distinguish between light hitting and any form of ESD- or radiation-caused blip as well.

Josh Z. Tillman September 28, 2023 6:15 PM

Protecting the RAM of live or recently-live systems, as in hardware security modules or everyday smart cards, is certainly an interesting challenge. It’s probably overkill for the use case I was considering, though, which was making sure that files deleted hours or days ago (such as old Signal chat logs) are not still available in the “garbage data” of a filesystem, readable by anyone with the encryption key—the single encryption key protecting the whole block device, which one could be legally compelled to provide.

There are tools like “wipe” meant to securely delete data, but probably not capable of it. Its manual page has a prominent June 2004 warning about journaling filesystems—and block remapping, the substantially-less-worrying predecessor of flash translation layers (hard disk block remapping is somewhat rare, while FTLs shuffle data frequently). It says “Per‐file secure deletion is better implemented in the operating system”. As far as I know, in the intervening 19 years, no popular and modern filesystem (or device mapper) has actually done that. ext4 purports to have, but it uses no crypto and just trusts the device to delete the data.

Clive Robinson September 28, 2023 8:48 PM

@ Sean, Josh Z. Tillman, ALL,

“Well SRAM will retain state for a long time as voltage decays, and can generally also be persuaded to hold it using cryogenic freezing, then cutting the power pins, and moving to a new board to read it.”

Actually from experience I know you don’t need to remove the chip or even cut the power lines…

The easiest way is to activate the “HLT” or “RST” line on the CPU and the chip-select pin on the SRAM chip, all of which back in the old DIP days could be done with the “chip-clips” used for hardware debugging.

The trick comes from the “In-Circuit Emulator” (ICE) systems used with all 8- and 16-bit CPU chips.

You put the chip-clip on the CPU and put it into Halt or Reset. This makes the CPU pins go tristate or inactive; you then just manipulate the chip-select and address logic, either via the CPU chip-clip or via a chip-clip on the RAM chip, and read it out…

Back in the 1980s I worked out a way to make the RAM content effectively encrypted via a stream cipher, such that each byte was re-encrypted immediately after a read as an “atomic sequence” on CPUs that supported it (see the 8086 instruction set), or by disabling interrupts, performing the read+write, and then re-enabling interrupts. One or two “magic values” that evolved were kept in the CPU registers.
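A toy model of that trick in Python (the SHA-256 counter-mode keystream is an assumption for illustration, not the cipher actually used, and the register-resident “magic value” is modelled as an instance field):

```python
import hashlib


class EvolvingRAM:
    """Toy model of re-encrypt-on-read: every byte in 'RAM' is stored
    encrypted, and each read re-encrypts it under the next keystream
    byte, so a memory dump never sees stable ciphertext.  The secret
    lives only in `self._state` (standing in for CPU registers)."""

    def __init__(self, size: int, secret: bytes):
        self._state = secret            # register-resident secret
        self._ks_index = [0] * size     # per-cell keystream position
        self._ram = bytearray(size)     # holds only ciphertext
        for addr in range(size):
            self._ram[addr] = 0 ^ self._ks_byte(addr, 0)

    def _ks_byte(self, addr: int, i: int) -> int:
        """Keystream byte i for cell addr (toy: hash of state|addr|i)."""
        h = hashlib.sha256(
            self._state + bytes([addr & 0xFF]) + i.to_bytes(4, "big")
        )
        return h.digest()[0]

    def read(self, addr: int) -> int:
        i = self._ks_index[addr]
        plain = self._ram[addr] ^ self._ks_byte(addr, i)
        # the "atomic sequence": immediately re-encrypt under the
        # next keystream byte before returning the plaintext
        self._ks_index[addr] = i + 1
        self._ram[addr] = plain ^ self._ks_byte(addr, i + 1)
        return plain

    def write(self, addr: int, value: int) -> None:
        self._ks_index[addr] += 1
        self._ram[addr] = value ^ self._ks_byte(addr, self._ks_index[addr])
```

An attacker dumping `_ram` between two reads sees different ciphertext each time, yet without `_state` cannot tell whether the stored value even changed.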

I later went on to use values in different chips that were also re-encrypted. The difference was that you had to decrypt both values, combine them, and decrypt the result to get the actual value.

With block ciphers based on Feistel rounds you can “kill the key” after making the individual “round keys”, and use those as an encrypted array that evolves.

This led me on to develop some hybrid ciphers where the round keys came from a stream cipher and were continuously changing.
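A minimal sketch of the “kill the key” idea, assuming a toy hash-based Feistel F-function (not any real cipher): expand the master key into round keys, zeroise the master, and keep only the round-key array.

```python
import hashlib


def derive_round_keys(master: bytearray, rounds: int = 16) -> list:
    """Expand a master key into per-round Feistel keys, then wipe the
    master ('kill the key') so only the round-key array survives."""
    rks = [
        hashlib.sha256(bytes(master) + bytes([r])).digest()[:8]
        for r in range(rounds)
    ]
    for i in range(len(master)):   # best-effort zeroisation
        master[i] = 0
    return rks


def _feistel_round(left: int, right: int, rk: bytes):
    # Toy F-function: 32-bit truncated hash of the right half and key
    f = int.from_bytes(
        hashlib.sha256(right.to_bytes(4, "big") + rk).digest()[:4], "big"
    )
    return right, left ^ f


def encrypt_block(block: int, rks: list) -> int:
    left, right = block >> 32, block & 0xFFFFFFFF
    for rk in rks:
        left, right = _feistel_round(left, right, rk)
    return (left << 32) | right


def decrypt_block(block: int, rks: list) -> int:
    left, right = block >> 32, block & 0xFFFFFFFF
    for rk in reversed(rks):       # undo rounds in reverse order
        right, left = _feistel_round(right, left, rk)
    return (left << 32) | right
```

In the hybrid variant described above, the `rks` array would itself be kept encrypted and continuously refreshed from a stream cipher rather than sitting in memory as fixed plaintext.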

It’s not entirely secure; if the attacker can find the “magic values” kept in the CPU it’s game over. But it ups the work factor considerably, and was enough for the likes of “set top boxes” for cable and satellite TV.

In one case we got Motorola to change its chip design (microcode) such that it had “write-only registers” as seen from the chip pins. That is, you could write values into them but you could not read the values back; the ALU within the CPU, however, could use the values for certain logical or arithmetic operations.

The history of such tricks went up in complexity as resources became available. In the 1990s you started to see chips where the external buses were encrypted, such that all data in RAM was encrypted and, likewise with “Harvard architecture”, the code in ROM as well. The advent of internal Flash ROM meant that encryption keys could, like serial numbers, be unique to each CPU programmed.

A company I worked for in the broadcast industry developed some interesting designs of audio processors. For “stock control” reasons the hardware was the same across the whole range; what differed was the software. To stop the software being “copied over” from one unit to another, each had a unique serial number that acted as a “primary key” in a factory database holding the “encryption key” for each unit, along with what the customer had “purchased”. This was linked to a server that allowed patches and updates to be uniquely generated for each unit shipped.

It fell to me to develop both the factory system and the encryption algorithm and mode used in the units (written in assembler for two unrelated but communicating microcontrollers).
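A hedged sketch of such a per-unit scheme in Python. The comment says the keys were stored per-serial in a factory database; deriving each unit’s key from a factory master with HMAC is one assumed way to sketch the idea, and the XOR “cipher” here is a toy stand-in for the real algorithm.

```python
import hashlib
import hmac


def unit_key(factory_master: bytes, serial: str) -> bytes:
    """Derive a per-unit key from the unit's serial number.
    (Hypothetical: one way to give every shipped unit a unique key
    without storing each one explicitly.)"""
    return hmac.new(factory_master, serial.encode(), hashlib.sha256).digest()


def package_update(factory_master: bytes, serial: str, patch: bytes) -> bytes:
    """Encrypt a patch so only the unit with this serial can apply it.
    Toy XOR stream keyed per-unit -- illustration, not a real cipher;
    the same call decrypts, since XOR is symmetric."""
    key = unit_key(factory_master, serial)
    ks = b""
    counter = 0
    while len(ks) < len(patch):
        ks += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(p ^ k for p, k in zip(patch, ks))
```

Because each update is bound to one serial number, a patch lifted from one unit is useless on any other, which is the “can’t copy the software over” property described above.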

To this day I’m still surprised at how smoothly it all worked from day one without a problem (proving that if you put the work in up front, you don’t have to patch downstream 😉).

Tim September 30, 2023 12:06 PM

The problem with Signal’s position is that it basically abandons the people of the UK. They have done some work implementing anti-censorship features to help people in oppressive regimes still access the service, but even getting the app in those places is tough due to centralized app stores.

Signal has to get over this whole system architecture of requiring a phone as the master on the account; it makes it incredibly difficult for people in China, the UK, or wherever to get on the service.

Signal desktop needs to be AS functional as the mobile version and it needs to be usable without access to a phone.
