XZ Utils Backdoor

The cybersecurity world got really lucky last week. An intentionally placed backdoor in XZ Utils, an open-source compression utility, was pretty much accidentally discovered by a Microsoft engineer—weeks before it would have been incorporated into both Debian and Red Hat Linux. From ArsTechnica:

Malicious code added to XZ Utils versions 5.6.0 and 5.6.1 modified the way the software functions. The backdoor manipulated sshd, the executable file used to make remote SSH connections. Anyone in possession of a predetermined encryption key could stash any code of their choice in an SSH login certificate, upload it, and execute it on the backdoored device. No one has actually seen code uploaded, so it’s not known what code the attacker planned to run. In theory, the code could allow for just about anything, including stealing encryption keys or installing malware.

It was an incredibly complex backdoor. Installing it was a multi-year process that seems to have involved social engineering the lone unpaid engineer in charge of the utility. More from ArsTechnica:

In 2021, someone with the username JiaT75 made their first known commit to an open source project. In retrospect, the change to the libarchive project is suspicious, because it replaced the safe_fprint function with a variant that has long been recognized as less secure. No one noticed at the time.

The following year, JiaT75 submitted a patch over the XZ Utils mailing list, and, almost immediately, a never-before-seen participant named Jigar Kumar joined the discussion and argued that Lasse Collin, the longtime maintainer of XZ Utils, hadn’t been updating the software often or fast enough. Kumar, with the support of Dennis Ens and several other people who had never had a presence on the list, pressured Collin to bring on an additional developer to maintain the project.

There's a lot more. The sophistication of both the exploit and the process to get it into the software project screams nation-state operation. It's reminiscent of SolarWinds, although (1) it would have been much, much worse, and (2) we got really, really lucky.

I simply don’t believe this was the only attempt to slip a backdoor into a critical piece of Internet software, either closed source or open source. Given how lucky we were to detect this one, I believe this kind of operation has been successful in the past. We simply have to stop building our critical national infrastructure on top of random software libraries managed by lone unpaid distracted—or worse—individuals.

Posted on April 2, 2024 at 2:50 PM • 70 Comments

Comments

Erdem Memisyazici April 2, 2024 4:01 PM

I think a better solution is to teach people what mobbing is with tools and examples. How did the developer in question not recognize that these people popping out of nowhere are all of a sudden making demands to control his software?

It’s likely that he thought they were respectful individuals making good points online as they should have been.

Real Questions:

Did GitHub warn him? That is a large company; how did they not know state actors were on their platform, and why not lead them to a honeypot instead?

Could he not sell his access himself without state actor influence just like a large company could sell “we accidentallied that code” sort of commits?

Why I Ask Them:

The article says, "… managed by lone unpaid distracted—or worse—individuals." But companies that hardcoded admin credentials, like Cisco etc., were neither unpaid, distracted, nor worse … individuals.

He has no way of knowing if those are hacked accounts or real people, no access to their IPs, or any sort of confirmation, but GitHub does have some of that data.

Linux distros are stakeholders here, but companies like Red Hat check the code before pulling in the latest version of anything. You hope your distro does the same, or good luck going over every commit yourself.

That being said, most people know that everything doesn't need a daily update to be secure (e.g. a module with the function 'fibonacci' is probably going to stay the same for the next millennium).

This goes for hardware too. Most 90s computers weren’t hit by Spectre and Meltdown.

BCS April 2, 2024 4:18 PM

Another consideration: from what I've seen, a big chunk of the exploit was implemented in terms of the build and test system. The fact that it's possible for a build step of the final library to access test inputs seems like a major open door. While there may have been good reasons to use glorified shell scripts run in a shared directory as a build process 20-40 years ago, the average cell phone has the compute, storage and OS feature set to do a lot better for isolated and hermetic builds today.

While I won't claim it's perfect, or say more than that it would be my preference, tools like Bazel (or any of its competitors with similar priorities) would have made this sort of injection much harder to pull off unnoticed, and would also make it much simpler to search for other such attacks once an injection vector is identified.

vas pup April 2, 2024 4:26 PM

See with the eyes of autistic people
https://www.dw.com/en/autism-awareness-see-with-the-eyes-of-autistic-people/a-61322416

This is extract:
“Autistic people train themselves to reduce as many external impulses as
possible. That enables them to concentrate on a single thing — and ignore everything else.

Little things can disrupt that concentration massively. It might only be the flickering of a lamp. This hyper-focus, or absolute concentration on a single thing, can have negative effects. It can be that autistic people stop receiving feelings of hunger or thirst, regardless of whether it is cold or hot outside.

Every effort is concentrated on the one thing. It can be solving complicated
mathematical problems or developing computer software. Autistic people have
become very popular employees for IT companies because they won’t be disturbed by anything or anyone.”

Sound familiar? Read the whole article if interested, before it gets sanitized for no reason.

Nick Alcock April 2, 2024 5:15 PM

@vas pup, it sounds familiar (well, it should), but what on earth does it have to do with the xz backdoor? Nothing.

There is one constraint on the injected code — it must be quite short, because it has to fit in the presented public key’s n value (!) after de-chacha20-ification. But that’s more than enough to start, say, netcat, or an encrypted socat, and then it really is game over…

smb April 2, 2024 5:31 PM

Not a big deal, but the Wired article linked at the end is just a reprint of the Ars article, minus the illustrations.

Hales April 2, 2024 6:25 PM

I like Ariadne Space’s take on this:

There is no “supply chain” in reality, but there is an effort by corporations which consume software from the commons to pretend there is one in order to shift the obligations related to ingesting third-party code away from themselves and to the original authors and maintainers of the code they are using.

That doesn’t completely cover all situations here — a distro like Debian or Arch isn’t a corporate paid product — but I think it still highlights an interesting point. Expecting the developer of a small project to up their game is ineffective (they don’t have the resources) and counterproductive (they’ll probably think twice about publishing anything ever again).

The resources (workers & money) are with companies, organisations and governments that use these projects. These same groups are often creatively bankrupt (they couldn’t make & publish a small tool that everyone uses without turning it to garbage) and need the existence of small developers to make up for that.

I don't think there is a practical solution other than getting big orgs to watch their upstreams more closely.

Andy April 2, 2024 8:40 PM

I agree. My money is on Mikeysoft orchestrating the whole thing to discredit FOSS and promote their own garbage, for-sure-backdoored software. What other software should we be building on? Oh, why not Mikeysoft? It's developed by many paid but far less qualified developers.

Dave April 2, 2024 9:31 PM

I think a better solution is to teach people what mobbing is with tools and examples. How did the developer in question not recognize that these people popping out of nowhere are all of a sudden making demands to control his software?

He didn’t do that, he just offered to help. And if you’ve got one non-paid, half-burned-out volunteer running everything then it’d be quite easy to come in and help, and then slowly help out a bit more over time.

I don't know how many things I use, or have used in the past, that are run by one guy, or perhaps two, who burned out and slowly faded away while the project went moribund. In fact GitHub and similar are full of such things.

echo April 2, 2024 10:58 PM

I posted on this days ago but it kept getting snarled in moderation so I stopped bothering.

What happened was an attack on an individual's mental health, then pressure to obtain ownership, which was then exploited. This is why human rights and equality matter. This little thing called economics theory alongside not excluding quality of life and mental health. Society matters. Security without any of these is not security.

Talking about "enshittification" tends to jam people's heads up on a meme they can bikeshed, or getting lost in the weeds of the details of the technical exploit opens the door to navel gazing.

Linus Tarballs April 3, 2024 12:35 AM

The main issue here, explained in CVE-2024-3094, is that the xz-utils release included a binary blob disguised as a test file; no source for this file is provided. That's a red flag, although it is common practice in the Linux kernel…
The major issue with Unix operating systems lies in their library dependencies, which require developers to inform users about the specific libraries they need to use. In contrast, Windows provides a certain degree of isolation by allowing applications to rely on their own private libraries. However, it becomes the responsibility of developers to ensure the safety of these libraries, although they still need to trust Windows system libraries and the OS itself. Sharing libraries in an operating system is like sharing a room in your home with your neighbors. I would appreciate a Unix operating system that adopts a more private-libraries philosophy.

piotras April 3, 2024 4:17 AM

@Linus Tarballs

"Private" dynamically loaded libraries as practiced on Windows are just an unnecessarily complicated replacement for statically linked libraries. Any project can choose to link some or all of their dependencies statically. Or, if they really like distributing all these .so's along, they can add their own loader code and not rely on the standard linker/loader (some projects do this to implement "modules" or "plugins"). This way a project can have full control over what code it includes.

But then they have to maintain security patches and updates of these libraries themselves. And the product will consume more memory and disk space. Such an approach has both pros and cons. Shared libraries were invented for a reason; sometimes that reason may not apply, e.g. for specific security-sensitive stuff.
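
To make the trade-off concrete, here is a minimal sketch (assuming a gcc toolchain, and using liblzma purely as an example dependency) of building the same program against a shared library versus statically:

    /* linkdemo.c -- a small illustration of the shared-vs-static choice.
     *
     * Dynamic (the default):
     *     gcc linkdemo.c -llzma -o linkdemo
     *     ldd ./linkdemo        # liblzma.so shows up as a runtime dependency,
     *                           # so a system-wide liblzma update (or backdoor)
     *                           # changes this program's behaviour too.
     *
     * Static (where a static liblzma.a and libc are installed):
     *     gcc -static linkdemo.c -llzma -o linkdemo-static
     *                           # the library code is copied into the binary;
     *                           # it no longer follows system updates,
     *                           # for better or worse.
     */
    #include <lzma.h>
    #include <stdio.h>

    int main(void)
    {
        /* Call into the library so the linker actually keeps the dependency. */
        printf("liblzma version: %s\n", lzma_version_string());
        return 0;
    }

Which of the two is safer depends entirely on who you trust to ship the next update, which is exactly the trade-off described above.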

Scott N Kurland April 3, 2024 6:50 AM

The economics response to positive externalities is to subsidize them, right?

Nick Alcock April 3, 2024 7:04 AM

@Cybershow, you are being excessively paranoid. The thing about free software and open source is that the people are what matter: the companies are just a source of funds, really, and at most will tell the people doing the work which way to focus their efforts next. Andres has worked on PostgreSQL while being paid by multiple companies and has not perceptibly been influenced by any of them.

So, no, this is not some sort of MS-sponsored false flag. Among other things, MS’s non-Windows-side developers are in the same big “Linux hacker hiring pool” as the rest of us — unless they also hired straight out of the NSA (and kept those people very tightly segregated from the rest of their developer pool) any hint of the sort of nastiness you imply would leak straight out at conferences with no perceptible delay. Occam’s razor says that the easiest way to perpetrate an attack like this that requires spook-level coordination and secrecy is to be a spook already, rather than hiring spooks to torpedo your competition.

And how would it benefit them? MS is one of the largest Linux users out there, and the majority of their profit these days comes from Azure (which is largely Linux, like all the big clouds). Why on earth would MS want to kneecap themselves? And why do it in such a complex fashion, with an attack that requires a specific private key to launch, when it would be far more terrifying to just have it left open so everyone is vulnerable to everyone? Again, this sort of nobody-but-us-gets-to-benefit is exactly the modus operandi of any number of security services, and not the modus operandi of either commercial organizations or blackhat groups (who are nowhere near this patient, and would in any case likely rather use a reprogrammable key scheme so they could resell parts of their network of compromised machines to whoever they wanted).

Nick Alcock April 3, 2024 7:11 AM

@Linus Tarballs, it's hard to figure out how to provide source code for test files of corrupted data. I've committed things like this in the past (well, not backdoors, obviously, but things like this was claimed to be): some customer thing causes a crash so you check it in unchanged. This is obviously not going to be as common in free software, because you want it to be license-compatible and you'd like to be able to regenerate it; for corrupted files you could probably get away with compressing using the compressor you just built and then corrupting the result, but how do you test the thing's function on known-good files? You can't compress those with the compressor you just built, since that's the very thing you're testing…

… but I suppose checking in binaries for valid compressed files is safer, because any curious party could just decompress them and see perfectly well what they contained. You’d need to add substantial padding in the file format where malicious content could be kept in order to ship such things in that case.

(But of course xz-utils’s file format has silly amounts of padding, so that would probably be just about doable. The padding is widely scattered so extracting the bad content without it being obvious would be a bugger though.)
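
On the test-file question, one common compromise (sketched below from memory of liblzma's single-call buffer API, so treat the exact calls as an assumption) is to generate the test data at test time with the just-built library and check the round trip, rather than checking in an opaque blob:

    /* roundtrip.c -- compress a known plain-text buffer with the freshly
     * built library, decompress it again, and require an exact match.
     * Build (assuming liblzma headers are installed):
     *     gcc roundtrip.c -llzma -o roundtrip */
    #include <lzma.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        const uint8_t original[] = "known-good plain text used as the test vector";
        uint8_t compressed[1024];
        uint8_t restored[1024];
        size_t comp_pos = 0, in_pos = 0, out_pos = 0;
        uint64_t memlimit = UINT64_MAX;

        if (lzma_easy_buffer_encode(LZMA_PRESET_DEFAULT, LZMA_CHECK_CRC64, NULL,
                                    original, sizeof original,
                                    compressed, &comp_pos, sizeof compressed) != LZMA_OK)
            return 1;

        if (lzma_stream_buffer_decode(&memlimit, 0, NULL,
                                      compressed, &in_pos, comp_pos,
                                      restored, &out_pos, sizeof restored) != LZMA_OK)
            return 1;

        /* The round trip must reproduce the input byte for byte. */
        if (out_pos != sizeof original || memcmp(original, restored, out_pos) != 0) {
            puts("FAIL: round trip mismatch");
            return 1;
        }
        puts("OK: round trip matches");
        return 0;
    }

It doesn't give you a fixed reference file to diff against, but every byte of the test input is then reproducible from source, which is the property the checked-in binary blob lacked.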

wiredog April 3, 2024 7:40 AM

“I simply don’t believe this was the only attempt to slip a backdoor into a critical piece of Internet software”
Yeah, no one believes that. And there is certainly more than one nation state doing this. This particular attack seems (given names, logs, and timestamps) to be from China, but it's not hard to spoof those things, either. And anyway, China is only one of the countries that is doing this. Any competent security service would be grossly negligent not to be doing this.

This sort of supply chain attack probably has been going on for decades. Just assume that, unless you’re air-gapped*, you’re vulnerable.

  * Yes, yes, side-channels/tempest/etc. Mitigate for those, too.

Clive Robinson April 3, 2024 9:34 AM

@ Linus Tarballs, ALL,

Re : Why libraries?

The major issue with Unix operating systems lies in their library dependencies, which require developers to inform users about the specific libraries they need to use. In contrast, Windows provides a certain degree of isolation by allowing applications to rely on their own private libraries.

The thing is, mostly libraries are bad news these days, and most certainly don't do what was originally intended.

I can remember when RAM was expensive. I paid the equivalent of $30 USD for 256 bits, not bytes, when I first purchased memory. But a short while before, memory cost over $1 USD/byte… And back then the US national expenditure was down below a billion dollars…

So as people say these days, "Do the math…" The PDP-8 was originally entirely transistor based, with logic gates on plug-in cards, and that was "normal". For most computers of the time, the actual "design" was put together via the wire-wrapped backplane.

Memory was expensive, so much of what we call "system libraries" was built into the OS and was just one or two KBytes of what we might call CISC code. Which got expanded out by "micro-code" in the CPU, that in turn got expanded out to the "Register Transfer Language" (RTL) which actually controlled the logic lines.

*nix came about as a spin-off of ideas from a larger government-funded project called Multics. Part of what became clear was that software was the road forward, but it cost too much to produce and to resource.

Thus the choice was to make programmers more efficient and likewise the use of RAM.

So we got not just system libraries but high level languages with shared libraries that would stay loaded in memory and shared by multiple programs. Eventually this turned into the notion of “code reuse”.

Back in the 1990's I was explaining the danger of this by use of the diamond diagram, and how everything that was shared was a danger to all code above it. My main concern back then was "driver code" and what later became known as "shims". Eventually, in what was called the noughties, this century's first decade, we got shims appearing, and we got other issues to do with system libraries.

Why do you think we’ve got “Address space layout randomisation”(ASLR) and similar?

As we close down vulnerabilities at the bottom of the diamond the attackers are moving up it and the computing stack.

The other aspect that people do not consider with "code reuse" is the "Kitchen Sink" issue. It's especially bad in FOSS code, which after a moment's thought you will realise is the logical way things had to go. With it comes the increase in complexity and thus vulnerability.

But also a linked library is “all or nothing”. You might only use one tiny thing, yet have to load a metric 5h1t&ton of garbage into memory, and worse in many cases “initialise it”. So not only an incredible waste of RAM but CPU cycles as well. So these days rather than,

“Save resources by sharing”

We waste resources in just about every way possible…

It’s one of the main reasons I stopped writing “public domain” software for approximately open hardware, I could see back last century what a disaster it was turning into and I wanted no part of it.

The other two reasons,

Firstly no bugger wanted to pay, yet secondly they would make heavy-handed hints that if you did not do what they wanted then you would be made to pay…

Telling them they had two choices,

1, Pay to have me be nice.
2, Do it themselves.

Apparently that made me "a bad person", and when I got that crap from one of Bill Gates' immediate right-hand men, I think you can guess what my reply was…

It appears others are still learning the lesson, and the result will be FOSS becomes steadily more problematic and a broken liability.

Grima Squeakersen April 3, 2024 9:40 AM

cybershow asked: “The proximity of Microsoft to this accidental discovery leaves me with
questions and concerns. Am I the only one?”

That was my very first thought before even reading the rest of the text.

Grima Squeakersen April 3, 2024 9:52 AM

Hales said: “Expecting the developer of a small project to up their game is ineffective…”

Agree. This should really not be an unexpected outcome of the free / FOSS ecosystem. That may have been a somewhat valid and workable metaphor a decade or so ago, when quasi-systems software was reasonably simple, but to expect truly robust coding and testing, let alone effective long-term maintenance, on products running in today's very complex infrastructures, from a developer who is (or was) coding as a hobby to please himself and/or impress others, with no remuneration, is imo highly unrealistic. Unfortunately the use of "free" modules by for-profit enterprises is a practice that is not going to end soon, or easily.

Morley April 3, 2024 10:13 AM

Let’s “secret shopper” our own projects. Like when IT sends fake phishing messages.

Morley April 3, 2024 10:17 AM

“A thousand eyeballs” let it through and also caught it. So, insufficient but helpful.

echo April 3, 2024 10:45 AM

Hales said: “Expecting the developer of a small project to up their game is ineffective…”

Agree. This should really not be an unexpected outcome of the free / FOSS ecosystem.

Neo-liberalism. The rest is victim blaming…

Lars April 3, 2024 11:37 AM

Who is Jia Tan? For all we know, it could be Lasse’s neighbor, if they really wanted to mess with him (and some of the early pressure on Lasse seems that way). Move into the house next door.

Winter April 3, 2024 1:20 PM

@Grima Squeakersen

but to expect truly robust coding and testing, let alone effective long-term maintenance, on products running in today's very complex infrastructures, from a developer who is (or was) coding as a hobby to please himself and/or impress others, with no remuneration, is imo highly unrealistic.

That is not only realistic, it is a description of the state of software.

The amount of software in use is in the trillions of lines of code range. GitHub sees some half a trillion lines added per year, and some 200 billion deleted. By far most of the millions of modules in NPM (or PyPI) have one or two maintainers. It is not different elsewhere.

The world of software is FLOSS, and the majority of dependencies are packages with only a few maintainers. There is simply not enough money in the industry to do it otherwise.

Another way of looking at it: xz was not a project in "active" development. It did not need new features or brilliant code. Who wants to work on such a project? Who wants to invest money in it? There is enough work-of-love for a few part-time developers, not more. The same holds for millions of other projects.

alyson April 3, 2024 2:47 PM

@Linus Tarballs,

Static linking works just fine on Linux. Better than dynamic, according to some people, because it avoids “dependency hell”; the kernel developers maintain ABI compatibility with ancient binaries, and will fix bugs (even, in one case, when it was an old virus that broke). I understand the BSD people do not, however, which makes static linking impractical there.

As far as I know, dynamic linking of shipped libraries works fine on all Unix platforms; a “launcher” script that sets LD_LIBRARY_PATH should be sufficient.

The isolation we really need is libraries being isolated from each other, and from the process using them. Right now, unless one goes to great lengths to put them in subprocesses (and I’m a bit surprised OpenSSH didn’t), every library has access to the entire process state. CHERI is seeking to change that; it’s not compatible with existing hardware, but maybe allowing libraries to be run via emulation (including security) would be an interesting research direction.
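
For what it's worth, a rough sketch of the subprocess idea (illustrative only, not OpenSSH's actual design): run the untrusted library call in a forked child and accept its output only over a pipe, so a compromised library is confined to a throwaway process rather than sharing the caller's memory:

    /* isolate.c -- confine an untrusted library routine to a child process.
     * risky_transform() stands in for any third-party library call. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    static void risky_transform(const char *in, char *out, size_t outlen)
    {
        /* Placeholder for a call into an untrusted library. */
        snprintf(out, outlen, "transformed:%s", in);
    }

    int main(void)
    {
        int fd[2];
        if (pipe(fd) != 0) { perror("pipe"); return 1; }

        pid_t pid = fork();
        if (pid < 0) { perror("fork"); return 1; }

        if (pid == 0) {                      /* child: do the untrusted work */
            close(fd[0]);
            char buf[256];
            risky_transform("input-data", buf, sizeof buf);
            if (write(fd[1], buf, strlen(buf)) < 0) _exit(1);
            _exit(0);
        }

        close(fd[1]);                        /* parent: only reads the result */
        char result[256] = {0};
        ssize_t n = read(fd[0], result, sizeof result - 1);
        waitpid(pid, NULL, 0);
        printf("got %zd bytes: %s\n", n, result);
        return 0;
    }

A real design would also drop privileges and sandbox the child; the point here is only that the library no longer shares the caller's address space.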

cybershow April 3, 2024 5:23 PM

@ Nick Alcock

Hey Nick, I do appreciate the compliment, but you are too kind, I am
not sure it is possible to ever be too paranoid in this
business 🙂 In my tradition we call it radical scepticism.

You’ve inferred and extrapolated a lot from my open question. Since I
have no theory of Microsoft’s involvement, and assume Andres Freund is
not only an honourable chap but something of a hero, what remains is
the rather important question you raise; Cui bono?

And how would it benefit them?

Let me preface any further remarks with the fact that I am vehemently
anti-Microsoft (and indeed “Big Tech”), lest there be any confusion 🙂
I’ve consistently written that I believe Microsoft have set back
computing decades with their poor quality, and in particular that
they’ve done grave harms to computer security. Those are my opinions
others may agree or disagree with.

Grounded more in fact, we observe that Microsoft are multiple
convicted criminal monopolists and serial offenders who have been
fined by almost every nation on Earth and found morally wanting. One
can also quite objectively observe Microsoft always benefit when
common, open, interoperable standards are damaged, and indeed have
gone to extraordinary lengths to sabotage them.

So as I say, this leaves me with questions. Not a
theory. Certainly not a proof. But what philosopher Rick Roderick
would call an “elegant suspicion”. And indeed, I think, a well
justified worry when criminals are in the vicinity of things I value.

Now as you say, there are much better, simpler explanations.
Nonetheless, and regardless of any whataboutism around "spooks", I am
predisposed to distrust Microsoft a priori qua convicted
criminal monopolists. Let’s call that a well justified prejudice.

Regardless then the perhaps ridiculous accusation of whether and how
Microsoft caused this issue, the question of how could
Microsoft benefit from it is a separate, good and worthy one I
am pleased you ask.

The story of the backdoor so far is two-fold. It’s a technically great
hack one has to admire, with undetectable RCE in the auth phase of the
most used critical protocol. Hats-off!

But it’s also a story of sinister social engineering. A dark night. A
lonely and isolated maintainer. Some well meaning visitors drop by “to
help”…

What we’re left talking about is the very nature of open source
development, of supply chains and trust models. Perhaps a long-overdue
conversation, no?

But who have positioned themselves “to help”?

Who have replaced the entire pre-2010 ecosystem of individual and
autonomous development with a single GitHub?

Who might we expect to soon come riding in on a white stallion with
“solutions” to the vulnerability of FOSS supply chains? To protect the
lonesome, unpaid, overworked and socially unskilled FOSS maintainer?

most respectfully.

ResearcherZero April 4, 2024 1:11 AM

6,000 code changes and no trail left behind is a lot of work for one person.

There are suggestions that it could be the work of APT29, given clues left behind that suggest the working hours during which these changes were made. Whoever it was, it was a well-planned and tight operation with very good coding and obfuscation. Very hard to tell if bluffs are double bluffs. It may have some hallmarks of previous ops run by the SVR.

There are other groups that have run supply chain attacks; few are in this league. It is still early days though. It will take quite a long time to investigate and discover. The many aspects and techniques of this operation are more interesting than who is responsible.

ResearcherZero April 4, 2024 1:18 AM

@cybershow

If Microsoft wants to let someone in the back door, all it has to do is not patch the hole.

ResearcherZero April 4, 2024 1:59 AM

This operation was conducted with military precision. Eastern European or Middle Eastern time zone? Preparation and insertion took place between 2021 and February 2024.

It also has some similarities with past supply chain attacks and techniques used by the specific actors in those, and other intrusion operations they took part in.

“This was not a one-off attack by the SVR. This is a broader global-listening infrastructure and framework,” he says, “and the Orion platform was just one piece of that. There were absolutely other companies involved.” He says, however, that he doesn’t know specifics.

Some have suggested the government wants to avoid a deep assessment of the campaign because it could expose industry and government failures in preventing the attack or detecting it earlier.

‘https://www.wired.com/story/the-untold-story-of-solarwinds-the-boldest-supply-chain-hack-ever/

“Federal agencies do not have the visibility into their networks to effectively detect data exfiltration attempts and respond to cybersecurity incidents,” the report states bluntly.

Perhaps most troubling of all: In 38 percent of government cybersecurity incidents, the relevant agency never identifies the “attack vector,” meaning it never learns how a hacker perpetrated an attack. “That’s definitely problematic,” says Chris Wysopal, CTO of the software auditing firm Veracode. “The whole key of incident response is understanding what happened. If you can’t plug the hole the attacker is just going to come back in again.”

https://www.wired.com/story/federal-government-cybersecurity-bleak/

ResearcherZero April 4, 2024 4:09 AM

Test ELF binaries to check for potential implants:

‘https://xz.fail

Anyone in possession of a predetermined encryption key could stash any code of their choice in an SSH login certificate, upload it, and execute it on the backdoored device.

I've used OpenSSH for connecting to devices using a terminal, for flashing firmware, and for many other purposes. You could use it to load a custom ROM onto a phone, for example.

So you might install it to accomplish a fairly quick and simple or temporary task and in the process inadvertently get burned. OpenSSH is used for many other purposes. You could get a payload onto devices that many might then imagine would remain secure afterwards.

“OpenSSH, the most popular sshd implementation, doesn’t link the liblzma library, but Debian and many other Linux distributions add a patch to link sshd to systemd, a program that loads a variety of services during the system bootup. Systemd, in turn, links to liblzma, and this allows xz Utils to exert control over sshd.”

“IFUNC, a mechanism in glibc that allows for indirect function calls, is used to perform runtime hooking/redirection of OpenSSH’s authentication routines.”
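
For readers unfamiliar with the mechanism named in that quote, here is a minimal, hedged illustration of a GNU IFUNC (illustrative only, not the backdoor's code): the resolver runs while the dynamic linker binds the symbol, before main(), and whatever function pointer it returns is what every caller ends up invoking, which is what makes it such a convenient hook:

    /* ifunc_demo.c -- build on a glibc system with: gcc ifunc_demo.c -o ifunc_demo */
    #include <stdio.h>

    static int checksum_basic(const char *s)
    {
        int c = 0;
        while (*s) c += (unsigned char)*s++;
        return c;
    }

    /* The resolver is called by the dynamic linker at symbol-binding time,
     * before main(). A legitimate resolver picks an implementation based on
     * CPU features; a hostile one can return whatever it likes. */
    static int (*resolve_checksum(void))(const char *)
    {
        return checksum_basic;
    }

    /* Callers of checksum() are bound to whatever the resolver returned. */
    int checksum(const char *s) __attribute__((ifunc("resolve_checksum")));

    int main(void)
    {
        printf("checksum: %d\n", checksum("hello"));
        return 0;
    }

In the reported attack the hooking happened inside liblzma as sshd started up, per the quotes above; the sketch only shows why resolver code runs so early and with so much leverage.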

spiral April 4, 2024 4:23 AM

@wiredog “This particular attack seems (given names, logs, and timestamps) to be from China”

Actually the fact that the username (Jia) is a Chinese name suggests to me that the culprit is not China. He/she/they would obviously want to be as misleading as possible about the source of the attack.

Winter April 4, 2024 5:08 AM

@cybershow

Regardless then the perhaps ridiculous accusation of whether and how
Microsoft caused this issue, the question of how could
Microsoft benefit from it is a separate, good and worthy one I
am pleased you ask.

Microsoft ships Linux as part of WSL. The targeted Linux distributions are the main deployments on Azure. Azure generated $45B of revenue (23%) [1]. That is more than Office or Windows. Azure is the biggest growth market for MS. AFAIK, MS has nothing available to replace Linux.

This means that anything that damages Linux will damage Azure and hence, MS’ bottom line. I find your “attribution” rather unrealistic.

[1] 2022 ‘https://www.kamilfranek.com/microsoft-revenue-breakdown/

Winter April 4, 2024 5:12 AM

@spiral

Actually the fact that the username (Jia) is a Chinese name suggests to me that the culprit is not China.

Possible. But it is also possible that Lasse Collin has met him, either face to face or in online video. Then a Chinese name might be necessary to stay in character.

Gert-Jan April 4, 2024 6:52 AM

We simply have to stop building our critical national infrastructure on top of random software libraries managed by lone unpaid distracted—or worse—individuals.

Yes, there is an issue here. But such a knee-jerk reaction misses the point. The question is, how can we guarantee a particular level of quality and security?

This is incredibly hard, if you ask me. Having a big corporation behind it does not provide any guarantees. And a particular project of a lone developer might have higher quality. It all depends.

If you think about it, the amount of trust we place in the software ecosystems is massive. But it only seems based on experience. And just like any other statistic, this doesn’t say anything about any particular individual piece.

cybershow April 4, 2024 8:15 AM

@Winter

There is no attribution. Not even one with scare quotes. I
deliberately and clearly avoided any. It is others who filled in those
gaps, which says more about how they think than I. Indeed, you just
quoted me calling it “ridiculous”.

However, you continue to miss my point.

Microsoft stand to gain from widespread loss of trust in the FOSS
development chain, regardless of any impact on specific technologies they
make use of.

Reducing everything to business and money is simplistic and
shortsighted. There is much more at stake around power and long term
strategy in the computer security world than that.

Clive Robinson April 4, 2024 9:18 AM

@ ResearcherZero, ALL,

Re : Time's arrow flies but one way.

With regards,

“Perhaps most troubling of all: In 38 percent of government cybersecurity incidents, the relevant agency never identifies the “attack vector,” meaning it never learns how a hacker perpetrated an attack.”

If you think about it, this is to be expected with non-"low hanging fruit" cyber attacks. Also some percentage, let's call it 15% if not a lot more, is going to be incorrectly identified as well.

Why?

Well first off it is the nature of nearly all organisations, be they groups or individuals not to record the present let alone the past.

Thus the actual attack goes "unrecorded", and it is mostly on circumstantial evidence at best that such attack "attributions" are made.

It’s now fairly well established that APT type attacks leave more than one door or window open so they can quietly return if they get detected and removed.

So it is to be expected that they will find a way in, and quietly look for another one or two ways in, then do nothing for a while.

Then come back via one of the other ways and use that to put in place another backdoor. It’s that backdoor that they then use in anger.

As a defender you find the APT presence and go vulnerability hunting. You find the placed backdoor and close it. Now if you are a “hinky thinker” you might assume correctly that “that was too easy” and go digging a little harder and find the other way and close that as well.

But this leaves the APT hacker not just the original way in but other ways known to them as well.

But even if you have recorded every byte that crossed the perimeter just how far back are you going to go looking?

As I keep pointing out, there is only one way to not lose at this game, and that is, as said back in 1983 at the end of the film War Games,

“A strange game. The only winning move is not to play.”

And the way to do that is not to provide access of any kind to an attacker. That is,

“If they can not reach it they can not attack it.”

Back in the early days they talked of "air gaps", but even back then, in the late 1970's, engineers like myself were working on using light and sound to link "Personal Digital Assistants" (PDAs) and similar to 8-bit and early personal computers, and those in turn to "mainframes" and the like. Even European broadcasters were sending out digital files that you could record on cassette and play back as sound into a computer.

So an “air gap” as they say “Does not cut the mustard” thus you have to use proper “Energy-Gaps” and correctly designed “Gap-Bridging”.

Whilst it was manageable for a personal two-computer system using non-smart PCs from two decades or more ago… the more computers you add, or the more users you have, and the more modern the hardware, the harder the problem becomes.

I’ve built solutions that work, but I’m not a normal user, and I’ve good reason to be cautious. Most users however care not a jot about security and will find any and every way they can to get around it to make work easier and play faster.

Clive Robinson April 4, 2024 11:55 AM

@ Winter, spiral, ALL,

Re : Front man for the group

“Then a Chinese name might be necessary to stay in character.”

As others have noted, either the individual was a genius workaholic turning out a "hack-n-hour"[1] or a frontman for a very much larger group.

Though not an “actor” they would need to be quite technically competent to a high level to sustain any direct interaction without drawing suspicion.

[1] Apparently 6000 changes in three years have been found. In theory we only work just under 2000 hours a year, so "hack-n-hour" it would be.

But I suspect that an analysis would show if more than one set of changes was made at any one time. If that were found to be the case, the "group v genius" question would tip very definitely toward group.

Winter April 4, 2024 12:59 PM

@cybershow

It is others who filled in those
gaps, which says more about how they think than I.

The British have a saying for that “nudge nudge, wink wink”.[1]

[1] ‘https://m.youtube.com/watch?v=STTL-jOrnDQ

Clive Robinson April 4, 2024 2:24 PM

@ Gert-Jan, ALL,

Re : Some things can not be done.

“The question is, how can we guarantee a particular level of quality and security?”

"We can not" is the short but honest answer.

Less formally if you think about it our approach to just about everything is from the top of the heap down. We put down foundations then build up, each step is actually on-top and so a small pile becomes given time a heap.

A heap, if made of rock and with careful design, might last five thousand years or more. Or, if it is tailings/slag just dumped without consideration because it's seen as "just waste", it lasts until it slides down like an avalanche or, worse, a tsunami of doom sweeping away all before it.

Working rock takes both skill and time; shoveling slag, well, up to a point anyone can do it, with next to no training or skill at all.

It’s the same with software, but management and marketing want fast, big, and cheap… so we get one slag heap after another.

But as I’ve said a few times in the past,

“Security is a quality process”

And like all quality processes,

“It needs management buy in at the highest level, and should be in place before the project is thought of let alone be the pre-specification wish-list thought up.”

Even then, basic information theory tells us it can not be shown to be secure…

Because to “process” information it has to be “communicated”.

Claude Shannon proved that for information to be transmitted there has to be "redundancy" in the resultant communications channel.

Gus Simmons proved that where there was a channel with redundancy then another channel could be created within it. Importantly this “side channel” could be made not just covert but impossible for an observer to show existed.

From that alone you can see it can not be secure.

I could go on and bring in work from Gödel from nearly a hundred years ago that pre-dates the work of Church and Turing that in effect gives further evidence, but there’s not enough space to go through it[1]. If you want to try you first have to get your head around the implications of the “Axiom of choice”(AoC) and Cantor’s Diagonal Argument both fundamental to set theory and both Gödel and Turing proofs.

But from a simpler perspective take a “black Box view” but with a slight difference…

There are two sets of inputs and two sets of outputs.

You as the observer can only see one set of outputs, and as a tester can only see and manipulate one set of inputs. Your task is to show that the set of outputs you observe is generated only by the set of inputs you control and some internal function that has both state and feedback, and is not in any way affected by the other inputs you can neither control nor observe.

[1] I’ve four hard back books on Gödel’s work and two on Turing’s in my dead tree cave, they are all hard work to read let alone get your head around…

echo April 4, 2024 4:01 PM

@Clive

And this is why you need governance standards and human rights and equality in the loop. There are studies and papers on governance and equality, and on how management will self-select clones if there is a lack of incentive and pushback. There's also a fair chunk of stuff on corporate culture and work loading.

Nobody will be reading up on systems theory let alone Gödel, or spend ten seconds wondering how to apply it. They're useful tools but don't say that much really. By focusing on technical security and information you're not articulating a robust case for a multi-domain model, which is the point of weakness you complain about.

Getting there but you have a way to go…

ResearcherZero April 5, 2024 2:09 AM

@Clive, @echo

Document everything and ensure you have evidence to back up any formal claim.
Governance standards and human rights are far more difficult in practice.

Security is a people problem. But without good tools and process – people are useless.

Human rights and standards account for very little when people often won’t play a part in the solution. Humans are stubborn, easily emotionally manipulated, often selfish, uncaring without regard for the impact or effect on others. It does not matter if it is a formal setting with clear protocols and a code of ethics, or a public setting where we rely on individuals to report transgressions and maliciousness. Trust is only paper thin.

Often people do not know how to report something because they never have the courage to.
In other cases the reporting process is overly cumbersome and instead acts as a deterrent.

The constant refrain is that, “I am but one individual. What can I do? It’s not my responsibility. They probably did it to themselves. Where there is smoke – there is fire.”

“Why do you care what happens to somebody else? You should mind your own bee’s wax.”

Variations which are repeated in medical facilities, outside courtrooms, by police officers attending a scene, by politicians behind closed doors and at conventions not open to the public, within the offices of businesses and other organisations, and most importantly, within people’s homes. Often complaints are not passed on, not acted upon or destroyed.

Then, even if someone is willing to act responsibly, they are often alone. Left to confront a long and occasionally endless number of obstacles and obstructions, often put in place by groups with significantly greater resources and experiences at being massive, smelly turds.

So clearly everything has to be documented, filmed, recorded and scrutinised when making a complaint, as people can simply not be trusted to act in an open and transparent manner.

People require a little incentive, exposure or encouragement in order to be truthful. And that is just to get institutions to behave when filing a complaint, regarding the actions of those responsible. Often followed by denial and self-serving rebuttal by the accused.

“This figure shows almost zero detection from VirusTotal for recompiled implanted versions from one of the common Linux distributions. This demonstrates outdated methods such as byte string matching and blacklisting known bad file hashes are of limited use when checking for particular code properties.”

Fortunately there is an improved tool using newer methods.

‘https://www.binarly.io/blog/xz-utils-supply-chain-puzzle-binarly-ships-free-scanner-for-cve-2024-3094-backdoor

Bruce Schneier April 5, 2024 6:00 PM

@echo:

"What happened was an attack on an individual's mental health, then pressure to obtain ownership, which was then exploited. This is why human rights and equality matter. This little thing called economics theory alongside not excluding quality of life and mental health. Society matters. Security without any of these is not security."

I am with you. Here’s another take on a big-picture solution: https://widerweb.org/@trevorflowers/112215137141805964

Clive Robinson April 5, 2024 8:25 PM

@ Bruce, ALL,

Re : Universal Basic Income (UBI)

UBI exists already in a form which is called “Basic Tax Allowance”(BTA)

However there are two problems that need to be thought about.

The first is that tax is not uniform: due to political kickbacks and the like, the more income you have, be it earned or unearned, the less tax you pay. This is made worse by offshore corporations used as tax-free financial instruments.

The second is what has already been seen around Silicon Valley. As peoples income rises, others artificially raise prices to “gouge” what they see as their “rightful entitlement” and hide it away behind a faux “supply and demand” argument. In most cases they acquire as much of a resource as they can and then create an artificial supply reduction thus faux price rise. In turn a great number do the same in what is vaguely politely called “rent seeking”. Thus “fiscal wealth” rather than “asset wealth” becomes immaterial as faux-inflation takes over and the numbers go around faster than those on a taxi down a mountain.

The question is how to stop these two issues.

On the plus side, back in the 1980's in the UK "unemployment benefit" had real value, and many people with passion got an opportunity they would not have otherwise had. Culturally the UK was very much the better for it, and the UK "Exchequer" also did well out of it for many years thereafter.

Unfortunately, due to certain politicians who very mistakenly thought the UK could survive as a "service sector supplier" rather than an "industrial production supplier", we did the equivalent of handing a teenager a quart of whiskey and the sports car keys. We allowed the self-entitled psychopaths to take over, and it's not hard to see the supposed "hidden hand of the market" doing "smash and grab" everywhere it can. Worse, they bribe legislators to make the acts that should be crimes legal…

You only have to look at the legislator bribes Google pays over and in return have been more or less left alone in the UK and US.

The problem we have is that UBI will work for the majority, but unfortunately the minority will find ways to ruin it for their benefit.

Less than a day ago I linked to a story about fraud in the EU carried out in Italy. The people behind the fraud are the same “self entitled” types you find behind technology bubbles. The difference is for some reason the “pump and dump” behind those technology bubbles are “legal” even though to ordinary mortals it is just another “fraud” and those behind it should be doing 120 or more behind bars.

Thus the problem would appear to be legislators holding their hands out or actually doing a “shake down” thus acting as “enablers” to crime for which ordinary citizens end up paying endlessly.

How we solve this very real “security” issue is a tough one, and something Ross Anderson was trying to work towards in a number of ways.

It’s something we should all think more about.

German Engineer April 5, 2024 11:34 PM

Well, some people call it paranoid, but I would say: don’t weaken security, just because you don’t see right now that your decision would weaken security.

This was not very concrete, so more on the xz issue: why replace a safe_fprintf() with a fprintf(), just because someone asked you to do it and you just can't see at the moment (even after investing 5 or 15 minutes) that this would create a problem?

Well, why did someone write a safe print function, and also even use it in the code?
Just pure paranoia? Maybe the person that wrote that code (implementing the function and also using it in the same place that you are looking at when you review a code change) had some more knowledge/experience/insight into the problem than you? Does this weigh less than your good intentions to be inclusive toward changes from unknown people? Wouldn't it be mean to say no?
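
To make the point concrete, a minimal sketch (not libarchive's actual code) of what such a "safe" print wrapper typically guards against: untrusted strings such as filenames can embed terminal escape sequences, and a plain fprintf()/fputs() passes them straight through to the terminal:

    /* safe_print.c -- escape non-printable bytes before they reach a terminal. */
    #include <ctype.h>
    #include <stdio.h>

    static void safe_fputs(const char *s, FILE *out)
    {
        for (const unsigned char *p = (const unsigned char *)s; *p; p++) {
            if (isprint(*p))
                fputc(*p, out);
            else
                fprintf(out, "\\x%02x", *p);   /* show ESC etc. as \x1b instead */
        }
    }

    int main(void)
    {
        /* An attacker-controlled "filename" carrying an ANSI escape sequence;
         * printing it raw would send ESC [ 2 J (clear screen) to the terminal. */
        const char *name = "evil\x1b[2Jfile";
        safe_fputs(name, stdout);    /* prints evil\x1b[2Jfile literally */
        fputc('\n', stdout);
        return 0;
    }

Replacing such a wrapper with a bare fprintf() looks like a harmless cleanup in review, which is exactly why the fence was there in the first place.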

Well, the term Chesterton's Fence is a thing you might want to consider.

https://fs.blog/chestertons-fence/

Clive Robinson April 6, 2024 6:58 AM

@ German Engineer, ALL,

Re : What is a fence's reason to be.

“Well, why did someone write a safe print function, and also even use it in the code?”

That question mostly lacks answers in the software industry.

I was trained as both a “researcher” and “engineer” in a number of disciplines before the software industry as we see it existed.

In all those disciplines two fundamental rules were always impressed on people,

1, If it’s not written down it never happened.

2, Those who do not learn from history are condemned to forever relive it.

The second of these is attributed to the philosopher and writer George Santayana, and his original writing was,

“Those who cannot remember the past are condemned to repeat it.”

But how do you “remember the past” if you have not lived it?

Well that is what the first of the two fundamental rules is all about.

Engineers were encouraged to always carry two note books with them,

1, A “log book”
2, A “Diary”

For recording “what is past, and what will be”.

It’s something that is said “all successful people do” and for good reason.

Perhaps the most important part of research thus design thus engineering is not just knowing something happened but why. We call it “the scientific method”, by which we fundamentally learn all there is to know about the world around us. And how we derive “The laws of nature”.

https://en.m.wikipedia.org/wiki/Scientific_method

In the scientific method you will see six steps, all but one of which require written records.

As part of “engineering” every thing you do is either “a project” or a part of one as “a project within a project”. Whilst not a “Turtles all the way down” it is very certainly recursive.

All projects carry “documentation” as a necessary requirement and information is abstracted from the log book and diary to do this.

One of the most important documents is the list of problems that were come across and their attributes,

1, why they were problems
2, how they were recognised
3, how they were diagnosed
4, how they were mitigated
5, how they affected other parts of the project.

This formal documentation goes by several names but informally is “The History File”.

One thing that is often lacking in the software industry is project “documentation” and especially the “History Files”.

So from the perspective of the first fundamental rule nothing ever went wrong (which we know is impossible), and so from the second fundamental rule nobody ever learns…

Which fairly adequately describes the software industry.

The proof of which, if needed, is that the same mistake or a slight variation thereof is made over and over…

It does not matter if it is Chesterton’s or Santayana’s observation that gets quoted, the fundamental cause is,

“Not learning / wanting to learn”

From past issues and knowledge.

Something I’m known to bang on about from time to time 😉

echo April 6, 2024 8:49 AM

Perhaps the most important part of research thus design thus engineering is not just knowing something happened but why. We call it “the scientific method”, by which we fundamentally learn all there is to know about the world around us. And how we derive “The laws of nature”.

Anyone going anywhere near anything political, or corporate, or involving DEI needs to keep a log, and where appropriate complete copies of all relevant policy and documentation and commentary, audio or audio visual recordings whether covert or overt, and in some cases snapshots of entire websites.

In fact I just read a peer-reviewed document this morning which is relevant to work done 30 years ago. It doesn't tell anyone anything they didn't know at the time. What it does do is provide a piece of paper frustrated subject matter experts can wave in the face of managers and politicians and even journalists, if they don't get brain fade, assuming of course they even understand what the paper says without being spoonfed.

Winter April 6, 2024 9:18 AM

@echo

In fact I just read a peer reviewed document this morning which is relevant to work done 30 years ago.

Now you make us curious. What paper? And why do you hide it from us?

Clive Robinson April 6, 2024 11:04 AM

@ Winter,

Remember the Doh-gnarled trumps on with any old nonsense, that comes into the void above the lizard brain, pretending it is the beholdent truth when it’s entirely fictitious.

In practice such words hold less water than the bucket discussed with “Dear Liza”[1]

And those who fail to learn from it or similar get outed, and thus just end up looking really silly, evasive or both over and over…

[1] It's a nursery rhyme song that originates from the 1690's, if not earlier, as found in the "Little Mountain Songbook" (Berglieder büchlein) collection of German folk songs. It's an example of songs and rhymes by which children are taught by indoctrination of rote learning through organised play,
https://www.songsforteaching.com/folk/theresaholeinthebucket.php

Only some can never figure out what it’s all about, thus go through life making the same old mistakes over and over.

44 52 4D CO+2 April 6, 2024 6:20 PM

It’s been a minor annoyance, but I finally looked up how to read Mastodon posts without enabling javascript or installing an application.

Just add /embed to the end of the URL, like:

https://widerweb.org/@trevorflowers/112215137141805964/embed

I also just learned about HTMX, which sounds very promising. Browser makers have had 30 years to figure out a better solution than JavaScript; it may be time to revisit that again.

I won't expect it from <marque>Appooglesoft</marque> because they have different incentives, but maybe from somewhere else.

acommenter April 8, 2024 11:46 AM

I recommend all the pundits here go read Ken Thompson's 1984 Turing Award paper, "Reflections on Trusting Trust". Then explain how free-for-all development is better than development that is more tightly controlled and tested. I wonder if anyone really checks the GNU compiler commits. I hope so.

Winter April 8, 2024 12:43 PM

@acommenter

I recommend all the pundits here go read Ken Thompson’s 1984 Turing Award paper – “Reflections on Trusting Trust”.

I recommend anyone pointing to Ken Thompson’s paper go read David A Wheeler’s page on “Fully Countering Trusting Trust through Diverse Double-Compiling”
‘https://dwheeler.com/trusting-trust/

I wonder if anyone really checks the GNU compiler commits. I hope so.

What about all the proprietary compilers? Who checks their commits?

JonKnowsNothing April 8, 2024 12:50 PM

@acommenter, All

re: does anyone really check?

Generically no one checks anything.

They may check their own stuff but maybe not. The rate of bug creation in systems indicates few do basic sanity checks on their own code.

The same problem happens at Wikipedia which is a public edited encyclopedia. There are lots and lots of rules on how to do edits and update topics but in practical application it’s a mixed bag.

A noob editor just wants to put up information and is not aware of all the minutiae of edit formatting. The majority of articles are maintained by a cadre of long-term wiki-editors and/or their government-aligned versions.

  • Governments are highly aware of what is in an article and will do a lot of edit-wars to keep the information "light and friendly", which is hard to do on intense topics like Gitmo. Gitmo 20 years ago was so hot that edits were locked down and the USA version was very favorable to the official CIA view.

Long time editors are only familiar with some of the information on a given topic. They don’t necessarily know A from Z or any nuances. Sometimes all they are running are formatting bots marking too many reference links or topics with too little or too much detail. They are not looking at the content-information.

Lots of articles have orphan editors. Perhaps someone who watched it and updated it for years and years and then stopped. Death happens along with boredom of fighting government bots over details.

  • Canada starlight tours / a recent similar situation where police took people’s phones and boots, then dropped them far away where they were left to walk miles in the snow barefoot.

So, content testing follows a similar pattern. When you have the possibility of N+ people modifying your code base, at some point you just go “Wha???”.

Consider:

  • How often do programmers clear every single warning or error message on the compiler run?

Yep about that often…

Winter April 8, 2024 12:54 PM

@JonKnowsNothing

How often do programmers clear every single warning or error message on the compiler run?

I have heard from developers that this is a pointless waste of time. My own limited experience supports this. Errors, maybe, but warnings?

JonKnowsNothing April 8, 2024 1:32 PM

@Winter, All

re: this is a pointless waste of time

Precisely the problem. If you don’t look you don’t know. It’s a form of trust. You trust that the warning is nothing. You trust that the error is meaningless. You make a value judgement: that the 10min of your time to clear them or verify they are not a problem, is worth the risk of a critical failure.

The only thing that gets checked is a compiler halt.

No worries, nearly everyone does the same.

Nobody knows, nobody sees

Nobody knows but me

“Long Black Veil” is a 1959 country ballad, written by Danny Dill and Marijohn Wilkin and originally recorded by Lefty Frizzell.

Winter April 8, 2024 4:21 PM

@JonKnowsNothing

You trust that the warning is nothing. You trust that the error is meaningless. You make a value judgement: that the 10min of your time to clear them or verify they are not a problem, is worth the risk of a critical failure.

I don’t know about your experiences, but I always got endless reams of warnings that were just that, warnings about possible problems. So many warnings that I could spend not 10 minutes, but all my time, fully 100%, chasing them. And they all tended to end in false alarms.

A compiler error, on the other hand, tends to end in an aborted compilation.

Also, the real bugs, or backdoors, do not raise compiler warnings or errors. So even if you could clean up all warnings and errors, you still would not have a secure program.
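A hypothetical sketch of what I mean (the names are invented): in my experience gcc -Wall -Wextra says nothing at all about the following, yet it overflows the buffer by one byte whenever the name is exactly 16 characters long:

    /* clean_but_broken.c -- gcc -Wall -Wextra -c clean_but_broken.c */
    #include <string.h>

    void save_name(const char *name)
    {
        char buf[16];
        /* off-by-one: the test should be strlen(name) < sizeof(buf),
           because strcpy also writes the terminating NUL byte */
        if (strlen(name) <= sizeof(buf)) {
            strcpy(buf, name);
        }
    }

A perfectly quiet compile, and still not a secure program.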

JonKnowsNothing April 8, 2024 7:21 PM

@Winter, All

re: they tended to be false alarms / still not a secure program

Firstly, chasing errors and warnings, or not, never guarantees a secure program and I never said it did.

Your trust is just that. Your excuse is just that. You justify doing or not doing something. This is on YOU. You make the call.

  • well nobody else does it…
  • it’s a waste of time
  • it’s false alarms (one might ask how you know if you didn’t check but probably not worth asking)
  • cannot resolve the condition (one might ask why not but probably not worth asking)

In many science and math problems, the outliers are just as important as the “good” data points. Perhaps more useful, because they show “something”, and that something is not supposed to be there. There are loads of companies, science projects and software programs that hide “what is not supposed to be there”.

There are simple warnings/errors and complex warnings/errors. Nearly all have a remedy, but some do not. Avoiding fixing what can be fixed only increases the chance that a more serious condition is hiding, unnoticed, in the stream of error messages.

Messages like those behind web pages, with all their layers of untraceable runtime libraries, are almost impossible to resolve, because at the level where you see them you have no control at all over what sits below.

Compiler messages tend to be more personal. You cannot fix a stack problem if you don’t have access to the stack, but I would suggest you not ignore a warning about it.

Winter April 8, 2024 8:25 PM

@JonKnowsNothing

it’s false alarms (one might ask how you know if you didn’t check but probably not worth asking)

I am starting to doubt whether you ever have tried the -Wall flag in a compiler. Or contemplated the reason for having levels of compiler warnings.

In short, I suspect you have the wrong idea about the intended function of compiler warnings.

Compiler warnings remind me of the warnings on food labels about the possibility that there might be traces of nuts or other allergenic substances in the package. Or of those infamous California Prop 65 warnings about the cancer risk of roasted coffee beans.

These warnings are not wrong, but only helpful in specific circumstances (or not at all).
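For what it is worth, a small made-up illustration of those levels: gcc says nothing about the file below by default, flags the unused local under -Wall, and only adds the unused-parameter complaint under -Wextra.

    /* warning_levels.c
       gcc -c warning_levels.c                -> silent
       gcc -Wall -c warning_levels.c          -> warns about the unused 'tmp'
       gcc -Wall -Wextra -c warning_levels.c  -> also warns about the unused 'flags' */
    int setup(int flags)
    {
        int tmp = 42;   /* -Wunused-variable, enabled by -Wall */
        return 0;
    }

Which tier you compile at, and which of those messages you act on, depends entirely on your situation.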

JonKnowsNothing April 8, 2024 9:11 PM

@Winter, All

re: I am starting to doubt whether you ever … / real purpose of

Is THIS the best you can do? Really? Claiming you know about what you don’t even check for?

I know what I check for. I know what I fix. I know other folk’s code too. I know because they are the source of the errors. I know because their errors end up causing a whole lot of other problems. Problems someone else gets to fix.

I am starting to doubt you ever programmed anything at all that required rigor.

Winter April 8, 2024 10:03 PM

@JonKnowsNothing

I know because they are the source of the errors.

Warnings are not errors. Trying to resolve all compiler warnings is as useful as trying to buy food without label warnings. You should heed the warnings that are relevant for your situation and let the other warnings be.

I just looked at Red Hat’s explanation of gcc warnings [1]. I really cannot find any suggestion that “resolving” all warnings is the recommendation.

[1] https://developers.redhat.com/blog/2019/03/13/understanding-gcc-warnings
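One middle ground, a sketch of my own rather than anything prescribed in that article: promote only the warning classes you consider relevant into hard errors, for example gcc -Wall -Werror=format, so the format mismatch below stops the build while unrelated warnings stay warnings.

    /* selective_werror.c -- gcc -Wall -Werror=format -c selective_werror.c */
    #include <stdio.h>

    void report(long count)
    {
        /* -Wformat: "%d" expects int but receives long; under
           -Werror=format this aborts the build instead of scrolling past */
        printf("count = %d\n", count);
    }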

JonKnowsNothing April 9, 2024 12:25 AM

@Winter, All

Re: fix what you can … or don’t

In complex systems, a warning can become an error. It might be only a warning for your segment, but once that segment is incorporated into a larger system it becomes an error: an error of syntax, type, class or enumeration, a logic flaw, a big-endian/little-endian mix-up, or myriad other forms of FKUPs (see the small sketch after the reference below).

Please do continue in your belief of sloppy work. You are not alone if that makes you feel better.

===

https://en.wikipedia.org/wiki/Endianness

  • In computing, endianness is the order in which bytes within a word of digital data are transmitted over a data communication medium or addressed (by rising addresses) in computer memory, counting only byte significance compared to earliness.
  • Endianness is primarily expressed as big-endian (BE) or little-endian (LE)
  • The adjective endian has its origin in the writings of 18th century Anglo-Irish writer Jonathan Swift. In the 1726 novel Gulliver’s Travels, he portrays the conflict between sects of Lilliputians divided into those breaking the shell of a boiled egg from the big end or from the little end.
  • By analogy, a CPU may read a digital word big end first, or little end first.
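A small, hypothetical example of why that matters: the code below compiles cleanly under gcc -Wall -Wextra on any platform, yet prints a different first byte on little-endian and big-endian machines, which is exactly the kind of silent assumption that only becomes an “error” when the code moves into a larger or different system.

    /* endian_demo.c -- gcc -Wall -Wextra endian_demo.c && ./a.out */
    #include <stdio.h>

    int main(void)
    {
        unsigned int word = 0x12345678;
        /* the byte at the lowest address: 0x78 on little-endian,
           0x12 on big-endian; no compiler warning either way */
        unsigned char first = *(unsigned char *)&word;
        printf("first byte in memory: 0x%02x (%s-endian)\n",
               (unsigned)first, first == 0x78 ? "little" : "big");
        return 0;
    }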

Winter April 9, 2024 1:02 AM

@JonKnowsNothing

Please do continue in your belief of sloppy work.

I will.

I will also keep eating food containing peanuts and nuts, and drink coffee, with utter disregard of the food warnings on the labels.

JonKnowsNothing April 9, 2024 10:06 AM

@Winter, All

re: I will also keep eating food containing peanuts and nuts, and drink coffee, with utter disregard of the food warnings on the labels.

Better you than me. I’m allergic. Thankful for the labels.

You can have my share. Go ahead.

Winter April 9, 2024 10:34 AM

@JonKnowsNothing

Better you than me. I’m allergic.

My point: warnings are situation-dependent. What is relevant for you might be irrelevant for me, and vice versa.

JonKnowsNothing April 9, 2024 11:04 AM

@Winter, All

re: What is relevant for you might be irrelevant for me

For sure, I am not coming to your house for dinner…

Bon appétit.

Winter April 9, 2024 11:23 AM

@JonKnowsNothing

For sure, I am not coming to your house for dinner…

That is quite an insulting remark.

If I feed a guest, I will make sure to abide by every requirement and wish.

However, it would be utterly foolish to serve vegan, kosher, halal meals every day, avoiding every possible allergen, just because someday a guest might come along who has special requirements.

Clive Robinson April 9, 2024 5:16 PM

@ JonKnowsNothing, Winter,

Sometimes you need to remember the ages old saying,

“One man’s meat, is another man’s poison”.

We all have foods we are less tolerant of than others are; such is the way of life.

When I used to have guests regularly, I would make several dishes for each course and allow the guests to help themselves.

For most of us, rice or potato is fine for the carbs, and so are the likes of carrots and parsnips, especially if roasted or air-fried as game chips.

Yes, making food for all needs can be wasteful, but it does not have to be expensive, and mostly the foods that are not eaten can be given new life as something else (you can use mashed potato instead of flour in many bread and cake recipes).

One trick is to turn what would be a “main” for, say, a vegetarian into a small side for someone who is eating fish or meat.

It’s actually not difficult to add dried fruit, thin-sliced vegetables such as peppers, onions and carrots, and even raw or cooked peas, fresh fruit and the like to a salad, with side sprinkles of cubed cheese, slivers of anchovy, nuts and cured meats for guests to choose from.

The problem I find is that many people these days have a rather strange notion of a salad:

“Not something to enjoy, but punishment for imagined sins, hence great big bowls of ‘bitter leaves’…”

Heck, I’d rather eat nettles, dandelions and burdock, which are generally considered “weeds”.

JonKnowsNothing April 9, 2024 6:06 PM

@Clive, @Winter, All

re: plus ça change

Eons ago, the USA had dueling sisters writing etiquette-advice columns. They were likely the most widely read, and the ones my family always talked about, as we rarely talked about the headlines.

There was a column about going to a restaurant and asking to take the uneaten portion of your meal home for the dog. It often was half of a 24oz steak, so a sizeable hunk of meat.

(It was also before BigAg and BigVet made commercial and vet supplied dog food the only pet food people are told to feed.)

Essentially it was an embarrassment problem: should you or shouldn’t you? The debate lasted a long time.

Eventually it was not so embarrassing, and the wait-person would bring you a bag (sometimes waxed, sometimes not); if they were nice, they would put the items into the bag for you. Otherwise you had to do it yourself, trying to look bourgeois chic in your dress-up clothes while dumping a huge hunk of dripping rare meat into a popcorn-size bag.

Sometime later, the doggie part of the bag disappeared and the real reason emerged as the “take-home box”, sometimes remarked on as “for my lunch tomorrow”. Salads are not easy to manage on the take-home list, but I’ve been known to eat well-wilted greens.

Today’s take-home boxes are often multi-compartment ones, so you can take home part of the entree, the side dishes, and all the bread and breadsticks from the basket too. There were places where servers would slip you another bread basket mid-meal because they knew you would take it home.

  • We learn to ask when we get hungry enough
