Twelve-Year-Old Vulnerability Found in Windows Defender

Researchers found, and Microsoft has patched, a vulnerability in Windows Defender that has been around for twelve years. There is no evidence that anyone has used the vulnerability during that time.

The flaw, discovered by researchers at the security firm SentinelOne, showed up in a driver that Windows Defender — renamed Microsoft Defender last year — uses to delete the invasive files and infrastructure that malware can create. When the driver removes a malicious file, it replaces it with a new, benign one as a sort of placeholder during remediation. But the researchers discovered that the system doesn’t specifically verify that new file. As a result, an attacker could insert strategic system links that direct the driver to overwrite the wrong file or even run malicious code.
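The class of bug described here, an unverified file replacement that a planted link can redirect, is easy to demonstrate. The sketch below is illustrative only; the function and file names are hypothetical and this is not Defender's actual remediation code. It shows how deleting and recreating a file without checking what the path points at lets a symlink redirect the write onto an unrelated file:

```python
import os
import tempfile

# Hypothetical sketch of the vulnerability class, not Microsoft's driver code.

def naive_remediate(path):
    """Delete a 'malicious' file, then drop a benign placeholder in its
    place, without checking what `path` actually points at."""
    os.remove(path)
    with open(path, "w") as f:
        f.write("benign placeholder")

with tempfile.TemporaryDirectory() as d:
    victim = os.path.join(d, "important_system_file")
    with open(victim, "w") as f:
        f.write("critical contents")

    # The attacker plants a link named like malware, pointing at the victim.
    planted = os.path.join(d, "malware.exe")
    os.symlink(victim, planted)

    # Remediation that resolves the link before acting overwrites the
    # victim file instead of removing any malware.
    naive_remediate(os.path.realpath(planted))

    with open(victim) as f:
        clobbered = f.read()

print(clobbered)  # the "important" file now holds only the placeholder
```

The usual defence against this class of bug is to refuse to follow links during remediation (e.g. checking `os.path.islink()` or opening with `O_NOFOLLOW`) and to verify that the placeholder landed where intended.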

It isn’t unusual that vulnerabilities lie around for this long. They can’t be fixed until someone finds them, and people aren’t always looking.

Posted on February 24, 2021 at 6:19 AM • 33 Comments


Nick Alcock February 24, 2021 9:32 AM

“The researchers hypothesize that the bug stayed hidden for so long because the vulnerable driver isn’t stored on a computer’s hard drive full-time, like your printer drivers are. Instead, it sits in a Windows system called a “dynamic-link library,” and Windows Defender only loads it when needed. Once the driver is done working, it gets wiped from the disk again.”

This gibberish doesn’t make me want to believe that the journalists who reported on this have the least idea what they’re talking about. (Something being a DLL does not mean it is transient: almost none are, and most of Windows is implemented as DLLs which necessarily stick around for the life of the system.)

Opening a library dynamically doesn’t mean it’s not on the disk when it’s not loaded! It means it’s not in that process’s address space (and actually Windows can elect to hang on to it even after the application asks to close it, so even that is not always true).

Clive Robinson February 24, 2021 10:41 AM

@ Bruce, ALL,

They can’t be fixed until someone finds them, and people aren’t always looking.

Some people are always looking, but the reality is there are too few eyeballs in a “target rich environment”.

So what gets found tends to be on somebody’s “favourite itch list”…

That is, somebody comes up with an idea, then goes looking for ways to do it.

It’s one of the reasons we get multiple instances in a newish class of vulnerability. Once the class is characterized, finding other instances becomes much, much easier.

Admittedly, some take an industrial approach, whereby they tend to try to shake things out with fuzzers and the like, but the percentage of bugs found that way that are exploitable tends to be low, or low grade.

But some bugs are more interesting and not related to specific code. Some are known to have been discussed since the 1960s, i.e. for five or six decades.

SpaceLifeForm February 24, 2021 1:53 PM

Eyeballs and bug hunting

Why would one suspect the guard you hired to be evil?

Maybe if it was open source, it would have been discovered sooner.

Reverse Engineering takes more time and effort than reading source code.

Me February 24, 2021 2:24 PM


The guards are among the first you should investigate.

This is why many jobs require background checks.

AL February 24, 2021 2:49 PM

I run browsers in “incognito” mode not for privacy, but to prevent writes to disk that then result in interactions between the virus scanner and the browser cache.

The virus scanner runs with too much privilege; there is no point asking for trouble with more disk writes than necessary.

Clive Robinson February 24, 2021 4:12 PM

@ SpaceLifeForm, ALL,

Reverse Engineering takes more time and effort than reading source code.

It does to a certain extent.

However, what can be hidden in higher-level source code, like, say, C, can usually be seen easily in a disassembly printout of the “machine code”.

I’ve always preferred working close to the metal, at one time all the way down to using a “wire wrap tool” as a debugging aid. But sadly I have to admit that compilers these days can beat me on various optimizations without problems, when it was the other way around just a decade or so ago (I’ll leave others to judge if it’s compiler improvement or my degradation with age 😉

The point is, even with some compiler “loop the loops”, the disassembly printout is generally quite readable. Certainly enough to work out likely names for labels and “comments” for the code.

The problem is finding tools that do good disassembly[1] at a price that is not going to make shareholders weep.

[1] Let’s just say I harbour a grudge against certain people who developed a certain framework by which blocks of code get loaded into memory. Let’s just say the grudge stretches to wax effigies and long pins 😉

fed.up February 24, 2021 5:21 PM

Did anyone see this Microsoft hearing yesterday? Thoughts?

(link broken)
ht tps://

notice FE testimony to Congress said put everything in the cloud for safekeeping
then this PaloA video is also very interesting
says the opposite

(link broken)
ht tps://

who do you agree with?

Clive Robinson February 24, 2021 10:41 PM

@ fed.up,

notice FE testimony to Congress said put everything in the cloud for safekeeping
then this PaloA video is also very interesting says the opposite

I’m not going to watch it, because I’ve been through the various arguments in the past.

You will find that the answer selected very, very rarely has anything whatsoever to do with security, and almost everything to do with reducing cost. Such is the sad state of affairs within even the likes of the CIA and other IC and LEA entities.

To see this you can plot what is close to two curves on a graph, one for “In House” one for “In Cloud”. The y-axis being frequency of data set access/usage and the x-axis being size of the data set.

The results I’ve plotted tell you a couple of things. Firstly, that the decision-making was “cost based”, with just one or two outliers. Secondly, that the cloud is being used as a dumping ground for rarely processed data.

This “dumping ground” is oddly not too dissimilar to how the NSA’s Bluffdale facility is assumed by many to work: a kind of “write continuously, read rarely” that once used to be done by “tape farm” archiving.

My data set of points is a small fraction of the number of entities and also in a way “self selecting”… So as they say “Your mileage may vary”.

I suspect other people’s information is likewise covered by “Confidentiality” or “Non Disclosure Agreement” clauses, thus getting the true picture is not going to be easy, “for commercial reasons”.

That said my view has always been,

“Is your data an asset or a liability?”

Far too many people are storing way too much data, and not only is it building up a long-term cost tsunami, the reality is many will not be able to justify the cost of pulling it out of the cloud onto their own secure servers.

That is any financial gain in keeping the data is at best speculative and in the main very questionable. So the reality for many is their mountains of data are actually very costly to store and use, for little gain, so at best not exactly a worthwhile asset.

Now, when you consider the data in terms of “how can it harm me”, few if any appear to consider this aspect, or consider it correctly if they do. So they build massive databases of every scrap of information that comes into, is generated by, or sent out by the organisation, in effect documenting everything good, indifferent, or bad. But… it is estimated that up to 1 in 5 people will do things they know to be morally or ethically wrong, or clearly illegal, for the sake of “quick results”, and likewise the majority will do things that can be viewed as questionable when looked at from a certain perspective. This has been true for centuries, yet we still…

So whilst many attribute to Cardinal Richelieu,

“If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him.”

It turns out it is probably not true. There is no contemporary documentation. However in a memoir written[1] after the Cardinal had passed it was reported that,

“as I have heard his friends tell, [The Cardinal] was in the habit of saying that with two lines of a man’s handwriting, an accusation could be made against the most innocent, because the business can be interpreted in such a way, that one can easily find what one wishes.”

So just “two lines to make an accusation”…

Thus to some a trove of information that some organisations keep, is without doubt a liability waiting to happen, when an opportunist sees advantage in doing so.

Thus the old quip of “One man’s meat, is another man’s poison” can be seen to apply.

Thus my advice to most people is “If you must keep data, then only keep processed data, not raw data, where the processing removes or minimizes liability”.

In most cases the data cannot be processed to remove or even reduce liability, and as it’s probably an asset of little worth, destroying it in an appropriate way is probably best…

But it may well be too late for many. Because it’s probably fair to say,

“What goes into the cloud stays in the cloud beyond your control.[2]”

So my advice, for longer than this century so far, has been “Don’t let your data out of your control”.

You can probably guess where my viewpoint has been since before the cloud was seen as anything other than one of the SaaS’s, or “Storage as a Service”.


[2] If the data traveled across a network like the Internet, or most Telco networks, to get into the cloud, there is a fair chance it has been “hoovered up along the way”, so it is in other people’s databases, very definitely beyond your knowledge let alone your control.

lurker February 25, 2021 12:02 AM


So the reality for many is their mountains of data are actually very costly to store and use, for little gain…

Oh for the good old days when the data was all stored on paper. Then every now and then you’d have a plague of mice, or a flood or a fire, and suddenly there was a whole lot less data to worry about.

Bob Paddock February 25, 2021 12:24 PM


“Reverse Engineering takes more time and effort than reading source code.”

Long ago I had an Avocet 1805, update of 1802, Assembler that produced pretty and correct listings. The .HEX output file created did not match what the source code or the listing showed. Took a bit of time to figure that out.

Some of the higher end software standards for flight grade software require inspection of the generated output for this very reason.

Clive Robinson February 25, 2021 12:56 PM

@ lurker,

Oh for the good old days when the data was all stored on paper. Then every now and then you’d have a plague of mice, or a flood or a fire, and suddenly there was a whole lot less data to worry about.

Very popular with UK Civil Servants at “senior levels”; so much so that the still-famous early-1980s UK TV satirical comedy series “Yes Minister” had it pop up occasionally in the story plots.

The series is actually seen by quite a few as required reading for those wishing to gain an understanding of the “mechanics” before entering politics, as much as, but more enjoyably than, “The Prince”, “The Art of War”, and “1984”.

fed.up February 25, 2021 1:09 PM

@ Clive – thank you!

The big tech companies’ profits depend upon growing data.
That’s diametrically opposed to the best interest of their customers and society.
I agree with you 100%. The biggest risk to every institution is keeping too much data. Outside of science and healthcare, data’s value is on par with bitcoin. It is only valuable until people find out it isn’t.

The more data you have, the more data you have to protect. The more data you have to protect, the harder it is to protect it.

Back to the original post about finding such an old vulnerability. It is not an accident. Software is purposely designed this way.

I now wonder if this SW attack is really some type of ransomware event.

Overreliance on outsourcing – 3rd and 4th parties. Inability to tell the good guys from the bad. Not enough attention to qualifying who gets Privileged Access, which in institutions with mature cybersecurity is never given to anyone offshore nor any contractors.

During COVID the bottom dropped out. Borders closed. Hiring ceased.
Is SolarWinds the revenge of the contractors? Keep us on payroll – or else?

Why is no one asking if this is an insider attack at M or FE?

What’s the Government’s and regulated sector’s plan if this is a M insider? Business Continuity plans should have included this scenario.

M tells everyone to assume they are hacked. But sorry that’s not enough advice.

I certainly was hacked, but it wasn’t an innocuous ‘look see’. They damaged all my Microsoft files (just Microsoft).

They said Authenticator (2FA), Intune (MDM) and their 365 E5 Compliance tools may be compromised.

But what about AD and logs? I thought logs were unmodifiable, but if they are not to be trusted, then doesn’t this imply that the source code wasn’t just “looked at”?

During the Congressional hearing M said that users identities are compromised. But does that mean they turn that identity into privileged access to move around? If it isn’t insider involvement, it doesn’t make sense to me that the attackers assume identities to move around because most users have very limited access rights. Nothing about this makes sense, especially FE’s assertion that 1,000 coders developed this exploit when they haven’t even identified what the exploit is.

The most wicked thing about this hack is that even if Cybersecurity engineers in the private sector notice this is happening, there’s no way for them to report it to the Government without getting fired. And if it is true that identities have been compromised, how does anyone know whether they’ve been compromised?

I’m thinking about sinister scenarios whether funds could be exfiltrated and then accounting records modified to hide that theft.

NIST’s Cybersecurity Framework’s (CSF) first step is “Identify”. There’s no ability to perform any type of cybersecurity protection unless you first identify what systems you have installed, where they are installed, what data they process, who has access to each system/data type, and where the data flows. If M cannot do this internally, then we need new laws. NIST CSF is a fabulous framework in theory, but in practice, if M AD and logs aren’t to be trusted, then Cybersecurity is unachievable at present.


We need to make Cybersecurity achievable. It presently is NOT feasible.

We license doctors, massage therapists and manicurists. But technologists can cause far greater damage to our national security.

The FDA and USDA inspect our medical devices, medicine and food. But technology can cause far greater harm to our health and welfare.

The SEC couldn’t possibly manage Wall St. without FINRA, NYSE and NASDAQ and the OCC, FDIC and FRB examining. But they cannot be expected to assess Cybersecurity too. This is not their wheelhouse.

We need a new Government agency or NATIONAL LAB entirely dedicated to being an independent Cybersecurity Protectorate. US Sarbanes-Oxley law requires that cybersecurity be independent. So it makes no sense for it to be under DHS/CISA, which is affiliated with Federal LEO. That means there will never be information sharing, because of the 4th Amendment. Besides, CISA was attacked in this hack, and they have not produced a single report on ICS since 2016; the research they posted was from outside the government, and usually from outside the USA. Nor are they providing the United States with any meaningful guidance on this hack. We desperately need to acknowledge that there are so few people in this world who get Cybersecurity that for this reason alone it needs to be CENTRALIZED.

This new Cybersecurity Protectorate should perform thorough evaluations of all technology to be sold in the USA. Rip it open, down to the chip level. And just like meat or eggs, it could be graded and qualify for sale to specific Government sector or regulated institutions, SMB or consumers. This would increase entrepreneurship and funding opportunities too for startups with viable tech. It will stop the preponderance of vaporware or dangerous tech. There are so few people in this world that have the capability to analyze technology, yet performing this “Third Party Risk Assessment” is a law in all regulated sectors and Government too. But the private sector and even the Federal Government agencies cannot do this because this is a unicorn skill set. And to be fair I couldn’t analyze the accuracy of a financial statement or safety of a flank steak or pacemaker either. I wouldn’t try to do a CPA’s job so why are they doing ours?

But once you have a dangerous tool like SolarWinds in your environment, no amount of monitoring or security can protect you. If the Federal Government complied with its own regulations, then SolarWinds should NOT have been in its environment, due to its design, its dependence upon M EOL components, and also its finances. Those are the hearings we need to have. No experience is for naught so long as you learn from it.

JonKnowsNothing February 25, 2021 2:21 PM


re: The New Cybersecurity Protectorate

And what do you expect will actually change? What do you expect will happen? Even should your entire list of wants be enacted, what will be different?

At the risk of being redundant… Nearly everything on your list is already required.

We have lots of experience with New Law Enforcement Agencies; War on Drugs, War on Immigration; War on COVID-19; War on Terror (getting a bit old that one).

Have you checked your list to see how that works in real life?

True: we get a few more people wearing black uniforms; carrying out No Knock No Warrants; shots in the dark and spit polished boots on the ground.

True: we get more lock ups; lock downs; lock em ups; don’t ask questions and just shoot procedures.

Security is like Maya… insecurity is treatable.

ht tps://

  • an illusion where things appear to be present but are not what they seem

(url fractured to prevent autorun)

Clive Robinson February 25, 2021 3:16 PM

@ fed.up,

I thought logs were unmodifiable, but if they are not to be trusted, then doesn’t this imply that the source code wasn’t just “looked at”?

You’ve hit on one of my favorite points about the difference between “paper records” and “data records”.

Paper records are such that trying to remove or alter a single record, whilst not impossible, is actually quite difficult, and extremely difficult if somebody decides to have a closer, more forensic look.

Data records, however, are “made to be altered easily”. In some cases, where record and file block sizes align, it is utterly trivial and can be “done in place” on the hard drive with a low-level driver that just flips bits in a block[1]. Thus all the file system bookkeeping like time stamps does not happen, and the less well known forensic technique that looks at the order in which blocks on the disk platters are used fails as well.

However, the idea of “immutable logs” is seductive, even though with a few seconds’ thought you realise that the useful form, “append only”, is not fundamentally possible. It needs the OS and, more importantly, the file system to support it properly, which still does not stop the raw-write issue. One way to make the raw-write issue hard is to use a randomising encrypted file system. Whilst it can be got around, the work involved is way beyond most people’s capabilities.

Which is why “net-logs”, using UDP packets sent to a server over a “write only cable” that is also hardened in both software and hardware, is the way some people go these days. The old way was to send the logs to the reserved / system console, which was a genuine output-to-paper TTY. Back in the 70s and 80s, sending a few thousand form/line feeds interspersed with random characters, to run all the paper out of the TTY, became a standard but annoying trick, which some sysadmins solved using a filter and a tape backup unit that could write just a block at a time.

Yes, I know it makes me sound like one of those suspender-wearing, long-white-bearded guru tropes that certain cartoonists take the Michael out of. But the upshot is that unless you take what looks like extreme care to protect logs, an attacker will blitz them somehow. Which means your other points become valid.

But what is extreme care? If you want the modern way to do it, create a network-based log-server or net-log system, using a second network that only the servers are attached to via a second NIC. Then use an IDS to silently watch packets over the wire; if it senses odd behaviour it can pull various plugs to stop an interactive attacker dead in their tracks and raise all sorts of alarms via pagers or SMS. Not perfect, but then nothing that gets used for real work ever is.
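One widely used building block for tamper-evident (though not truly immutable) logs, complementary to the net-log approach above, is a hash chain: each entry's digest commits to the previous one, so an in-place edit to any earlier record invalidates everything after it. A minimal sketch, assuming SHA-256, with illustrative function names:

```python
import hashlib

GENESIS = "0" * 64  # digest used before any entries exist

def chain_append(log, entry):
    """Append `entry`, binding it to the digest of the previous record."""
    prev = log[-1][1] if log else GENESIS
    digest = hashlib.sha256((prev + entry).encode()).hexdigest()
    log.append((entry, digest))

def chain_verify(log):
    """Recompute every digest; any in-place edit breaks the chain."""
    prev = GENESIS
    for entry, digest in log:
        if hashlib.sha256((prev + entry).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True

log = []
for event in ["09:00 login alice", "09:05 su to root", "09:10 logout"]:
    chain_append(log, event)

intact = chain_verify(log)               # True: chain is self-consistent
log[1] = ("09:05 read mail", log[1][1])  # attacker edits one record in place
tampered = chain_verify(log)             # False: the edit is detectable
```

This only detects alteration: an attacker who can rewrite the whole file can re-forge the chain from scratch, which is exactly why shipping entries (or at least periodic digests) off-host matters.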

[1] Back in the early days of *nix, there were certain file system limitations that positively encouraged certain software companies, like Oracle, to use entire file systems as raw devices. That is, the application completely bypassed the OS file systems and security. If you knew what you were doing, you could get Oracle’s software to do in-place block overwrites with no audit information recorded anywhere… Not sure if you can still do it; I’ve steered well clear of Oracle stuff this century. However, the downside of developing your own drivers for raw devices is that when the original reason for doing it is gone, even long gone, there are always “new reasons” used to keep doing it. Often those new reasons are entirely spurious, to cover up the real reason of “because costs / profit”.

fed.up February 25, 2021 7:10 PM

@ JonKnowsNothing

Cybersecurity is now on the honor system. And there is no honor whatsoever in the system. It has drawn the most dubious of characters, with the exception of Bruce’s followers.

M and the DHS/DoD say “No Trust”. This means we need laws.

@ Clive

I understand you. It took me a while.
The UK is about to pass a SOX equivalent law which mandates unalterable logs. If it is similar to the US, which they say it will be, this includes IT and cybersecurity.

ht tps://

JonKnowsNothing February 25, 2021 8:52 PM


re: we need laws

The problem with laws is that they only apply to some and not others, a select few ignore or circumvent them and the other parts of the globe do not necessarily agree to “our laws”.

In the USA, we cannot even get people to wear a face mask to save the lives of their nearest and dearest and the lives of every stranger they meet.

Getting a bunch of NSA-types to agree not to break the laws of other countries is going to be a tough sell; as in No Sale.

The NSA-types break the laws of their own country on a regular basis. They get some cover from FISC and sundry cop-support laws to get a free pass in court. Provided they even go to court, those zero-days are just too juicy to give up.

Perhaps if you started with the top three: NSA – CIA – FBI? Maybe you could pull their Diplomatic Immunity?

Anne Sacoolas is still hanging on to hers. Now she claims she Was A Working CIA Secret Agent when she ran over the kid.

The laws don’t apply and they don’t apply equally. Passing more Dos and Donts won’t fix the problem.

ht tps://
(url fractured to prevent autorun)

SpaceLifeForm February 25, 2021 10:12 PM

@ fed.up, JonKnowsNothing, Clive, name.withheld.for.obvious.reasons

Attribution is hard.

The cybercrooks do not care about any laws in any jurisdiction.

There is zero need for new laws. Existing laws work fine.

New laws would probably be a mistake.

The task is to catch the cybercrooks, and then prosecute.

As to logging, if the attacker is good, the evidence will not be logged.

We still don’t know who committed this crime. A big reason is that the UK-based Vodafone Group, one of the largest cellular providers in the world, bobbled its handling of some key log files.

These activities had to be kept off all logs, while the software itself had to be invisible to the system administrators conducting routine maintenance activities. The intruders achieved all these objectives.

To simplify software maintenance, the AXE has detailed rules for directly patching software running on its central processor. The AXE’s existing code is structured around independent blocks, or program modules, which are stored in the central processor’s memory. The release being used in 2004 consisted of about 1760 blocks. Each contains a small “correction area,” used whenever software is updated with a patch.

Let’s say you’re patching in code to force the computer to do a new function, Z, in situations where it has been doing a different function, Y. So, for example, where the original software had an instruction, “If X, then do Y” the patched software says, in effect, “If X, then go to the correction area location L.” The software goes to location L and executes the instructions it finds there, that is, Z. In other words, a software patch works by replacing an instruction at the area of the code to be fixed with an instruction that diverts the program to a memory location in the correction area containing the new version of the code.
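That patching scheme can be modelled in a few lines. The sketch below is a toy simulation of the mechanism described above (program blocks, a correction area, and one instruction rewritten into a diversion); all names are invented for illustration:

```python
# Toy model of AXE-style correction-area patching: instead of rewriting a
# block, one instruction is replaced with a diversion into the correction
# area, which holds the new code.

def do_Y(state):           # original behaviour: "If X, then do Y"
    state["result"] = "Y"

def do_Z(state):           # new behaviour the patch installs
    state["result"] = "Z"

block = {"X": do_Y}        # a program block: condition -> action
correction_area = {}       # spare memory reserved for patches

def apply_patch(block, condition, location, new_action):
    """Rewrite 'If X, then do Y' into 'If X, go to location L', where
    correction-area location L now holds the new code Z."""
    correction_area[location] = new_action
    block[condition] = lambda state: correction_area[location](state)

state = {}
block["X"](state)
before = state["result"]           # "Y": original behaviour

apply_patch(block, "X", "L", do_Z)
block["X"](state)
after = state["result"]            # "Z": patched behaviour
```

This indirection also hints at why such a patch is hard to spot: the visible block looks intact apart from a single diverted instruction.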

It’s impossible to overstate the importance of logging. For example, in the 1986 Cuckoo’s Egg intrusion, the wily network administrator, Clifford Stoll, was asked to investigate a 75 U.S. cents accounting error. Stoll spent 10 months looking for the hacker, who had penetrated deep into the networks of Lawrence Livermore National Laboratory, a U.S. nuclear weapons lab in California. Much of that time he spent poring over thousands of log report pages.

fed.up February 26, 2021 6:16 PM

@ JonKnowsNothing

I apologize on behalf of all Americans. Truly. I wonder if the USA was in the Commonwealth would there be any diplomatic immunity? I don’t think so.

@ SpaceLifeForm

If a US corp doesn’t have secure logs, that is jail time. WorldCom was a $129 Billion company in 1999, and it disappeared in a couple of months due to this. Their CEO went to jail for 20 years and just died a few weeks ago. If software logs cannot be trusted and the software manages any regulated data, then the software cannot be sold in the USA.

Clive Robinson February 26, 2021 7:03 PM

@ fed.up, SpaceLifeForm,

If software logs cannot be trusted and the software manages any regulated data then the software cannot be sold in the USA.

Trusted in what way?

As a general rule of thumb, data on hard drives or removable media,

“Are not worth the paper it’s written on”.

You have to have some way of ensuring,

1, Time accuracy.
2, Order accuracy.
3, Data accuracy.
4, Log isn’t a fake construction
5, Log isn’t a substitution.

Doing all of them is not an easy thing without some “interesting mathematics” and some interesting equipment, and even then it is not that reliable.

Take “time accuracy”: obviously any computer’s two internal clocks (tick and wall) can be fritzed with in some way; the mechanism to do so is built in for administration and syncing reasons. So you go for an external clock of some kind and sync to it. But how do you authenticate that it is real? GPS clocks can be fritzed if you know what you are doing, and GPS does not have a reliable authentication mechanism anyway… And what about “replay attacks”?

Similar can be said of the other requirements.

As I point out to people “You are not your cell phone, and your cell phone is not you”. That is there is a very distinct disconnect. I could walk your normal lunch route carrying your phone whilst you were off at some secret rendezvous. I could even send a couple of texts you had told me to send as well.

So if people believe the technology is authenticating, then who was at the secret rendezvous?

Authenticating “data” and the “logs” it appears in is even harder and subject to just as many work arounds.

Whilst there are partially secure ways, most of them are based on mathematical assumptions, or on publication of the data either to “the public” or to an independent but trusted third party. For an overwhelming number of cases such sharing is not just undesirable but potentially illegal.
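The publication option need not expose the data itself: publishing or escrowing only a digest of the log commits to its contents while keeping them private. A minimal sketch, assuming SHA-256 (in practice one would publish a signed digest, or a Merkle root so individual entries can be proven later without revealing the rest):

```python
import hashlib

def log_digest(entries):
    """One digest committing to the whole log; only this need be published."""
    h = hashlib.sha256()
    for entry in entries:
        h.update(entry.encode())
        h.update(b"\n")          # delimiter so adjacent entries cannot merge
    return h.hexdigest()

day_log = ["09:00 login alice", "12:30 payroll run", "17:00 logout"]

published = log_digest(day_log)  # escrowed with a third party at end of day

# Later, an auditor recomputes the digest from the archived log:
matches = log_digest(day_log) == published        # True: log unchanged

# Any after-the-fact edit is detectable, though the digest alone cannot
# say *what* changed, nor prove the original timestamps were honest:
day_log[1] = "12:30 expenses run"
still_matches = log_digest(day_log) == published  # False: edit detected
```

Note that this addresses data accuracy and substitution, but not time or order accuracy on its own; those still need an external, authenticated clock.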

fed.up February 26, 2021 8:49 PM

@ Clive

There was another Congressional Microsoft/FE/SW hearing today.

It was long, but often focused on logs. Apparently M might not keep any despite various US laws. M also charges customers for logs.

Perhaps Congress is reading this blog. One can only hope.

My phone company keeps great logs. I often download my metadata when I need to remind myself of an important conversation or text. I have not tried to query data older than a few years, but I’m told I can if I need to. It probably isn’t a well known feature amongst the general public, but I find it so important that it has kept me tethered to this carrier for decades.

Another topic discussed during the hearing: employee whistleblower laws. M wasn’t supportive, at all.

Rep Katie Porter at the 4:09 mark is very impressive. She gets tech and wants to make important changes especially concerning logs. I like her.

ht tps://

JonKnowsNothing February 27, 2021 12:18 AM

@fed.up @Clive

fed.up: phone company keeps great logs.

4, Log isn’t a fake construction
5, Log isn’t a substitution.

Phone company logs are not reliable. Doesn’t matter what the metadata download shows you, they can be changed and not all changes happen in an audit trail.

Not too long ago, there were proofs of trackless and traceless methods employed to alter secure documents. This included pdfs and all standard documents as well as database documents such as PACER court filings. These techniques are in use by the 3-Letters to alter documents, timestamps and doc logs and other details. The disclosure asserted that these methods are not detectable by data forensics. Subsequent studies confirmed the finding. (ymmv)

Company files are accessible on the front end by company representatives and these interactions are generally logged in an audit trail. On the back end tech support can alter data some of which may be logged to an audit trail and some may not. On the tech side, data manipulation can be done directly by anyone with the correct access levels.(1)

Forgery and substitution have long histories. Anyone who has dealt with ID theft knows the value of a log is next to nil. Securing documents and processes isn’t going well overall. (2)

Just because the ticket said you ran the red light with a photo, date and timestamp and payment demand; did you really?

1, ht tps://

2a, ht tps://

2b, ht tps://

(url fractured to prevent autorun)

traveller February 27, 2021 8:02 PM

I used to travel regularly between right-hand-drive and left-hand-drive countries, and to be honest I’ve lost count of the number of times I have driven on the wrong side of the road. I don’t know what more to say than that it happens.
Country roads are the worst, because they often lack visual clues like parked cars or lane markings to indicate that you’re on the wrong side of the road. Even if there is a giant sign, your brain ignores it, because in your own head you are doing nothing wrong.
I’ve found that this wrong-side-of-the-road thing can be triggered by some need to make a sudden change in your plans: the distraction diverts your attention at a crucial turn, and then suddenly you’re on the wrong side of the road. Even when another car is coming towards you, it never occurs to you that you are the one on the wrong side. I know this all sounds implausible, but I have done it myself on more than one occasion. Fortunately it never resulted in an accident, but one time it came damn close, and I was thanking my lucky stars that there was space to swerve onto the other side of the road; even then it took me a moment to realize that I was the one at fault.

Generally I try not to drive for the first few days after I’ve changed countries. The worst possible combination is driving a left-hand-drive car in a right-hand-drive country (as used to happen with American armed-service people). That, along with a country road, is just asking for problems.

Cassandra March 1, 2021 3:54 AM

@Clive Robinson

Re: Yes Minister and loss of records

It was the floods of 1967

James Hacker: Was 1967 a particularly bad winter?
Sir Humphrey Appleby: No, a marvellous winter. We lost no end of embarrassing files.

Yes Minister: The Skeleton in the Cupboard hxxps://

Cassandra March 1, 2021 4:05 AM

Re: ‘Immutable logs’

As well as print-outs on fan-fold paper, which are reasonably difficult to alter when un-split, some people started using write-once CD-ROMs when those became available, as they take up less space.

Ideally, you use CD-ROM blanks with a factory embedded serial number recorded elsewhere, otherwise it is trivially easy to produce a duplicated version with suitably amended key records. Duplicating a box of fan-fold paper is not so easy, especially if the printer is a daisy-wheel/golf-ball/chain-printer with unique imperfections in the letterforms.

These days, one can use blockchain techniques to render logs difficult to amend or forge.

Clive Robinson March 1, 2021 5:26 AM

@ Cassandra,

These days, one can use blockchain techniques to render logs difficult to amend or forge.

Their “proof” relies on a “public ledger” held as “multiple copies” in “multiple locations” by “multiple disassociated entities”.

Which renders their use in general not “practical” for the hundreds of millions, if not billions, of log files filling up every day.

But the “public” part is a real issue even when using “encrypted data”: it leaks data by the bit-bucket load via “traffic analysis”, amongst other things, which is technically a “no no” under various pieces of legislation.

And that’s all before we talk of the minor technical problem of using electricity by the oil barrel full just to support the “work factor” of the desired security level against attacks.

The blockchain was, and still is, an interesting idea, but it got hyped beyond all credibility by people on the make, and its shortcomings fairly quickly appeared.

Cassandra March 1, 2021 7:53 AM

@Clive Robinson

Re: Blockchain logs.

(Perhaps) I was overcomplicating the issue. Periodically closing the log and placing a cryptographic hash of the logfile just closed in the newly opened next log file makes it difficult to amend closed log-files. In a low-traffic log, you can also record the hash of the previous entry in each new entry. This is the basic, simple principle behind a blockchain, to which you can add the complications of making it public if you wish.

Changing the log-file requires recalculating hashes and rewriting all the files after the change; or having some means of breaking the hash and adding plausible entries to manufacture a hash collision.

This technique allows one to assure the integrity of a log-file, but not reconstruct one that has been damaged: for that you would need error correcting codes applied after the hash.
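A minimal sketch of the scheme described above (the file names and the header format are illustrative, not any standard): the SHA-256 hash of each closed log file is written as the first line of the next one, so amending an old file invalidates every recorded hash after it.

```python
import hashlib

# Sketch of hash-chained log files: when a log is closed, record its
# hash as the first line of the newly opened log. Header format is
# made up for illustration.

def close_and_open(closed_path: str, next_path: str) -> None:
    """Open a new log file whose first line commits to the closed one."""
    with open(closed_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    with open(next_path, "w") as f:
        f.write(f"prev-hash: {digest}\n")

def verify_chain(paths: list[str]) -> bool:
    """Walk the chain: each file's recorded hash must match its predecessor."""
    for prev, cur in zip(paths, paths[1:]):
        with open(prev, "rb") as f:
            expected = hashlib.sha256(f.read()).hexdigest()
        with open(cur, "r") as f:
            recorded = f.readline().strip().removeprefix("prev-hash: ")
        if recorded != expected:
            return False
    return True
```

As the comment notes, this assures integrity of the closed files but cannot reconstruct a damaged one; for that you would need error-correcting codes on top.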


Clive Robinson March 1, 2021 8:46 AM

@ Cassandra,

(Perhaps) I was overcomplicating the issue. Periodically closing the log and placing a cryptographic hash of the logfile just closed in the newly opened next log file makes it difficult to amend closed log-files.

Ahh, a “different beast”, upon which “Merkle trees” ride, and the “blockchain” rides like an evil elf.

But the original idea goes back to the equivalent of a “chain cipher”.

The problem with such a system is that if I know you only have up to block n[i], I can append as many blocks after it as I wish, without you being able to say whether they are genuine or not.

Also, they depend on there being no easy way to find a collision. If I can find a collision, then I can change the contents of the block.

But unless the hashes are “publicly verifiable” on all blocks as they are created, there is no security.

The problem with that is that the blocks of data leak meta-data to a public space. Thus the size of blocks, or the frequency of blocks, gives away information about activity, at the very least.

So we still need a private logging system that is secure. There are potentially some ways to do this, but there is the question not just of QC, which can in theory break such a system, but of what follows on after QC. Humans are creative little critters, just like chipmunks. And just like chipmunks, we have a habit of destroying that which came before in the name of progress.

I remember a time when “experts” were making bold claims about asymmetric crypto… Likewise the irreversibility of “one way functions”, which in a practical form give us the basis for CS-hash functions. If we look back, we see the litter of those high pedestals on which the claims were placed, smashed and broken…

There is no reason to think that such pedestal smashing is going to stop any time soon…

Cassandra March 1, 2021 9:35 AM

@Clive Robinson

I’m all for pedestal smashing. That way we get progress.

As for the chain of log files, one possible technique is to generate a list of random numbers, which you keep secret. Each time you roll over to a new log file, you record the next random number in your list in the new log file.

As long as your adversary does not get a copy of your list, he or she will be unable to predict the next number in the list, so cannot generate false ‘forward logs’.

Please note, I’m not claiming there is a solution for producing immutable logs. I’m not that silly. However, you can make things hard for adversaries, and it is fun to try and keep ahead of the ‘hinky’ techniques used by adversaries.

(And yes, if someone were to get a copy of your list of random numbers without you knowing, they could cause problems, also if they can rewrite it without you knowing. There are things you can do about that, too. Setting tripwires for people who think they know how a system works is also fun.)
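A toy sketch of the rollover-token idea above, assuming the list is generated with a CSPRNG and kept offline by its owner (all names here are illustrative):

```python
import secrets

# Pre-generate a secret list of random tokens and keep it offline.
# At each log rollover, stamp the next unused token into the new log.
# A forger without the list cannot produce plausible "forward logs".

def make_token_list(n: int) -> list[str]:
    """Generate n unpredictable tokens with a CSPRNG."""
    return [secrets.token_hex(16) for _ in range(n)]

class LogRoller:
    def __init__(self, tokens: list[str]):
        self._tokens = iter(tokens)

    def new_log_header(self) -> str:
        # Consume the next secret token for the newly opened log file.
        return f"rollover-token: {next(self._tokens)}"

def verify_headers(headers: list[str], tokens: list[str]) -> bool:
    """The list's keeper checks the logs used their tokens in order."""
    seen = [h.removeprefix("rollover-token: ") for h in headers]
    return seen == tokens[:len(headers)]
```

Unlike a hash chain, the tokens are independent of one another, so leaking one reveals nothing about the rest; the cost is that verification requires access to the secret list itself.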


Clive Robinson March 1, 2021 9:56 AM

@ Cassandra,

one possible technique is to generate a list of random numbers, which you keep secret. Each time you roll over to a new log file, you record the next random number in your list in the new log file.

What you really mean is not “random” but “verifiable, yet unpredictable to others”.

Two such ways exist in most literature:

1, Running hash,

S = H(H(H(H(seed))))

That is you take a secret seed value and just keep hashing it each time you need a new Secret.

2, Crypto counter,

S = Ek(CNT+n)

You have a counter that is loaded with a secret seed, and you encrypt it with a secret key k. Each time you need a new secret, you simply increment the counter.
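The two constructions might be sketched as follows. Python’s standard library ships no block cipher, so HMAC-SHA256 stands in here as the keyed PRF for Ek; with a real cipher you would encrypt the counter block under the secret key k.

```python
import hashlib
import hmac

def running_hash(seed: bytes, n: int) -> bytes:
    """Method 1: S = H(H(...H(seed))), the seed hashed n times."""
    s = seed
    for _ in range(n):
        s = hashlib.sha256(s).digest()
    return s

def crypto_counter(key: bytes, count: int) -> bytes:
    """Method 2: S = PRF_k(CNT + n); HMAC stands in for encryption E_k."""
    return hmac.new(key, count.to_bytes(8, "big"), hashlib.sha256).digest()
```

One caveat of the running hash: anyone holding one value can compute every later one by hashing forward, which is why such chains are usually generated in one direction but consumed in reverse, as in Lamport-style one-time passwords.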

Both of these have issues, and you want to make it more difficult to forge.

One thing suggested in the past is that you take the front-page headline of a well known newspaper and encrypt it under a secret key which is changed daily; you could use either method 1 or 2 to generate the daily key.

Whilst all these methods have some advantages, they also have disadvantages.

However bringing in “verified public daily data” to form the start of a log record has merit, so ideas along those lines are interesting.

Cassandra March 1, 2021 2:46 PM

@Clive Robinson

Actually, I did mean random. They have the benefit of absolute unpredictability. It does generate the problem of how you attest that your list of random numbers is the correct one, which is an intriguing problem in its own right.

However, I have to do ‘other stuff’, so I will thank you for an interesting side discussion.


SpaceLifeForm March 2, 2021 4:47 PM

@ Cassandra, Clive

It does generate the problem of how you attest that your list of random numbers is the correct one, which is an intriguing problem in its own right.

A secret can be kept secret if there are only two who know.

More than two? Not good.

It’s more secret if one of the two is dead, but even that is not true if the secret can be found later on a computer.

I touched on this last Halloween.

