Agree. Someone is trying to conflate a side-channel attack with a timing attack, and attempting to put words in your mouth.

While a timing attack can become a side-channel attack, the converse is not necessarily true.

And, you clearly did not imply so.

Unless you come up with some proof, your claim that under influence of the NSA ‘the reference implementation of AES was deliberately written to allow timing attacks’ is just a classic conspiracy theory, nothing more.

First off, it appears that you are not reading what I wrote, and are then making at best incorrect assumptions.

You will see from what I wrote,

“It comes about when you disassociate “theory” from “implementation” which the NSA did through hoodwinking NIST in the AES contest as I’ve mentioned several times in the past.”

I made no mention of

‘the reference implementation of AES was deliberately written to allow timing attacks’

Which you quoted, presumably in the vain hope that people would assume you were quoting me, which you were not.

So you came up with something entirely of your own creation so that you could go on and say,

“just a classic conspiracy theory, nothing more.”

Which is a classic example of what some call a “strawman attack”, a tactic so overused by shills, trolls and similar in recent times that I find it odd anyone would think they could get away with it.

Well you’ve clearly failed and been hoist by your own petard.

You should have realised when I said,

“as I’ve mentioned several times in the past”

I’ve already been through the evidence in the past on this blog more than once…

So as I’ve said to those “strawmanning” in the past,

“Go look it up”.

which the NSA did through hoodwinking NIST in the AES contest as I’ve mentioned several times in the past.

A good example of “arguing from known effect back to some wanted cause” as you put it yourself just some days ago.

Unless you come up with some proof, your claim that under influence of the NSA ‘the reference implementation of AES was deliberately written to allow timing attacks’ is just a classic conspiracy theory, nothing more.

Thanks. That paper was an interesting read.

My conclusion is that the biased real random source couples the original sequence to the output. And given enough redundancy in the input ‘signal’, the original signal ‘leaks out’ well enough to be at least partially decoded. Work factor reduction!

I am thinking of some experiments I did a long time ago with distorting analog signals with a slow sigma delta converter to test various modem code. While it worked well as a controllable distortion simulator, the real signal leaked out amazingly well.

I am not sure whether my example makes sense to you; I hope so.

Which brings me back to your original point that you can never test for randomness, only for non-randomness, and you may find non-randomness only by accident!
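That asymmetry can be made concrete with a toy statistical test (the monobit frequency test here is my own illustration, not something from the discussion): it can reject a sequence as non-random, but a pass proves nothing.

```python
import math

def monobit_test(bits):
    """Frequency (monobit) test: it can only ever reject randomness,
    never confirm it. Returns the p-value of the observed bit balance
    under the null hypothesis of a fair, independent source."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    # Under the null hypothesis, s / sqrt(n) is approximately standard normal.
    return math.erfc(abs(s) / math.sqrt(2 * n))

# A heavily biased source is caught...
assert monobit_test([1] * 700 + [0] * 300) < 0.01
# ...but a passing score proves nothing: this sequence is perfectly
# balanced, entirely predictable, and not random at all.
assert monobit_test([0, 1] * 500) > 0.9
```

A whole battery of such tests can only accumulate failures to reject, which is exactly the “you can only test for non-randomness” point.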

I remember thinking many years ago that combining several pseudo-random shift registers would give ‘random’ numbers to power a random noise ‘generator’. Which brings us back to whether our ‘random’ numbers are ‘good enough’ for our current use. Kinda intractable! Oh well :).

warm regards,

John

I don’t understand how a ‘biased’ random source compromises the encrypted information?

In an unbiased source the number of ones equals the number of zeros over a sufficiently long period, thus the probability of each of ones and zeros is 0.5, or 50%.

So let’s consider collections of bits as numbers. Simply writing down all sixteen states of a 4-bit count,

1, 0000 0001 0010 0011

2, 0100 0101 0110 0111

3, 1111 1110 1101 1100

4, 1011 1010 1001 1000

Shows that not only are the numbers of ones and zeros balanced, they are balanced in smaller groups. If any one bit state were changed to give bias, you would not have 32 of 64 bits set or clear but 31 of 64 or 33 of 64. But you would also then lose one unique 4-bit count state and end up with two states the same, thus going from 16 of 16 to 15 of 16 states.

It does not matter how you change the bit bias: if it is unbalanced, you will end up with fewer states available at the output. Which, by the definition of information entropy as the log base 2 of the number of states, means that the entropy has decreased.
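To put rough numbers on the state-counting argument, here is a small sketch (my own arithmetic, using Shannon entropy over 4-bit block probabilities, which generalises the log-base-2-of-states definition):

```python
import math
from itertools import product

def shannon_entropy(probs):
    """Shannon entropy in bits; equals log2(number of states) when
    all states are equiprobable."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Unbiased bits: all sixteen 4-bit states equally likely -> 4 bits.
assert shannon_entropy([1 / 16] * 16) == 4.0

# Biased bits with P(1) = 0.6: the sixteen states are no longer
# equiprobable, and the entropy of a 4-bit block drops below 4 bits.
p1 = 0.6
biased = [p1 ** sum(s) * (1 - p1) ** (4 - sum(s))
          for s in product((0, 1), repeat=4)]
assert abs(sum(biased) - 1.0) < 1e-9
assert 3.87 < shannon_entropy(biased) < 3.89   # about 3.88 bits
```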

But also check the transitions between pairs of bits: you will find that when the raw source is unbiased, the ratio of transitions to non-transitions remains balanced (what you would expect with a parity function). It’s using the 01 or 10 transitions to give “zero” or “one” respectively that makes the John von Neumann de-bias circuit work, but at a significant loss of potential entropy recovered from the source output.

But again note the patterns

1, 0.0 0.0 0.0 0’1 0.0 1’0 0.0 1.1

2, 0’1 0.0 0’1 0’1 0’1 1’0 0’1 1.1

3, 1.1 1.1 1.1 1’0 1.1 0’1 1.1 0.0

4, 1’0 1.1 1’0 1’0 1’0 0’1 1’0 0.0

Any bias would cause disruption to the patterns –in time or sequence– again as you would expect from parity functions. You will also find it relates to “Walsh transforms” as well, which takes us into coding theory, an interesting part of information theory (but not directly relevant to this explanation).

The point is :-

“Any bias also breaks the structure of events in time or sequency as well, and that comes through the debias circuit.”

As John von Neumann was well aware[1].

The fact that the change in structure appears after a von Neumann de-bias circuit, even though the bit count is debiased, makes the point that information about the bias “leaks through” to the output even though you try to stop it doing so[2].
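A minimal sketch of the de-bias circuit just described, in Python rather than hardware (the pairing convention, 01 → 0 and 10 → 1, follows the description above; everything else is my own illustration):

```python
def von_neumann_debias(bits):
    """Take raw bits in non-overlapping pairs; emit 0 for a 01 pair,
    1 for a 10 pair, and discard the 00 and 11 pairs entirely."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

# 00 discarded, 01 -> 0, 10 -> 1, 11 discarded, 01 -> 0, 10 -> 1
raw = [0, 0, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0]
assert von_neumann_debias(raw) == [0, 1, 0, 1]
```

Note that the length of the output depends on how many 00/11 pairs were discarded, which is exactly the structural information that leaks through.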

[1] The John von Neumann paper where he describes the de-biaser is more infamous for the “a state of sin” sentence. However, the final sentence of the second paragraph,

“The resulting process is rigorously unbiased, although the amended process **is at most** 25 percent as efficient as ordinary coin tossing”

shows he was well aware of the structural bias showing up through the bit de-bias circuit. However, he did not amplify on it, as it was not relevant to the two points he was making.

Read the paper for yourself, it is after all a piece of history in many fields of endeavor,

https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf

[2] This leakage via a side channel is something that those studying cryptography and/or ICTsec in general really need to get their heads around (and few do). It comes about when you disassociate “theory” from “implementation”, which the NSA did through hoodwinking NIST in the AES contest as I’ve mentioned several times in the past. To see this with supposed “True Random” bit generators, you need to know that several popular types of hardware “True Random” bit generators –in fact mostly all “in silico”[3] these days– use one or more oscillators which tend to be somewhat stable in time. This means that the output from the raw entropy “true source” is also frequently stable in time.

However, the von Neumann de-bias circuit at best gives 25% of the input rate at the output. If the actual “true source” is or becomes biased, then it will be less than 25%, which is easily measurable with quite some precision. As I’ve mentioned in the past on this blog to much incredulity, you have to take considerable care how you de-couple parts of TRNGs, not just from each other but from observers of the output and other parts of the system.
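To see how measurable that is, a small simulation sketch (my own, with arbitrary sample sizes): for a source with P(1) = p, a pair survives the de-biaser with probability 2p(1−p), so the yield per raw bit is p(1−p), peaking at 25% only when p = 0.5.

```python
import random

def von_neumann_yield(p, n_pairs=200_000, seed=42):
    """Estimate de-biased output bits per raw input bit for a
    simulated source with P(1) = p."""
    rng = random.Random(seed)
    kept = sum(1 for _ in range(n_pairs)
               if (rng.random() < p) != (rng.random() < p))
    return kept / (2 * n_pairs)

# Unbiased source: the yield sits at the 25% ceiling.
assert 0.24 < von_neumann_yield(0.5) < 0.26
# Biased source, P(1) = 0.6: expected yield 0.6 * 0.4 = 24%, a
# shortfall an observer of the output *rate* can measure precisely.
assert von_neumann_yield(0.6) < 0.25
```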

[3] The expression “in silico” (in silicon) is a modern “latinism” added to “in vitro” (in glass), “in vivo” (in life) etc to describe an experimental classification. Meaning not “in silicon” precisely but anything to do with computer simulations,

https://en.m.wikipedia.org/wiki/In_silico

The flip side is the use of algorithms in our heads such as mental arithmetic would be “in vivo” 😉

“Do not cause issues, you need to test the raw physical sources on a “bit by bit basis” and upwards.”

This just comes down to filtered outcomes. The measurements filter the signal. That means that the outcomes sample a reduced part of the state space. And the entropy used is that of the reduced state space.

Or, reformulated, if your hardware is garbage, no good signal will come out. But that does not say anything about the quality of the input signal.

Entropy is not magic. It is just statistics. And like any statistics, you need to know how to use it effectively.

“it makes sense to talk about the entropy of a single bit, such as the outcome of a coin toss, or the spin of an elementary particle produced by a certain interaction.”

A coin toss samples a large mechanical system: the body of the tosser, the air. The predictability of the outcome depends on the predictability of the whole process. The same goes for preparing an elementary particle in a specific quantum state.

I look at a tossed coin as a measurement of the body that does the toss.

As entropy and information are about the predictability of outcomes, it makes no sense to talk about entropy and information for a person who already knows the outcome. But the fact that you know the outcome does not tell me anything about the outcome.

Probably better if I follow your wording … it *makes sense to talk about the entropy* of a single bit, such as the outcome of a coin toss, or the spin of an elementary particle produced by a certain interaction.

If I understand your reckoning, it only makes sense to talk about the entropy of a physical system if it has a vastly greater scope of variation.

Different animals.

I don’t understand how a ‘biased’ random source compromises the encrypted information?

As an example, suppose the random source averages 0.4 ones and 0.6 zeros.
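A quick back-of-envelope on that example (my own arithmetic, using Shannon entropy per bit): a 0.4/0.6 source still delivers bits, but each carries slightly less than one bit of entropy, and over a whole key the shortfall adds up.

```python
import math

# Entropy per bit of a source with P(1) = 0.4, P(0) = 0.6.
p = 0.4
h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
assert 0.970 < h < 0.972            # about 0.971 bits per bit

# A 128-bit key drawn from this source has only ~124 bits of
# effective entropy -- each lost bit halves the attacker's work.
assert round(128 * h) == 124
```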

John

In my opinion, if you want to extract “truly random” numbers from a physical system, pick a system that has such a large number of free parameters,

Importantly they need to be fully independent of,

1, Internal Influence.

2, External Influence.

And to ensure that those and other possibilities, including,

3, Drift

4, Degradation

5, Fault

6, Failure

Do not cause issues, you need to test the raw physical sources on a “bit by bit basis” and upwards.

After all, as audio engineers will point out, “hum will get in wherever it can”, and likewise “microphonics”. Old-school communications engineers know about “cross talk”, “magnetic coupling”, “capacitive coupling”, and that little nasty I mention from time to time called “injection locking”; then, if you are really old school, “parametric oscillation/amplification”. All of these can be used to transfer energy into a physical source such that it affects the source output in a number of detrimental ways.
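As a sketch of what one cheap “bit by bit” raw-source check might look like (a repetition-count style test of my own devising, loosely in the spirit of standard entropy-source health tests, not something specified above):

```python
def repetition_count_alarm(bits, cutoff=34):
    """Alarm if any value repeats `cutoff` times in a row -- a cheap
    on-line check for a stuck, injection-locked or failed raw source.
    A fair source trips a cutoff of 34 with probability ~2**-33 per
    starting position, so false alarms are vanishingly rare."""
    run, prev = 0, None
    for b in bits:
        run = run + 1 if b == prev else 1
        prev = b
        if run >= cutoff:
            return True   # source stuck or badly degraded
    return False

assert repetition_count_alarm([0, 1] * 100) is False
assert repetition_count_alarm([0] * 40) is True
```

The point of running such tests on the raw bits, before any de-biasing or whitening, is that the later stages will happily mask exactly the failure modes listed above.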
