Comments

Clive Robinson July 15, 2010 7:29 AM

@ Bruce,

“Not that we need more ways to get random numbers,”

Hmm not sure I would agree with you there.

As people are finding out, TRNGs are not as good as they appear on paper, and are extraordinarily sensitive to influence from the environment they operate in. And importantly, it is not always easy to tell that the TRNG has been influenced by an external entity via a modulated RF carrier etc.

As was shown a little while ago by a couple of bods over at the Cambridge labs, an unmodulated RF carrier can take a TRNG from a 32-bit to an 8-bit equivalent.

Now, although that was detectable, what would happen if the attacker decided to do the fault injection at an optimal time for them, and only for a very short duration?

For the time they attack, the TRNG is 8-bit equivalent; the rest of the time it is 32-bit equivalent.

The question then arises: how do you detect this brief transition?…
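Clive's detection question can be made concrete with a toy simulation (all numbers here are invented for illustration): a brief drop from roughly 8 bits to roughly 2 bits per byte barely moves a whole-stream min-entropy estimate, while a windowed estimate catches it, but only if the window happens to line up with the attack.

```python
import math
import random

def min_entropy_bits(samples):
    """Min-entropy estimate: -log2 of the most frequent symbol's share."""
    counts = {}
    for s in samples:
        counts[s] = counts.get(s, 0) + 1
    return -math.log2(max(counts.values()) / len(samples))

rng = random.Random(1234)  # fixed seed so the sketch is reproducible

good = [rng.randrange(256) for _ in range(10_000)]  # ~8 bits/symbol
attacked = [rng.randrange(4) for _ in range(100)]   # ~2 bits/symbol
stream = good[:5000] + attacked + good[5000:]       # brief injection window

whole_stream = min_entropy_bits(stream)              # barely dented by the attack
attack_window = min_entropy_bits(stream[5000:5100])  # clearly degraded
```

Averaged over the whole stream the attack is nearly invisible; only a windowed test over the right 100 samples shows the collapse, which is exactly the detection problem Clive raises.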

The J July 15, 2010 8:09 AM

“The current consensus is that the best source of randomness is the quantum world where uncertainty rules.”

Hogwash, as far as I’m concerned.

Then again, I could probably get a hefty grant to study randomness in hogwash 🙂

aikimark July 15, 2010 8:40 AM

unofficial fact:
Bruce Schneier has a specialized brain region that serves as a TRNG.

Clive Robinson July 15, 2010 9:26 AM

@ The J,

“Hogwash, as far as I’m concerned”

You might well be right.

The level of consensus on this is low at the moment.

One reason is that Seth Lloyd and others believe that the Universe is in effect a self-calculating computer, which effectively means that our physical universe is a very, very small subset of the information universe.

Now there is a problem: if the physical universe is a computer, then it is deterministic in some form, and our normal definition of randomness then indicates that randomness cannot be a product of our physical universe.

So it must either be caused by something external to the universe, or our assumptions that give us our definitions are wrong….

Which just shows how much fun physics bods get up to these days…

John July 15, 2010 9:37 AM

@Clive Robinson … Seth Lloyd and others believe that the Universe is in effect a self-calculating computer …

Having flashbacks to the Well World series by Chalker now, thank you very much.


Not only is the universe stranger than we imagine, it is stranger than we can imagine.
–Sir Arthur Eddington

W July 15, 2010 9:46 AM

I don’t see any use for true random numbers except to inject entropy into a cryptographic PRNG. And in that case the quality of the random numbers doesn’t matter that much; the thing that matters most is the total entropy gathered.

And one thing I don’t understand is what use websites like random.org have. I know of nothing but cryptography where the difference between a good PRNG and true randomness matters at all, and you can’t use them for that, since they are from an untrusted source and sent over the internet on top of that.

spaceman spiff July 15, 2010 9:56 AM

If one wants true randomness, just monitor my thought processes! I can guarantee that there is no linearity (though possibly some hilarity) there. 🙂

Andrew Yeomans July 15, 2010 10:38 AM

Even the Ferranti Mk 1 was designed with a hardware random number generator (www.computer50.org/kgill/mark1/RobertTau/turing.pdf) back in 1951.

@clive/seth: So why shouldn’t the instruction set of that maybe-hypothetical Universe Computer also include one, to allow for non-determinism?

@W: some types of cloud computing can generate demand for large quantities of random numbers. PRNGs may really not be adequate to guarantee that a process on a shared infrastructure is sufficiently isolated that it can’t be used to reduce the search space for keys generated by another process. And if you are using a cloud to create lots of short-lived TLS connections, you will need a lot of random numbers, and will soon run out of the inherent entropy in the system, requiring external entropy to be used.

Clive Robinson July 15, 2010 11:03 AM

@ W,

“I don’t see any use for the use of true random numbers except to inject entropy into a cryptographic PRNG.”

It’s possibly the best way to do it for many reasons, BUT determinism cannot make entropy. (I’m not sure what can, but that’s one to ask the philosophers 😉

I’ve seen way too many systems where a hash or other function is used to add “magic pixie dust” to the likes of a low-bandwidth noise source (I blame Intel engineers for this bit of silliness).

If you are going to “inject entropy” there are a number of ways to do it, but one of the better ways is to “spread” it across a high-speed stream cipher and couple it with a non-deterministic sampling system.

Over-simplistically, for example: get yourself a Gaussian white/pink noise generator and use it to drive a voltage-controlled oscillator, which either supplies the CPU clock or an analogue input for something like a PIC micro. The micro sits there running a modified version of ARC4 or another stream cipher as fast as it can, but it also generates timer-based interrupts, where the timeout period is taken from the stream cipher’s update algorithm. When the external system needs a random number it sends an interrupt to the PIC, which outputs a byte or word before going back to running the stream cipher. The result, within reason, is not predictable (but the Devil is in the implementation details).
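A toy software sketch of the scheme Clive describes (hypothetical: the hardware noise source and VCO-jittered clock are stood in for by `os.urandom`-driven step counts, and plain ARC4 stands in for his modified stream cipher):

```python
import os

class EntropySpreadingRNG:
    """Toy sketch: spread noise-source entropy across an RC4-style
    stream cipher, sampled at noise-driven intervals."""

    def __init__(self, seed: bytes):
        # Standard RC4 key schedule to initialise the 256-byte state.
        self.S = list(range(256))
        j = 0
        for i in range(256):
            j = (j + self.S[i] + seed[i % len(seed)]) % 256
            self.S[i], self.S[j] = self.S[j], self.S[i]
        self.i = self.j = 0

    def _step(self) -> int:
        # One RC4 keystream step.
        self.i = (self.i + 1) % 256
        self.j = (self.j + self.S[self.i]) % 256
        self.S[self.i], self.S[self.j] = self.S[self.j], self.S[self.i]
        return self.S[(self.S[self.i] + self.S[self.j]) % 256]

    def read(self, n: int) -> bytes:
        out = bytearray()
        while len(out) < n:
            # Simulated noise-driven jitter: run the cipher a random
            # number of extra steps before emitting a byte.  In the
            # hardware design this interval comes from the noise-driven
            # VCO clock, not from os.urandom.
            skip = os.urandom(1)[0] & 0x1F
            for _ in range(skip):
                self._step()
            out.append(self._step())
        return bytes(out)
```

The point of the jittered sampling is that even someone who knows the cipher state cannot predict which keystream byte will be emitted next, because the skip counts come from the analogue noise source.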

However, it is not of necessity going to be immune to power-supply noise or EM fields etc. Thus you need to protect against EM coupling attacks, where the attacker injects a fault signal on an RF carrier or even via a magnetic field.

To get around this you need not just good PCB layout and shielding, but also a series of entropy pools that get used for updates over exponentially increasing time intervals, and one or two other things.

Marcos July 15, 2010 11:25 AM

That’s interesting. What is better is that, differently from quantum cryptography, quantum mechanics says that quantum uncertainty is exactly what the researchers are saying it is: a means for generating truly random numbers. That beats any pseudo-random algorithm, and may also beat non-quantum physical generators (or may not beat them, but cannot be worse).

It is nice too that laser-based generators are immune to EM fields (except when those fields are strong enough to change the logic level of the electronics used, but that would defeat any kind of device). But they are a bit more expensive than purely electrical generators.

RH July 15, 2010 11:34 AM

@ Andrew Yeomans

I don’t follow the theories fully, but I believe the Universal Computer doesn’t allow non-determinism because the authors of the theory don’t like the idea.

From what I understand, the whole point of the UC is to have determinism. If you don’t have that, then there are FAR simpler non-deterministic models they could use.

Joe July 15, 2010 1:52 PM

I apologize for this, but I couldn’t help remembering the “bistromathic drive” from The Hitchhiker’s Guide to the Galaxy.

Geek Prophet July 15, 2010 2:03 PM

@The J and others

Seth Lloyd’s theory may require universal mechanistic behavior without randomness. However, if this is the case, there is not one whit of evidence for this theory. If his theory requires quantum dynamics to be mechanistic, it is up to him to come up with some solid reason to believe this is true. No one ever has.

It may be that quantum uncertainty does not produce true randomness. However, quantum dynamics falls apart if it doesn’t, and every test ever devised has come down in favor of quantum uncertainty.

Seth Lloyd or his followers can either A) revise quantum dynamics so that it functions at least as well as current theories and removes the randomness without violating empirical evidence, or B) show an actual experiment showing that something quantum dynamics says is random isn’t.

Until this is done, any mechanistic theories for the universe are not science. They are philosophy, and philosophy directly contradicted by science, to boot.

Geek Prophet July 15, 2010 2:11 PM

Sorry. Almost forgot. Seth Lloyd himself says that quantum mechanics is clearly non-probabilistic. His Universal Computer is fed randomness as its input.

Relevant quote: “Einstein said, God doesn’t play dice with the universe. Well, it’s not true. Einstein famously was wrong about this. It was his schoolboy howler. He believed the universe was deterministic, but in fact it’s not. Quantum mechanics is inherently probabilistic: that’s just the way quantum mechanics works. Quantum mechanics is constantly injecting random bits of information into the universe.”

See: http://www.edge.org/documents/life/lloyd_index.html

jacob July 15, 2010 2:28 PM

Interesting article and subject, Bruce.
1. Truly random bits generated fast enough to be useful.
2. I can encrypt with an OTP? I’m just glad I can touch-type, since I am such a klutz that I am probably going to blind myself with the laser.
3. The universe is a computer? Then does Bruce see all of us like Neo in the Matrix? Soorrry, I could not resist a Matrix joke on this.
4. If the universe is a computer/hologram, could we hack it? To what end? Politicians must be the alpha release of the software.
5. Theoretical physics makes my head hurt and convinces me they must do some strong drugs to come up with the stuff they do.

Tracer Bullet July 15, 2010 2:35 PM

@ Spaceman Spiff
Aren’t you thinking of your thought processes when you play Calvinball?

@W
Most people in this forum know a lot more about cryptography than I do, and I’m not building cryptographic systems. I will use random.org in place of my own Mental RNG if I want to create a long string for a one-time password or something. I wonder if it strengthens or weakens the pseudo-randomness to change a few characters or move them around.

I also distinctly remember using “random number tables” in my math classes 20 years ago, but feeling that my index finger always landed in the same general area even when I tried to simulate randomness.

callMEphil July 15, 2010 2:52 PM

@tracer. Actually we are playing “stomp the wombat”. See the randomness??? Don’t forget what happened to Manfredi and Johnson….
Seriously, this is discussed in the article as being small enough to fit onto current circuit boards. Then notebook/computer theft would become useless?? It might make the internet better for transactions, etc., but it makes thieves’ and gov’ment tracking with unique IDs even easier. OK….

Jake July 15, 2010 3:55 PM

Bruce Schneier has a specialized brain region that serves as a TRNG.

its output is a constant stream of 9s, yet it passes all tests of randomness.

JimFive July 15, 2010 4:00 PM

@Clive, RE: “determinism cannot make entropy.”

This is, of course, not true–at least in physical terms. All of the physical processes that lead toward the heat death of the universe are deterministic and increase entropy.

Increasing entropy can be viewed as equivalent to removing information (order) from the system. Thus, any one way hash removes information and thus increases entropy.

JimFive
(Yes, I know that the physics definition of entropy and the cryptographic definition are orthogonal, lighten up.)

Jay July 15, 2010 4:37 PM

If the universe is a big computer, then it is deterministic. It may be that what we think of as random is actually predictable, but so complex that it is just very very hard to predict.

Section9_Bateau July 15, 2010 8:48 PM

If anyone has actual useful information about generating quality random bitstreams, I would be interested. So far, my best idea is a very long LFSR continuously cycling: just sample X bits (where X < some partial length of the period), then apply some standard algorithm to try to ensure adequate entropy in the selected sample.
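For concreteness, here is a minimal 16-bit maximal-length Fibonacci LFSR (taps 16, 14, 13, 11, a standard textbook polynomial, not taken from the comment). It also shows the caveat: the output is fully deterministic, so sampling it adds no entropy, and with known taps the first 16 output bits simply reveal the state.

```python
def lfsr16_step(state: int) -> int:
    """One step of a 16-bit Fibonacci LFSR with taps 16, 14, 13, 11
    (polynomial x^16 + x^14 + x^13 + x^11 + 1, maximal length).
    Period is 2**16 - 1 for any non-zero start state."""
    bit = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return (state >> 1) | (bit << 15)

def lfsr16_bits(state: int, n: int) -> list:
    """Collect n output bits (the low bit before each shift)."""
    out = []
    for _ in range(n):
        out.append(state & 1)
        state = lfsr16_step(state)
    return out
```

Because the whole stream is determined by 16 bits of state, observing 16 consecutive output bits (with known taps) hands an attacker the register contents, which is essentially the break described in the next paragraph; real designs need a noise source and a proper extractor on top.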

I once (for a school project) looked at making a VPN-like implementation, XORing the bitstream with the output of the LFSR. It made for very easy sync between clients, but I found that within the length of the register I could actually break the system; I had it broken before I finished writing the code. (It was designed to run continuously, with set bandwidth and protection from timing analysis.) I ended up wrapping it in AES; still, I wouldn’t consider it trustworthy.

greg July 16, 2010 1:01 AM

I’m with Geek Prophet. As a physicist: it is very widely accepted that God plays dice, and that quantum noise is truly “random” with some often-known distribution.

As for theories of non-randomness, you need hidden variables for that; otherwise you can’t explain the observed randomness. However, Bell’s inequalities are violated experimentally (so far, and to quite high accuracy), and this is evidence against (local) hidden variables. I.e., it’s random.

However, it is also false to claim that a deterministic system is always predictable. Chaos theory is about unpredictable deterministic systems. The definition is roughly that in order to increase the time over which you can predict something, you need exponentially more accuracy in measuring the initial state.

I.e., predicting a pair of dice shaken in a glass becomes, for all practical purposes, impossible.
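greg’s exponential-accuracy point can be seen in a tiny experiment with the logistic map x → 4x(1−x), a standard chaotic system (the starting values here are arbitrary):

```python
def logistic(x: float, steps: int) -> float:
    """Iterate the chaotic logistic map x -> 4x(1-x)."""
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

x0 = 0.123456789

# After a few steps, two trajectories 1e-12 apart still agree closely...
a5, b5 = logistic(x0, 5), logistic(x0 + 1e-12, 5)

# ...but the error roughly doubles each step, so after 60 steps
# (2**60 amplification) the trajectories are completely decorrelated.
a60, b60 = logistic(x0, 60), logistic(x0 + 1e-12, 60)
```

Each extra step of prediction horizon costs about one more bit of initial-state accuracy, which is greg’s “exponentially more accuracy” in miniature.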

blue92 July 16, 2010 2:23 AM

If his theory requires quantum dynamics to be mechanistic, it is up to him to come up with some solid reason to believe this is true. No one ever has.

Quantum mechanics is not mechanical?! And there’s “no solid reason”? Madness. We call this “argument from ignorance”, FYI. Physics deals with physical (AKA mechanical) stuff. What do you propose it is, if not mechanistic? Tiny invisible pink unicorns?

Seth Lloyd himself says that quantum mechanics is clearly non-probabilistic.

Relevant quote: “[Einstein] believed the universe was deterministic, but in fact it’s not. Quantum mechanics is inherently probabilistic: that’s just the way quantum mechanics works.”

Uh… you meant in the first sentence “non-deterministic”? Or else this is clearly contradictory.

In any event, this determinism/probabilism debate is a false semantic dichotomy, probably based on misunderstanding. I would, for once, like for some person [preferably sane] to show me why a die roll isn’t deterministic. Note that “determinism” here is almost always claimed in the mechanistic sense, not in the “knowable” sense. [It is less an epistemological claim than a metaphysical one.]

The problem is distinguishing between the mechanistic and the practically determinable. To be clear on the point, the die roll (and anything probability-driven) is mechanistic. If it were based in true randomness, quantum mechanics (and probabilities in general) could not be usefully predictive BY DEFINITION. Quantum mechanics IS mechanic-AL, not magic. And mechanical things can be measured and/or manipulated.

What we can say is that, at a designated time 0, the randomness at a small enough scale is such that it is impossible to predict the state of the system an infinitesimal amount of time later, the reason being that the initial state is inferred to be unknown. That does NOT mean, however, that attacker Jack at time x cannot impose field Z so that at time x+y the randomness of the particles being measured is severely reduced. (Probably it’s easier for him to go guess passwords instead, but that’s a different topic.)

Broken crypto still can look random; just because it looks random doesn’t mean it is. Similarly, just because you use the best tech available doesn’t mean your code is unbreakable or unmanipulable.

Snarki, child of Loki July 16, 2010 7:59 AM

Dudes, cut the speculation and try to build the RNG yourself. Really.

I did that, back in student days, with a diode-shot noise (yes, it’s quantum randomness) pulse generator and a high-speed counter…

…and you find that flipping a counter bit from ‘0’ to ‘1’ takes about 5ps longer than going from ‘1’ to ‘0’. And many other, much more subtle, biases to the randomness.

It’s not the ‘quantum’ stuff that makes or breaks a hardware RNG: it’s the subtle flaws in interfacing the phenomenon to your computer, the solution to which does not make for an interesting PR soundbite.

A Nonny Bunny July 16, 2010 9:47 AM

@blue92
“I would, for once, like for some person [preferably sane] to show me why a die roll isn’t deterministic.”

A die roll is pretty much deterministic, since a die is a macro-object. If you know its initial position and orientation and the forces acting on it throughout its movement, you can, in principle, calculate the end result.
The initial state can only lead to one end state; and that’s what I, and a few others like me, like to call deterministic.

Now take on the other hand measuring the spin of an electron. If you measure one time it might have an up spin; measure it another time in exactly the same way, with the exact same starting conditions, and it may have down spin. The same initial state can lead to different end states using the same process.
And that’s what I’d call probabilistic.

Now, if you take a mathematical (idealized) die, it is of the second kind. Throwing it leads to any of six end states, with equal likelihood. It’s also not a real physical object, but such trifles never bother mathematicians.

moo July 16, 2010 1:02 PM

@JimFive:

Re Clive’s “determinism cannot make entropy.”

Of course he’s using entropy in the information-theoretic sense, where entropy basically means “unpredictable information”. Being unable to add entropy is practically the definition of a deterministic process: everything it outputs is determined completely by the inputs.
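moo’s claim is easy to check numerically: applying any deterministic function to samples can only preserve or reduce their Shannon entropy (a sketch with made-up data; the modular map stands in for a many-to-one hash):

```python
import math
from collections import Counter

def shannon_entropy(samples) -> float:
    """Empirical Shannon entropy in bits of a list of samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

samples = list(range(16)) * 4        # uniform over 16 values: 4 bits/sample
hashed = [x % 4 for x in samples]    # deterministic many-to-one map

# shannon_entropy(samples) is 4.0 bits; shannon_entropy(hashed) is 2.0.
# The deterministic map collapsed entropy; no deterministic function
# can push it above that of its input distribution.
```

This is why hashing a weak noise source is “magic pixie dust”: the output may look uniform, but the underlying entropy never exceeds what the noise source supplied.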

@blue92, A Nonny Bunny:

Just because a die is a macro-object doesn’t mean that you can predict its physical interactions with 100% accuracy using only stuff like Newtonian physics. What if you happen to roll the die in such a way that it lands precisely on an edge between two faces, and minute physical forces between nearby atoms in the two surfaces then decide which way it falls? Can you truly rule out quantum effects and other micro-scale stuff? If not, are you prepared to assert that all such effects are completely deterministic?

Even if every aspect of the process turns out to be fully deterministic, without knowing ALL of the input state (as well as ALL of the rules that govern the deterministic process), we cannot predict the output of the process with 100% certainty. In the case of die rolls, maybe we can predict them accurately 99.999% of the time using a model that approximates the physics with just Newtonian rules, but not 100% of the time. On some vanishingly small fraction of die throws, all of the atoms in the die will align nicely with the empty space between the atoms of the table, and the die will fall right through! Yes, I’ve never seen it happen either. No, that doesn’t prove it is impossible. I think there’s an inherent amount of non-determinism in any physical process, even if it’s comparatively small. (However, I’m not a quantum physicist… maybe one day one of them will prove me wrong?)

David Thornley July 16, 2010 4:05 PM

@blue92:

Quantum mechanics is in some sense a mechanical model, but it is explicitly probabilistic. The extent of many predictions is that there’s a X% chance of observing this and a 100 – X% chance of observing that. This is, in fact, a verifiable prediction. If the odds are 50-50 for measuring an electron’s spin, run a few million electrons past a measuring device and count which are spin up and spin down; if there’s a significant difference, the prediction failed.

It’s a tremendously successful physical model, making scads of very detailed and verifiable predictions. It’s also had to deal with many physicists over time who hated the thought of inherent randomness, and so there have been a whole lot of alternative, deterministic theories proposed. They’ve all been proved wrong. Einstein worked really hard to try to break the Heisenberg uncertainty principle, for example, and failed.

After a century of this, it seems reasonable to think that the Universe probably is indeed irreducibly random at a low enough level. It probably isn’t due to things we can’t observe, given all the hidden-variable theories proposed and proven wrong. If there is predictability, you’d think somebody would have come up with an appropriate theory that didn’t get experimentally disproven by now.

blue92 July 17, 2010 12:14 AM

See what I mean about the determinism/probabilism debate?

You can even explicitly explain that “deterministic” does not mean “determinable” when used in this context–that it’s a false dichotomy–and people still try to explain in terms of epistemology. Granted, it doesn’t help when generations of professionals insist on pretending they understand things that they can’t possibly understand. Even when the professionals are honest about their ignorance, there’s always someone to misinterpret.

As for the (seemingly) more informed…

…After a century of this, it seems reasonable to think that the Universe probably is indeed irreducibly random at a low enough level…

The terminology is badly muddled, but when you say “irreducibly random”, I say, first, I doubt we know what is irreducible yet (quantum spacetime is fun, but still quite theoretical), and, second, I doubt that unknowable randomness by any necessity negates the possibility of mechanisms at work.

The first misunderstanding is this: at first glance quantum mechanics seems to have, at its core, a logical contradiction. But it doesn’t, really. QM has deterministic predictions based on a “random” wave-function collapse. There are two definitions of “random” at work: (1) the practically unknowable and (2) the impossibly unknowable. Neither necessarily relates to causation. QM succeeds as a predictive framework because it depends on probabilities measured in the past being descriptive of future interactions. There is causation at work, in any sense of the concept of “causation” that means anything.

Indeed, it has not to my knowledge been sufficiently explained what non-causation would look like in a scientific model — for the obvious reason that an uncaused event must be random, unpredictable, and hence cannot be meaningfully modelled. Bounded probabilities necessarily imply bounds, and bounds, even loose, non-absolute ones, necessarily imply causal interaction. We have no experience with the non-causal to guide us.

On more than one occasion people trot out Bell’s inequality for some reason, yet even seemingly thorny quantum entanglement has every indication of being causational, in the sense that it is predictive and reproducible. It is commonly said that this is “spooky”, that it violates locality. So what; non-locality is non-locality. Sucks to be you if you wanted locality, but causation has no necessary dependency on it. Worse, the idea that clearly non-random phenomena are presented as evidence of irreducible randomness is patently absurd and contradictory. Bell’s result supports QM, but as QM does not disprove causation, this is just barking at the moon.

If there is predictability, you’d think somebody would have come up with an appropriate theory that didn’t get experimentally disproven by now.

And you’d think gravity would be thoroughly understood.

Again, determinism is primarily a metaphysical claim, not an epistemological one: it says that, as far as we know and can measure, physics is a series of cause/effect interactions. Predictability (AKA human detection) is a matter of knowledge, and one can rationally posit that knowledge of all particular causes is not required for those causes to exist. Lightning did not wait for the discovery of electricity.

That there may be non-caused events is possible in the sense that there are clearly unsolved problems in physics; we don’t know everything. It’s probable that we can’t know everything. But as to what hypothetical uncaused events should look like–that condition appears to be indistinguishable from pure ignorance of the causes of those events. Non-determinism is based solidly in an argument from ignorance, not rational deduction. We’re considering the unknown structure of the universe, not looking at a six-sided die roll where we’ve eliminated five possibilities. You may not like the evidence that exists, but stopping at “oh well, dat wascally wandomness!” is the logical equivalent of invoking a deity for the answer and going out to lunch. I find that sort of failure of inquiry especially egregious in a field that exists precisely to inquire.

DCFusor July 19, 2010 8:22 PM

Well, the last time (and it will be the last; that hurt my head) I read the full derivation of Schrödinger’s wave equation, about 25 pages of very dense math that is usually passed off as a single transform symbol in equations because “we just know”, it struck me that near the end the magnitude of a complex number is taken and the phase is tossed out in the trash. So you wind up with probabilities of something being here or there, but no info on where things are vs. time. Which to me is just silly.

If you did that with the output spectrum of an FFT and tried to reverse the transform, yeah, that’ll work, right? Nope: you don’t get any hint of the original waveform shape with the phases tossed in the trash or all assumed to be some fixed number. It will have the same frequency content, but never the same wave shape as the original.
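DCFusor’s FFT analogy is easy to demonstrate with a small pure-Python DFT (the example signal is arbitrary): keep only the magnitudes of a spectrum, and the inverse transform no longer resembles the original waveform, even though the frequency content is identical.

```python
import cmath

def dft(xs):
    """Naive discrete Fourier transform."""
    N = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * n / N)
                for n, x in enumerate(xs))
            for k in range(N)]

def idft(Xs):
    """Naive inverse DFT, returning real parts."""
    N = len(Xs)
    return [sum(X * cmath.exp(2j * cmath.pi * k * n / N)
                for k, X in enumerate(Xs)).real / N
            for n in range(N)]

signal = [0.0, 1.0, 2.0, 3.0, 0.0, 1.0, 2.0, 3.0]  # sawtooth-ish wave
spectrum = dft(signal)

roundtrip = idft(spectrum)                    # full spectrum: recovers signal
phaseless = idft([abs(X) for X in spectrum])  # magnitudes only: does not
```

`roundtrip` matches `signal` to rounding error, while `phaseless` has the same magnitude spectrum but a different wave shape, which is exactly the information thrown away with the phase.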

I think that means some quantum theories are incomplete at best. Even though both those and relativity are tested to N++ decimal places where they apply, they can’t both be right where they join up, all attempts so far have failed at that point.

Doesn’t mean that at the current level of understanding it can’t make a good random number generator though. But as someone pointed out above — the devil is in the details.

Robert July 20, 2010 3:06 AM

Fun stuff, TRNGs. If I never have to design another one, it will still be too soon….

Absolute RNG entropy:
8 bits: easy
12 bits: difficult
16 bits: unbelievably difficult
20 bits: you’re kidding yourself
32 bits: yeah, right….

As for adding RNG entropy to an algorithmic TRNG, it certainly looks good on paper, and it even helps you pass simple test suites (Diehard etc.).

For any real on-chip TRNG, under active injection attacks, I’m not even sure that entropy is increasing with each RNG addition cycle.

All that I know for sure is that my ability to spot the error-injection mechanisms (which introduce correlation) decreases with each successive add-on.

I long ago gave up trying to “fix” the TRNG problem; these days I prefer to simply pray that all my adversaries are similarly challenged!

Clive Robinson July 20, 2010 5:24 AM

@ Robert,

At the risk of hearing a loud scream and rapid foot falls fading into the distance…

Did you ever resolve that issue with the two chip TRNGs producing similar output?

@ ALL

For those thinking of designing their own TRNGs: DON’T. Oh, and DON’T use “chip TRNGs” either, and don’t buy them off the Internet either…

As several people have noted here there are many many hidden difficulties.

First off, there are two basic types of noise in a system: noise with a high truly-random component, and noise with a high deterministic content. And guess what: they both look the same to the naked eye, and are often found together on the same wire…

You can see this with cheap TRNGs and a lot of the designs you see on the internet: their designers make the fatal assumption that there is such a thing as a “ground” or “earth” with zero impedance and no noise.

I can assure you that no such beast exists, nor is one even close to possible. Like gold at the end of the rainbow, we just assume it exists for other reasons.

The classic mistake you will see in nearly all designs is having a “noise source” like a diode or reverse-biased junction connected across the power supply and providing a single signal out. Any design that does this is doomed from that point onwards.

Even if you do use a differential signal (i.e. one taken from either side of the noise device), you still have to deal with ground and EM noise getting into one arm of the differential signal more than the other (not easy).

Realistically, you would be lucky to get a 60 dB CMRR on the overall system using standard PCB design and standard components at audio frequencies.

Now, -60 dB is approximately 1 mV/V, or 2^10, or 10 bits.

With good audio design experience you might get this up to 100 dB (10 µV/V), around 17 bits. To get any better you need to switch over to RF design.

Now, if your noise source is a diode or other semiconductor junction, it should produce RF noise into the hundreds-of-MHz range. Using “superhet” design techniques you can get between -120 dB and -160 dB fairly readily, giving you between 20 and 27 bits.
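The dB-to-bits conversions above follow from the usual rule of thumb that one bit of resolution corresponds to 20·log10(2) ≈ 6.02 dB of voltage ratio (a quick sanity check of the figures):

```python
import math

def db_to_bits(db: float) -> float:
    """Equivalent bits for a voltage ratio expressed in dB:
    ratio = 10**(db/20), so bits = log2(ratio) = db / (20*log10(2))."""
    return db / (20.0 * math.log10(2.0))

# 60 dB -> ~10 bits, 120 dB -> ~20 bits, 160 dB -> ~27 bits,
# and 100 dB works out to ~16.6, i.e. around 17 bits.
```

So every extra bit of genuine entropy costs another ~6 dB of rejection everywhere in the signal chain, which is why the difficulty climbs so steeply in Robert's list above.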

Getting any further means paying real attention to what you are doing, and all this is with audio-frequency-range signals….

Which is why, if you want more, you really need to go another way; which is why I like to spread the entropy from my TRNG across the function of a high-speed stream cipher.

Oh, and for those thinking “hey, a laser, that’s light, no EM field issues”: sadly, no. The device they are using is a semiconductor, and it runs at quite a low current. It is going to be subject to power-supply noise like any other semiconductor; thus an EM field will, if applied correctly, affect its output… How much, and whether it is exploitable, depends on your physical implementation.

As I keep saying, “the devil is in the details”, and there are lots and lots of details in designing TRNGs, and you cannot ignore any of them…

I have a sneaky feeling that the Intel engineers discovered issues with their “on chip” system and “ground/PSU noise”, hence the use of “magic pixie dust” hashing at the output…

Danko July 20, 2010 6:33 AM

There is actually a company which already makes a commercial RNG based on quantum noise. It’s called “ID Quantique” and their website is http://www.idquantique.com.

The quantity of random data is far less than the one from the link (16 Mbit/s), but on the other hand you can plug in as many PCI or USB devices as you like to achieve the required quantity of random data.

Robert July 20, 2010 9:49 PM

@Clive
[Robert]”Did you ever resolve that issue with two chip TRNGs producing similar output?”
Sorry, I missed your reply to my previous question on the multiple TRNGs.

For this product, all the TRNGs are completely separate devices on separate chips with their own power sources. All chips use the same RNG structure (a stable oscillator samples an internal noisy oscillator). The devices are scattered around inside a 10 m × 5 m room. Each device is about 1 mm × 1 mm in total area. (Think powered RFID devices and you won’t be far off.)

I think the problem is that the combined data contains remnants of the sampled oscillators, and when these are combined there is mixing and down-modulation.

I think the reason this does not show up on a single device is that the complexity of the repeat sequence is so long that the data looks “white” in the frequency domain. However, in the “code domain” they are in effect loosely locked. So I concluded that looking at my raw standalone TRNG was like looking at a DSSS signal without knowing the spreading code. Now imagine that multiple “similar” (but non-orthogonal) DSSS sequences exist in the same time/space: if you start combining similar sequences, it seems to me that you can achieve inadvertent despreading. I think something like this is what is going on.

In this sense the RNG is a true source of entropy, where the added algorithmic extensions further spread this data to give us a TRNG. Combining lots of these sources results in quasi-correlation, which causes despreading.

I convinced myself that the total data set was not so large that “central limit theorem” (CLT) effects needed to be considered; however, I was only about one to two orders of magnitude below the point at which the CLT would result in data shaping.

The above is what I believe is happening; however, one of the other engineers thinks it is due to the Hall effect or something similar, i.e. we have unintentionally built a Hall sensor, resulting in local data being correlated.

The only thing I know for certain is that I hate designing RNGs.

[Clive] “Oh and DON’T use “Chip TRNGs” either”

Regarding your assertion that you cannot do a good on-chip RNG, I would strongly disagree. I would suggest that the only way to get 20+ bits of real entropy is to use an integrated RNG. The main reason I say this is that the circuit can be made very small (meaning external effects couple equally into everything) and symmetric design techniques can be used to cancel all first-order field gradients.

Doing a good RNG design is like designing a higher-order “chaotic” bandpass sigma-delta with no input source: basically, a circuit that just accumulates noise. This needs to have 100 dB PSRR and similar CMRR, and you need to understand which noise is correlated and which is uncorrelated. You remove all first-order biases using DC auto-offset cancellation, and you remove 1/f and similar noise sources by the nature of the bandpass function. Basically, what you want is for all the thermal noise at all sample harmonics (± the bandpass frequency) to fold down into band and add together.

Anyway, I’ve got real work to do…

Luis October 10, 2010 3:23 PM

I have a system to predict randomness, and I want to find an application for it…
If someone knows of an application, please tell me.
