Comments

Delphi Ote October 10, 2013 6:36 AM

That Ars article is horribly written and hyperbolic. This in particular is a ridiculous statement, “Properly implemented RNGs, to extend the metaphor, are akin to a relief dealer who thoroughly shuffles the deck, an act that in theory results in the strong likelihood the cards never have and never again will be arranged in that exact same order.”

If their understanding of random number generators is that bad, they shouldn’t be writing an article about them.

Ralf Muschall October 10, 2013 7:56 AM

@Delphi Ote: sqrt(52!) is about 9e33, so the statement on Ars Technica is probably correct. Even with the smaller deck of 32 cards the value is 5e17, i.e. everybody in the world would have to shuffle a deck every second for a few years before the birthday paradox would probably create duplicates.
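
A quick back-of-the-envelope check of those figures (a sketch, using the usual ~1.18*sqrt(N) birthday bound and an assumed world population of about 7 billion, each shuffling once per second):

```python
import math

# Birthday bound: a ~50% chance of a repeat needs roughly
# 1.18 * sqrt(N) samples from N equally likely outcomes.
def shuffles_for_collision(cards):
    return 1.18 * math.sqrt(math.factorial(cards))

for cards in (52, 32):
    shuffles = shuffles_for_collision(cards)
    # Assumed: ~7e9 people, ~3.15e7 seconds per year.
    years = shuffles / (7e9 * 3.15e7)
    print(f"{cards} cards: ~{shuffles:.1e} shuffles needed, ~{years:.1e} years")
```

For the full 52-card deck that works out to something like 10^16 years of everyone shuffling, so the article's "never" is safe for all practical purposes; for the 32-card deck it is only a few years.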

And the problem looks like a repeat – the same thing happened before with device-generated SSL keys: https://freedom-to-tinker.com/blog/nadiah/new-research-theres-no-need-panic-over-factorable-keys-just-mind-your-ps-and-qs/ (at least one of the authors, Nadia Heninger, is the same).

Clive Robinson October 10, 2013 9:34 AM

I fully expect to see more issues along these lines…

RNGs are notorious for being hard to get right, and most methods of testing them don’t work in the way most people would expect of “testing”.

Most people assume that tests are designed to show an item under test is working to specification, which is the generally accepted use of tests.

But that rests on the underlying assumptions that the item under test is both testable and capable of being accurately described. To have a specification by which you can perform “go/no-go” testing you have to have a reliable method of determining “no-go”, and that requires you to be able to fully enumerate its characteristics, which implies that they are both deterministic and bounded.

Nearly all RNG tests are in fact witnesses to specific failings, not general failings, let alone all failings. That is, whilst it is relatively easy to show behaviour indicative of one very narrow type of deterministic behaviour or bias, you would need what is in effect (under general understanding) an infinite set of witness tests to show all failings…
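
To make “witness to a specific failing” concrete, here is a minimal sketch (illustrative threshold, not any standard’s): a monobit frequency test witnesses exactly one narrow failure mode, gross 0/1 bias, and says nothing about any other defect.

```python
import math

def monobit_test(bits):
    """Witness for one narrow failing only: gross 0/1 bias.

    Passing says nothing about correlations, periodicity or
    predictability -- a trivially deterministic stream sails through.
    """
    n = len(bits)
    ones = sum(bits)
    # Under the null hypothesis the number of ones is ~Binomial(n, 0.5);
    # only flag sequences more than ~3 standard deviations off centre.
    return abs(ones - n / 2) <= 3 * math.sqrt(n) / 2

# An alternating 0,1,0,1,... "source" has zero entropy, yet passes:
print(monobit_test([0, 1] * 5000))   # True
```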

As John von Neumann once observed, using a deterministic process to generate supposedly non-deterministic sequences is in effect living in a state of sin.

However, that’s what we try to do with “hardware RNGs”, and with just one or two exceptions the arguments about their behaviour being “chaotic”, “complex” and “unpredictable” are little more than opinions, not proven facts.

The history of random sequences shows that “what was unpredictable yesterday” may well be “predictable today or tomorrow” simply because of advances in theory. The “standards” generally call for a minimum of meeting the “Diehard tests”, but we already know these are far from sufficient, as some fully deterministic processes pass them with ease. And thus many hardware designers “cheat” by putting a deterministic process after the questionable entropy source, a process known to meet the tests even if driven by a counter or similar deterministic input. I’ve described this as “magic pixie dust” thinking, whilst others have had other, less polite, ways of putting it…
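
A minimal sketch of that “magic pixie dust” construction (illustrative only, not any particular vendor’s design): hash an incrementing counter and the output will sail through Diehard-style batteries despite containing no entropy at all.

```python
import hashlib

def pixie_dust_rng(n_bytes):
    """Deterministic 'RNG': SHA-256 over an incrementing counter.

    The output looks statistically perfect, yet anyone who knows the
    construction (and the counter's start value) can reproduce every byte.
    """
    out = bytearray()
    counter = 0
    while len(out) < n_bytes:
        out += hashlib.sha256(counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n_bytes])

stream = pixie_dust_rng(1_000_000)
ones = sum(bin(b).count("1") for b in stream)
print(ones / (8 * len(stream)))   # ~0.5 -- "looks random", isn't
```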

However, there is another issue to consider, which is: what is “True Randomness”? We know that there are currently limits to what we know. For instance, we do not know exactly when a particle will be emitted from a radioactive isotope, or in which direction. What we do know is that the overall process is very predictable, irrespective of the quantity of the isotope. Thus the question arises as to why the overall process is so predictable: what fundamental physical law/process defines it such that a statistical model fits so accurately?

We may never know; then again, we might find out tomorrow, in which case we may well lose a source of “true randomness” as the process moves from the unknown into the realm of the deterministic but effectively incalculable, along with other complex and chaotic physical processes. We might even find that God does not play dice.

delphi_ote October 10, 2013 10:04 AM

@Ralf Muschall

The problem I have is their use of the word “never”. “Never” implies to the casual reader that “truly random” processes don’t repeat themselves. This is a common misconception. We know that if you roll a die 4 times, 1111 is as likely as 1234 or 3162. Someone without our training thinks one sequence is “more random” than the others.
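
A quick check of that arithmetic (assuming a fair six-sided die):

```python
from fractions import Fraction

# Any specific sequence of four independent fair-die rolls has the same
# probability, however "random" or "non-random" it looks.
p = Fraction(1, 6) ** 4
print(p, float(p))   # 1/1296 ~= 0.00077, identical for 1111, 1234 and 3162
```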

They could very easily have explained that it would be “almost impossible that two users would share the same identifier”. Instead, they wrote a very muddled sentence that plays into a common misconception. The fact that they couldn’t explain the concept clearly implies to me that they don’t understand the concept well themselves.

I suspect this is an instance of the Igon Value Problem.

Ben October 10, 2013 11:01 AM

“…serious random-number generation flaw in the cryptographic systems used to protect the Taiwanese digital ID…”

Of course there is. It is in various parties’ interests to ensure that this is so: China and the US both need the ability to forge Taiwanese ID cards, so of course there is a way to do so. In fact, since it is unlikely they collaborated, there are probably two ways.

The question is: Is there any crypto hardware at all without deliberately introduced flaws?

Brian M. October 10, 2013 11:11 AM

According to the original paper, the smart cards themselves contain the hardware random-number generator. It’s this generator that’s the problem, and all of the cards need to be reissued with a better HRNG.

These keys were generated by government-issued smart cards that have built-in hardware random-number generators and that are advertised as having passed FIPS 140-2 Level 2 certification.

The paper goes on to recommend,
The correct response is not merely to eliminate those RSA keys but to revoke all keys generated with that generation of hardware and throw away the entire randomness-generation system, replacing it with a properly engineered system.

Basically, the 1024-bit hardware, the Renesas AE45C1, is a big failure:
As we will see later in the paper, the smart cards used in the PKI we examined fail to follow many well-known best practices and standards in hardware random number generation: they appear to utilize a source of randomness that is prone to failing, they fail to perform any run-time testing before generating keys, and they clearly do not apply any post-processing to the randomness stream
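
For what it’s worth, a minimal sketch of the two missing practices the paper names, run-time testing and post-processing (illustrative thresholds and construction, not the actual FIPS or SP 800-90B procedures):

```python
import hashlib

class RawSourceFailure(Exception):
    pass

def health_check(raw, max_run=32):
    """Crude run-time test: refuse to proceed if the raw source looks
    stuck, i.e. emits the same byte too many times in a row.
    (Real designs use SP 800-90B repetition-count and adaptive-proportion
    tests; the threshold here is only an illustration.)"""
    run, prev = 0, None
    for b in raw:
        run = run + 1 if b == prev else 1
        prev = b
        if run >= max_run:
            raise RawSourceFailure("raw entropy source appears stuck")

def conditioned_output(raw, n_bytes=32):
    """Post-processing: condition the raw stream through a hash so modest
    bias in the source does not appear directly in key material."""
    health_check(raw)
    return hashlib.sha256(raw).digest()[:n_bytes]
```

Of course, conditioning only helps when the raw input actually contains enough entropy; applied to a dead source it is exactly the “magic pixie dust” Clive describes above.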

Aku Ankka October 10, 2013 12:42 PM

Several top websites use device fingerprinting to secretly track users
http://phys.org/news/2013-10-websites-device-fingerprinting-secretly-track.html
A new study by KU Leuven-iMinds researchers has uncovered that 145 of the Internet’s 10,000 top websites track users without their knowledge or consent. The websites use hidden scripts to extract a device fingerprint from users’ browsers.

The fingerprinting scripts were found to be probing a long list of fonts – sometimes up to 500 – by measuring the width and the height of secretly-printed strings on the page.

The researchers also evaluated Tor Browser and Firegloves, two privacy-enhancing tools offering fingerprinting resistance. New vulnerabilities – some of which give access to users’ identity – were identified.

To detect websites using device fingerprinting technologies, the researchers developed a tool called FPDetective which will be freely available at:

FPDetective: Dusting the Web for Fingerprinters
http://www.cosic.esat.kuleuven.be/publications/article-2334.pdf

Mike the goat October 10, 2013 1:45 PM

Unfortunately, smart cards have been notorious for their shoddy RNG implementations; e.g., Google for “On Bad Randomness and Cloning of Contactless Payment and Building Smart Cards”.

Dirk Praet October 10, 2013 8:13 PM

I’m seeing a new law emerging here: “never attribute to stupidity what can be explained by a broken RNG”.

Mike the goat October 10, 2013 11:41 PM

Dirk: yeah, seems convenient, doesn’t it? With the Dual EC DRBG fiasco, anything is possible.

Fred October 14, 2013 3:37 AM

Anyone heard anything (good or bad) about this?
The South Korean government has requested that companies use a special encryption algorithm, “SEED”. It is a block cipher developed by the Korea Information Security Agency (KISA).
