Quantum Computing: Hype vs. Reality

Really good blog post on the future potential of quantum computing and its effects on cryptography:

To factor a 4096-bit number, you need 72*4096^3 or 4,947,802,324,992 quantum gates. Let’s just round that up to an even 5 trillion. Five trillion is a big number. We’re only now getting to the point that we can put about that many normal bits on a disk drive. The first thing this tells me is that we aren’t going to wake up one day and find out that someone’s put that many q-gates on something you can buy from Fry’s or from a white-box Taiwanese special.
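The arithmetic in the quoted estimate is easy to check; note that the 72k³ gate count is the quoted post's working assumption, not a universal law:

```python
# Quick check of the quoted figure; 72*k^3 is the post's assumed
# operation count for Shor's algorithm on a k-bit number.
k = 4096                         # RSA modulus size in bits
gates = 72 * k ** 3
print(f"{gates:,}")              # 4,947,802,324,992 -- about 5 trillion
```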

Posted on March 23, 2008 at 6:29 AM • 41 Comments


Scott Shorter March 23, 2008 7:56 AM

But isn’t “quantum” the gold standard of snake oil buzzwords?

“It adds, it subtracts, it factors! Buy the Ronco PairCracker, now with 50% more Quantum!”

Ewout ter Haar March 23, 2008 9:32 AM

It seems that the quote confuses the time and space scaling in a quantum algorithm.

For complexity purposes, the “size” of a problem involves the time, number of (qu-)bits and number of gates. Quantum algorithms are usually analyzed within a computational model where the number of qubits is O(k) (k = #bits of input and output numbers). In factorization, it is really the time that scales as k³, not the number of gates.

Leo March 23, 2008 9:38 AM

I hope people read the following comment at the end of that blog post:

“If memory serves, the number of qubits needed to run Shor’s algorithm on a k-bit number is only O(k), owing to the fact that there are efficient reversible circuits for modular exponentiation. Each “gate” in the quantum circuit should be thought of as roughly equivalent to a bit operation in a deterministic algorithm. So the statement “Shor’s algorithm requires 72k^3 quantum gates” is equivalent to saying it takes TIME 72k^3 to run. This, by the way, is almost exactly the number of steps it takes to do an RSA decryption with a k-bit key — no mistake, since the main step in the algorithm is to perform a k-bit modular exponentiation.”
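To make that correction concrete: the qubit (space) requirement grows only linearly in the key size, while the 72k³ figure counts sequential operations (time). A rough sketch; the linear constant `c` is an illustrative assumption, not a hardware figure:

```python
def shor_resources(k, c=5):
    """Scaling sketch for Shor's algorithm on a k-bit number:
    qubits grow as O(k), operations (time) as O(k^3).
    Both constants here (c and 72) are illustrative assumptions."""
    return {"qubits": c * k, "operations": 72 * k ** 3}

r = shor_resources(4096)
# A few tens of thousands of qubits vs. ~5 trillion sequential operations:
# confusing the two overstates the hardware needed by 8 orders of magnitude.
```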

I hope no one is actually using 4096 bit keys, because with the need for 5 trillion bits, more bits than we can currently put on a disk drive, it will be impossible to decrypt anything encrypted with such a large key, right?

Apparently this “mordaxus” doesn’t understand quantum computing any more than those he mocks in his first paragraphs. The commenter “Not a Quantum Bigwig” apparently does, and he points out valid concerns in the rest of his comments.

Hasn’t anyone around here ever taken a course in digital logic?

superduper March 23, 2008 9:44 AM

Does anyone think the government hasn’t already solved this issue, or won’t? Has anyone heard of the NSA?

Leo March 23, 2008 9:45 AM

Just to add…

For those who are interested in the real issues and not the hype, positive or negative, there are good books available. “Quantum Computation and Quantum Information” by Michael A. Nielsen and Isaac L. Chuang was used as the textbook in the first course I took.

Leo March 23, 2008 9:48 AM

“Does anyone think the government hasn’t already solved this issue, or won’t? Has anyone heard of the NSA?”

Why do people persist in thinking of the NSA as a supernatural organization?

Jeff March 23, 2008 10:23 AM

The article says that experts suggest a move to elliptical encryption schemes for more security – can someone explain what this means?

Henning Makholm March 23, 2008 11:16 AM

Jeff: http://en.wikipedia.org/wiki/Elliptic_curve_cryptography

But it is not clear to me that ECC would necessarily be safer from quantum attacks. Because integer arithmetic is drilled into children from an early age, it is easier for most people to grok the structure of ordinary modular arithmetic well enough to design an algorithm (quantum or classical) for reversing its operations, than it is to grok some weird elliptic curve over a finite field. But does that mean that a good algorithm is actually less likely to exist for elliptic curves? If a good algorithm does exist, it takes only a single genius to find it, no matter how difficult it would be for ordinary humans to conceive of it.

KTC March 23, 2008 11:23 AM

Quoting Shor’s actual paper (arXiv:quant-ph/9508027v2) : “…, and thus modular exponentiation requires O(l^3) time with this method. These gate arrays can be made reversible, however, using only O(l) space.”

“It is currently believed that the most difficult aspect of building an actual quantum computer will be dealing with the problems of imprecision and decoherence.”

Like Leo said above, the poster of that blog post, Mordaxus, knows as little about quantum computing as the people he’s mocking. Really, Bruce, I expected you to know better than to link to misinformation like that.

Glenn Willen March 23, 2008 12:02 PM

I must second Leo’s recommendation of the Nielsen and Chuang book. It’s what we used in our Quantum Computation and Quantum Information course here at CMU. I still have it sitting here on my shelf. It is an excellent book.

Steve March 23, 2008 12:24 PM

I have a PhD in physics and worked in the foundations of quantum theory back in the day, and I must say everything I have read about quantum computing is like fingernails on a blackboard. I have never seen anything related to quantum computing that reflected contemporary understanding of the foundations of quantum theory (e.g., Schrödinger’s cat was a severe criticism of the interpretations of 70 years ago). There are some papers by E. T. Jaynes on the Washington University website that will put the Bohr–Einstein debates into perspective.

In short, quantum mechanics is a realization of an algebra of probable inference (in the sense of Cox), for those of you out there into Bayesian statistics. It is a system of Bayesian inference. There is even a Boolean structure, meaning it is subject to all those computability and decidability theorems they feed baby computer scientists!

So, to parody things a bit: when you add 2+2 on a quantum computer you get the most probable outcome of such an addition, which is okay if you are working in a way such that your spectrum is restricted to the integers (2+2=4), and not the reals (2+2=3.99792). So, if you actually understand what you are really doing with your quantum computer, you will realize that you are solving an altogether different problem than, say, the NP-hard problem you started out with. I have never seen any discussion of this very significant point: you may be able to change the class of a problem if you can frame it in a quantum context properly. One of my suspicions is that it may even be possible to build a “universal translator” for crypto based on (discrete) wavelets and their relation to quantum theory, to calculate “the most probable decryption”, but this would take an enormous effort by people who know what they are doing, over several years. The fact that you are decrypting deterministic algorithms is not a problem: quantum theory is agnostic on determinism; what it presumes is correlation.

I must confess, the few papers I did read were sufficient to prevent any serious effort on my part at studying quantum computing, so I may be a bit severe in my ad hominem judgement, but certainly the overwhelming majority of such work is simply “not even wrong”.

Ben March 23, 2008 12:25 PM

“we aren’t going to wake up one day and find out that someone’s put that many q-gates on something”

Way to ignore Moore’s Law.

“640k ought to be enough for anybody”

aikimark March 23, 2008 2:59 PM


“I hope no one is actually using 4096 bit keys, because with the need for 5 trillion bits, more bits than we can currently put on a disk drive, it will be impossible to decrypt anything encrypted with such a large key, right?”

Wrong. 5 trillion bits is equivalent to a 625 GB hard drive, which I’ve already seen advertised. And just this week, I read of some computer memory research that may bring memory capacity into the range of these huge hard drives.
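The unit conversion behind that figure:

```python
bits = 5 * 10 ** 12               # "5 trillion bits"
gigabytes = bits / 8 / 10 ** 9    # 8 bits per byte, decimal gigabytes
print(gigabytes)                  # 625.0 -- i.e. a 625 GB drive
```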

3D storage that might store 1 terabit/cc:

Quantum-dot memory that may reduce memory access down to the picosecond range:

Moreover, computer speed advances aren’t limited to quantum architectures. How about exa-flop performance!?!

ForReal March 23, 2008 3:37 PM

“Yes, but isn’t the future of cracking anything, like AI, just a matter of raw horsepower and massive disk space?”
Well, AI, ever since it was “invented”, is still just dumb software. You’ve got to be artificial to believe it intelligent.

Joe Fitzsimons March 23, 2008 8:02 PM

This is my first time commenting here, but since this post is in my area I decided to delurk.

The scaling for Shor’s algorithm quoted in this post is incorrect. The bottleneck in the algorithm is the modular exponentiation. For small N a naive approach is used, which leads to O(N^3) scaling, but for larger N it makes sense to use the Schönhage–Strassen algorithm, which has O(N^2 log N log log N) scaling. (See arXiv:quant-ph/9508027)
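Ignoring constant factors (which in practice decide the real crossover point), the asymptotic gap at this key size can be sketched as:

```python
import math

def naive_ops(n):
    """Schoolbook modular exponentiation: O(n^3)."""
    return n ** 3

def schonhage_strassen_ops(n):
    """O(n^2 log n log log n), with all constants dropped."""
    return n ** 2 * math.log2(n) * math.log2(math.log2(n))

n = 4096
ratio = naive_ops(n) / schonhage_strassen_ops(n)
# roughly two orders of magnitude fewer operations at this size
```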

There have been quite a few implementations which factored 15, the most recent being a very nice demonstration using photonic qubits (arXiv:0705.1684).

Joe Fitzsimons March 23, 2008 9:21 PM

Jeff/Henning Makholm: There is indeed a quantum attack against elliptic curve crypto systems, although I’m not sure that it has appeared in the literature yet.

elegie March 24, 2008 12:09 AM

The emergentchaos.com page’s favicon image looks like a yellow padlock. For a second, it looked as if the page incorporated SSL security.

ftPeter March 24, 2008 2:30 AM

One wonders whether Moore’s law applies to quantum computing. If it does, does that mean 35–40 years until this can be done?

Monday March 24, 2008 7:50 AM

How does the quantum computer know that it has decrypted a file? Suppose that a file is random bits, and is then encrypted. If the quantum computer can consider every possible decryption, how does it choose which is correct?

greg March 24, 2008 9:08 AM

@Henning Makholm

I agree. There are already weak elliptic curves that can be mapped to ordinary Z_p, after which the DL problem can be solved with index calculus, as for normal DH. For that reason the “extra” security of ECC seems premature to me. A lot of work has gone into factoring and the DL problem over Z_p etc., but not over EC fields. I remain sceptical (paranoid?).

As for QC, there are a few things that are true no matter how much you arm-wave. First, decoherence scales with the number of states, i.e. exponentially in the number of qubits. In other words, the engineering difficulty is O(e^n) in the number of qubits, so a Moore’s law here would predict only linear scaling of the qubit count over time.

Secondly, if I have enough qubits to factor a 1024-bit number, I cannot use them in any way to factor a 1025-bit one. In other words, there is no way to “simulate” an (n+1)-qubit register on any number of n-qubit registers. This is in contrast to classical computers, where simulating 64-bit registers on an 8-bit computer is straightforward.
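The classical side of that contrast is easy to demonstrate: wide arithmetic composes out of narrow registers by chaining the carry. A sketch in Python standing in for 8-bit hardware:

```python
def add64_via_8bit_limbs(a, b):
    """Add two 64-bit numbers using only 8-bit operations, chaining the
    carry between limbs -- classical registers compose; qubit registers don't."""
    carry, result = 0, 0
    for i in range(8):                               # eight 8-bit limbs
        s = ((a >> (8 * i)) & 0xFF) + ((b >> (8 * i)) & 0xFF) + carry
        result |= (s & 0xFF) << (8 * i)
        carry = s >> 8
    return result & (2 ** 64 - 1)                    # wrap like a 64-bit register

assert add64_via_8bit_limbs(2 ** 50 + 123, 2 ** 40 + 456) == 2 ** 50 + 2 ** 40 + 579
```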

I personally know a few people who have large grants (in the millions) to work on QC. One personally believes they will never be faster than classical computers; the other thinks they will only compete with classical computers when “simulating” quantum systems.

Oh, and there is no proof yet that factoring or the DL problem are NP-hard/complete or otherwise. For all we know they could be in P (I doubt it). But you never know…… ;)

greg March 24, 2008 9:15 AM

Oh one more comment.

There is a habit, when considering exotic technologies that need advances, of assuming that those same advances will not improve anything else.

If we can build an n-qubit computer with technology X, there is a very, very good chance that with the same tech we can build much, much better classical computing devices…

Andy Hickmott March 24, 2008 9:22 PM

I wonder how many qubits it takes to factor a 4096-bit number known to have precisely two prime factors?

Anonymous March 25, 2008 12:13 AM

Steve, the outcome of a quantum computer is probabilistic, if you put things in superposition. This isn’t as bad as it sounds: there are techniques to skew the probabilities towards the correct answer. Grover’s algorithm is the other famous quantum computing algorithm, and it uses an “inversion about average” to skew things towards the correct answer. The benefit is that superposition lets you act on all the possible inputs at once. So the result of an operation (gate) is only probabilistic if you place the input in superposition and then observe (measure) it afterwards.
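The “inversion about average” step can be simulated classically on the amplitude vector. A toy sketch (not a real quantum simulation; the database size and marked index here are arbitrary):

```python
import math

def grover_success_probability(n, marked, iterations):
    """Toy simulation of Grover's algorithm on n amplitudes: the oracle
    flips the sign of the marked amplitude, then every amplitude is
    reflected about the mean ("inversion about average")."""
    amp = [1 / math.sqrt(n)] * n              # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]            # oracle call
        mean = sum(amp) / n
        amp = [2 * mean - a for a in amp]     # inversion about average
    return amp[marked] ** 2                   # prob. of measuring the marked item

# about pi/4 * sqrt(16) = 3 iterations push the probability from 1/16 toward 1
p = grover_success_probability(16, marked=5, iterations=3)
```

With zero iterations the success probability is the uniform 1/16; after three it is above 95%, which is the quadratic speedup Grover provides over the ~n/2 classical guesses.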

In cases such as factoring, the hard part is coming up with the factors- checking them is easily done with existing computers. So even if the result of a quantum computation is probabilistic, in these cases you can just check the answer and retry if it is wrong. Even if you have to retry quite a bit, this is still better than classical (existing nonquantum) computers.
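The check-and-retry loop described here is trivial to write down; `quantum_factor_stub` below is a hypothetical stand-in for the probabilistic quantum step, not a real algorithm:

```python
import random

def quantum_factor_stub(n):
    """Hypothetical stand-in for a probabilistic quantum factoring step:
    returns a candidate factor that is sometimes wrong."""
    return random.randrange(2, n)

def factor_with_retries(n):
    """Retry until the cheap classical check (one modulo) passes."""
    while True:
        p = quantum_factor_stub(n)
        if n % p == 0:                 # classical verification is easy
            return p, n // p

p, q = factor_with_retries(15)
assert p * q == 15
```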

People seem to think of quantum computers as some sort of magic device that will solve all these problems. They won’t; they’ll only solve certain problems better. The simulation of a quantum system on a classical computer suffers an exponential slowdown, which doesn’t happen on a quantum computer. I think the people who predict quantum computers won’t exist are a little short-sighted; as of 2006 we’re already up to experimental systems of 12 qubits [1].

Bruce, as others such as KTC have pointed out, I’m really disappointed that you linked this. The author seems to have read enough on the subject to make it sound like he knows what he is talking about- but doesn’t.

[1] D. Bacon and D. Leung, “Toward a World with Quantum Computers,” Communications of the ACM, vol. 50, pp. 55–59, September 2007.

Some other good intro texts on quantum computing in addition to Nielsen and Chuang:

N. D. Mermin, Quantum Computer Science: An Introduction, 1 ed. Cambridge, UK: Cambridge University Press, 2007.

P. Kaye, R. Laflamme, and M. Mosca, An Introduction to Quantum Computing. New York City, New York: Oxford University Press, 2007.

M. Hirvensalo, Quantum Computing, 2 ed. Berlin: Springer, 2004.

Crypto Student March 25, 2008 12:43 AM

@Jeff et al.:

Perhaps the motivation for recommending elliptic curve systems is key length.

From my reading, there is no strong basis to assert that EC is inherently more secure than (say) RSA or Diffie-Hellman. The main EC selling point is that the expected computational cost to break matches that of these older public key systems at smaller key sizes.

When the day comes that a 500-byte RSA key is not considered strong enough, it should in theory be possible to increase the computational cost of breaking the cipher by switching to an EC system without using larger keys.
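The commonly tabulated equivalences (e.g. in NIST SP 800-57) illustrate the key-length point; treat the figures as approximate comparisons of attack cost, not guarantees:

```python
# Approximate key sizes giving comparable classical attack cost, as
# commonly tabulated (e.g. NIST SP 800-57); figures are approximate.
equivalent_strength = {
    #  symmetric-equivalent bits: (RSA modulus bits, ECC key bits)
    80:  (1024,  160),
    112: (2048,  224),
    128: (3072,  256),
    192: (7680,  384),
    256: (15360, 512),
}
rsa_bits, ecc_bits = equivalent_strength[256]
# at the top level, the ECC key is 30x smaller than the equivalent RSA key
```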

This consideration may be relevant for established protocols or systems that have a built-in limit to key length.

Laser-Andy March 25, 2008 9:15 AM

Moore, Moore, Moore, too much Moore, everywhere, even on quantum computing.

In my opinion, Moore’s law is only an indicator of technology, tied to the miniaturization of electronic chips. The driving force of this revolution, the computer, surfs on this miniaturization to gain speed, memory, and so on…

On the optics side, demand for faster communication has pushed miniaturization down to the atomic-layer level (building up quantum dots), and nano-structured wires in the sub-wavelength regime are state of the art. A lot has been going on since the advent of the laser 51 years ago.

There is no need for quantum computing to develop its own technology; we probably have it already, and we are therefore far more advanced at building computers than we were 61 years ago, when the first transistor was invented.

And yes, you can use Moore’s law as an indicator of when quantum computers will be available. Although technology keeps getting more expensive (how on earth will we get 13 nm structures, or even smaller, onto millions of chips?), try extending the slope to the region where the proper description of the physics is quantum mechanics and not classical rules any longer. No doubt classical computers that switch a gate between two bits with a current of one electron will be realized, but with many problems along the way, because you have to fight against the quantum nature. At that point the really smart move will be to construct a quantum computer and use all the benefits. Sooner or later all computers will be “quantum” anyhow.

Michael Nielsen March 25, 2008 4:13 PM

The quoted section from the post in question makes an elementary mistake, confusing the number of operations (“gates”) in the computation with the number of qubits in the quantum computer. The former can be many orders of magnitude larger than the latter, and in most (not all) proposed designs for a quantum computer, it will be.

Such an elementary error doesn’t bode well for the correctness of the rest of the quoted post.

superprofesional March 27, 2008 5:11 AM

The author doesn’t know quantum computers very well, because to factor a 4096-bit number a quantum computer doesn’t need 4096^3 gates; it needs about c·4096 qubits and about k·4096^3 time. The number of gates needed is around 4096^2 or 4096, depending on how much time you allow; if you allow more time, you need fewer gates…

Yosemite1967 April 15, 2008 12:26 PM

I’m really glad that someone is trying to debunk the hype that IS quantum physics. It’s nothing more than naturally occurring “wheels of fortune”, perpetually spinning so fast that you can’t read ’em unless you take a snapshot. Everything else is just hype and algorithms designed to take advantage of these glorified random-number generators. The same algorithms (if the number of possible states is tweaked) can be used with any random-number generation system–like a deck of cards, a pair of dice, a coin, a wheel of fortune, etc., etc.

If anyone (except one who has a vested interest in the hype) thinks that I’m oversimplifying, please tell me how.

Laszlo Kish May 10, 2008 10:13 AM

Bruce Schneier’s conclusions are correct, even though some of the arguments may need corrections/refinements. In 2003, I chaired SPIE’s Fluctuations and Noise Symposium and arranged a public debate entitled “Dreams versus Reality: Plenary Debate Session on Quantum Computing”. That was in June. In September, DARPA, the largest defense sponsor of quantum computing shut down the whole program. The debate can be read here:


Quantum computers need the whole logic circuitry located within the quantum coherence volume. That poses the greatest technical problem in building one. But the problem with running a quantum computer is similarly serious: heat. According to optimistic calculations, a general-purpose quantum computer would dissipate at least 1000 times more heat than a classical computer with the same performance.

Speed, error rate and heat form a triangle of interrelated aspects. This is the fundamental killer of Moore’s law, too. Here is some more info from back around 2003:


Last Thursday, we had a seminar by a leading researcher at IBM and the seminar was focused on the very final stage of Moore’s law, where we presently are. He said all the resources are running out to continue Moore’s law with reasonable error rate, energy dissipation and costs.

Laszlo Kish

valan July 28, 2008 4:12 AM

I am interested in quantum computing. I am a postgraduate student and want to do a project in this field. Please suggest a particular area I could work in.

Anomolous April 13, 2010 2:03 PM

Quantum computing is almost here. They are close to device-ifying the technology. Then they will componentize it, then they will make million-petaflop computers the size of your head. It is scary stuff: brute-force decryption in the blink of an eye, brute-force AI. Scary stuff; makes HAL seem like Robbie the Robot. Do we really, really want quantum computing?

iab June 19, 2010 3:59 PM

I know they have processors at the atomic level… I don’t think anyone has figured out how to make an atomic-level hard drive, other than me so far; the only thing to do now is get the info to the right people…
