Quantum Computer Factors the Number 15

This is an important development:

Shor’s algorithm was first demonstrated in a computing system based on nuclear magnetic resonance—manipulating molecules in a solution with strong magnetic fields. It was later demonstrated with quantum optical methods but with the use of bulk components like mirrors and beam splitters that take up an unwieldy area of several square meters.

Last year, the Bristol researchers showed they could miniaturize this optical setup, building a quantum photonic circuit on a silicon chip mere millimeters square. They replaced mirrors and beam splitters with waveguides that weave their way around the chip and interact to split, reflect, and transmit light through the circuit. They then injected photons into the waveguides to act as their qubits.

Now they’ve put their circuit to work: Using four photons that pass through a sequence of quantum logic gates, the optical circuit helped find the prime factors of the number 15. While the researchers showed that it was possible to solve for the factors, the chip itself didn’t just spit out 5 and 3. Instead, it came up with an answer to the “order-finding routine,” the “computationally hard” part of Shor’s algorithm that requires a quantum calculation to solve the problem in a reasonable amount of time, according to Jeremy O’Brien, a professor of physics and electrical engineering at the University of Bristol. The researchers then finished the computation using an ordinary computer to finally come up with the correct factors.
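The classical post-processing O'Brien describes can be sketched in a few lines. This is a hypothetical illustration, not the Bristol team's actual code: given the order r of a modulo N (the output of the quantum "order-finding routine"), the factors fall out of an ordinary gcd computation.

```python
from math import gcd

def factors_from_order(a, r, N):
    """Classical half of Shor's algorithm: given the order r of a
    modulo N (found by the quantum part), recover nontrivial factors."""
    if r % 2 != 0:
        return None          # need an even order; retry with another a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None          # trivial square root of 1; retry
    p, q = gcd(x - 1, N), gcd(x + 1, N)
    if p * q == N and 1 not in (p, q):
        return p, q
    return None

# For N = 15 and a = 7, the order is 4 (7^4 = 2401 ≡ 1 mod 15).
print(factors_from_order(7, 4, 15))  # → (3, 5)
```

Only the order r needs the quantum computer; everything here runs on a classical machine, which is why the chip "didn't just spit out 5 and 3."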

I’ve written about quantum computing (and quantum cryptography):

Several groups are working on designing and building a quantum computer, which is fundamentally different from a classical computer. If one were built—and we’re talking science fiction here—then it could factor numbers and solve discrete-logarithm problems very quickly. In other words, it could break all of our commonly used public-key algorithms. For symmetric cryptography it’s not that dire: A quantum computer would effectively halve the key length, so that a 256-bit key would be only as secure as a 128-bit key today. Pretty serious stuff, but years away from being practical.
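The "halve the key length" figure comes from Grover's algorithm, which searches a space of 2^k keys in roughly 2^(k/2) evaluations. A toy illustration of the arithmetic (my own sketch, not from the post):

```python
# Grover's quadratic speedup: brute force drops from 2^k to about
# 2^(k/2) evaluations, so a symmetric key's effective strength halves.
def effective_security_bits(key_bits):
    return key_bits // 2

for k in (128, 256):
    print(f"{k}-bit key -> ~{effective_security_bits(k)}-bit security vs. a quantum attacker")
```

Hence a 256-bit key against a quantum adversary is comparable to a 128-bit key against a classical one.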

Here’s a really good essay on the realities of quantum computing and its potential effects on public-key cryptography.

Posted on September 22, 2009 at 2:00 PM • 34 Comments


Craig September 22, 2009 2:40 PM

But in a quantum computer, wouldn’t the number be factored and unfactored at the same time?


Bill P. Godfrey September 22, 2009 3:31 PM

  1. Take 128 radioactive atoms.
  2. Wait that element’s half-life.
  3. Look at each atom. If it decayed write down 1. Otherwise, write down 0.

You now have a 128-bit binary number, and 2^128 copies of you in parallel universes each have a different number.

  1. Attempt to decrypt using your 128-bit number. One of you will have the right key. If it's you, celebrate now.

(I think I’ve misunderstood quantum computing somewhere along the way.)

Y10K bug September 22, 2009 3:56 PM

@Bill: Actually that’s pretty close! The quantum computer generates many solutions at once, and reading the result of the computation is actually a measurement which gives you only one solution out of many. In most cases, this solution doesn’t help you at all, so you have to try again. In the case of factorization you only have to try a polynomial number of times to get a “good” result – a result which reveals a prime factor, using ordinary computation.

bitmonger September 22, 2009 4:13 PM

Interesting summary of the state of quantum computers. I found it very interesting, but I wonder if choosing 4096-bit RSA is a fair way to estimate complexity.

Shor’s algorithm will likely be improved at some point, and if ECC keys really are denser, wouldn’t it be reasonable to assume some isomorphism should compress the problem to something closer to ECC density?

Basically, where does the 70 * k^3 estimate come from, and would the same scaling apply to common ECC methods?

Anon September 22, 2009 4:44 PM

@Bill P. Godfrey: That’s the Anathem version of quantum computing! 🙂

@bitmonger: Yes, elliptic curves could be affected. Look up [Shor’s algorithm elliptic curves]. Look up McEliece for a cryptosystem for which we haven’t found a good quantum algorithm yet. (Its downside: public keys are 64+ KB.)

The linked essay about generations of quantum computing is awfully interesting, but I don’t think we can be so certain quantum computers won’t matter. I wonder if Moore’s Law held true in the “prehistory” of conventional computing, pre-integrated circuit; we may be in the prehistory of quantum computing, before development takes off. And, of course, we could discover really basic reasons quantum computers or Shor’s algorithm can’t work as predicted, or we could discover more efficient quantum factoring algorithms.

Maybe the lesson is that when it comes to really long-term security, we have to remember that cryptography or computation could change radically in fifty years, not just march along in the way that’s easy for us to imagine. And maybe we shouldn’t rely on cryptography alone for that kind of security!

Clive Robinson September 22, 2009 6:50 PM

I shall go and make a nice hot cup from a strong Brownian motion generator and have a sip or two before powering up my Bambaweenie….

I hear the sound of the late Douglas Adams laughing gently at the improbability of it all 8)

tcliu September 22, 2009 6:57 PM

@Bill P.: Almost! The thing with quantum computers is that between steps 2 and 3, you’d perform some operations that increase the probability of the correct answer being in your universe.

So, yes, you almost got it.

Anon September 22, 2009 7:58 PM

Huh — IBM claimed to do the same thing in 2001. What’s new about this and some other recent efforts is that the scientists actually observed that there was quantum entanglement: we can be relatively sure they didn’t try to make a quantum computer but end up with a conventional one instead.

MHarlos September 22, 2009 10:28 PM

Some of my most sensitive data is encrypted using 15 as the RSA modulus. The reported breakthrough in quantum computing means that this data is no longer secure 🙁

But I’m going to erase these archives, after re-encrypting the data using 143 as my RSA modulus. It’ll be years before they can factor that monster!

Harry Johnston September 23, 2009 12:08 AM

It seems to me that for quantum algorithms you need to consider the number of qubits required as your measure of difficulty instead of (or as well as) the number of steps required.

So, apparently the qubit difficulty for Shor’s algorithm for factoring is 72*k^3; does anyone know what the qubit difficulty for the best known algorithm for discrete logarithms is?
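For what it's worth, plugging key sizes into a c·k^3 scaling like the ones quoted in this thread (70 or 72 times k cubed; exactly what the constant counts varies by source) gives order-of-magnitude numbers. Purely illustrative, with the constant as an assumption:

```python
def shor_cost_estimate(k, c=72):
    """Toy resource estimate using the c*k^3 scaling quoted in this
    thread. Whether c counts gates or qubit-operations differs between
    sources; treat results as order-of-magnitude only."""
    return c * k ** 3

for k in (512, 1024, 2048, 4096):
    print(f"k = {k}: ~{shor_cost_estimate(k):.1e} operations")
```

The point of the cubic scaling is that doubling the key size costs an attacker only about 8x more quantum work, unlike the exponential blowup classical factoring faces.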

Personally, I don’t believe that quantum error correction will actually work in the general (i.e., real world) case, which means quantum computers will always be limited to a fairly small number of qubits. YMMV.

Clive Robinson September 23, 2009 2:09 AM

@ Harry Johnston,

“Personally, I don’t believe that quantum error correction will actually work in the general (i.e., real world) case, which means quantum computers will always be limited”

Funny thing is, the quantum physics scientists pretty much all said it was a nice idea that wouldn’t work; then an engineer came along and showed them (error correction) that they were wrong.

The real difference between a scientist and an engineer is, scientists are always looking for problems (to “write up”), engineers however are inventing solutions to get on with the job…

It’s also the reason for the famous quote about improbable sounding ideas,

“If an elderly scientist says it’s possible then they are probably right. If however an elderly scientist says it’s not possible then they are probably wrong.”

Lawrence D'Oliveiro September 23, 2009 3:47 AM

Quantum computing will never offer computing power radically beyond that possible with other technologies, because it violates a fundamental principle of the universe, namely that you cannot get “something for nothing”. And if constructing exponential numbers of computing elements from linear numbers of components is not “something for nothing”, then what is?

Greg September 23, 2009 4:07 AM

So my physics department was working on this. There are some quite big limitations even if you have engineers on your team 😉

The first is that a quantum computer is exponentially hard to build in the number of qubits, even with error correction. In fact the requirement for error-correction bits makes it even worse, i.e. O(2^(n^2)) rather than just O(2^n). There is nothing you can do about that, and many physicists think there is probably even a finite upper limit to the number of entangled qubits you can have.

The next problem is that an n-qubit quantum computer cannot simulate a 2n-qubit, or even an (n+1)-qubit, computer in polynomial time. This is in contrast to easily doing 1000+ bit arithmetic on a 64-bit computer. So if you have a 1025-bit prime number, a 1024-qubit quantum computer won’t help. Add error correction and you may even need 3,000–4,000 qubits.

And 15 is a 4 bit number…

I am far more concerned about more direct classical progress at factoring large numbers. We have seen huge improvements in recent decades. Combined with faster computers, we can factor or solve discrete logs far better than most would have predicted 40 years ago, I think.

Clive, The quote from http://www.quotesandsayings.com/quotes/scientists/ reads

If an elderly but distinguished scientist says that something is possible he is almost certainly right, but if he says that it is impossible he is very probably wrong.
–Arthur C. Clarke

Note the very probably wrong….

greg September 23, 2009 4:15 AM

@Lawrence D’Oliveiro

There is no something for nothing.

Computational complexity is based on some model of computation. That’s currently based, somewhat loosely, on a Turing machine.

This does not mean that there are not other computational models that give different complexity results. Optical computers tried to use this, since the physics lets them do some complicated math functions very easily.

And also remember that we don’t even know whether factoring is in the higher complexity classes, though most think it is.

Fridz September 23, 2009 6:44 AM

“You cannot get something for nothing”
– Lawrence D’Oliveiro September 23, 2009 3:47 AM

“Heavier-than-air flying machines are impossible”
– Lord Kelvin (1895)


Andrew September 23, 2009 10:16 PM

Actually Bruce, this is important, but not for the reasons one may think. The Bristol group have done an exact repeat of an experiment done two years ago by two different groups in Australia & China (see Physical Review Letters 99, articles 250504 & 250505 [2007].)

What is new is that one part of the experiment – the optical circuit – has been miniaturized, but the other equally important parts – the photon sources and detectors – have not.

What is not new are the results. Indeed, they are actually worse than those of two years ago, in that the Bristol results could have been achieved with noisy photons (technically “mixed states”). The earlier experiments actually measured the amount of entanglement, a strictly quantum phenomenon, to show they had a quantum result. The Bristol team did not do this.

Lawrence D'Oliveiro September 23, 2009 11:33 PM

Ah, I see the cargo-cultists have arrived. One lot of techno-mumbo-jumbo not convincing enough? Just add more techno-mumbo-jumbo. So you think quantum computers represent some magical new computation model that is not subject to the standard constraints of computability theory? That they somehow manage an end-run around the Church-Turing Thesis?

Dream on.

Gamma Ray September 24, 2009 1:10 AM

@Lawrence D’Oliveiro:

“Now all your sinners
This is the prophecy
The revelations of your own destiny
Sleep well and dream on
The dream that you have sold
And now my brothers
This world is slowly getting cold…”


Steve McKay September 24, 2009 2:28 AM

@Lawrence D’Oliveiro:

Did someone claim to have a quantum computing algorithm for the Busy Beaver problem? Complexity is not computability.

greg September 24, 2009 3:14 AM

@Lawrence D’Oliveiro

Computability has nothing to do with complexity. Clearly factoring is computable. What we are talking about has nothing to do with whether something is computable or not.

As for complexity, here is an example with optical computers. I have a problem that runs in O(n · n ln n) time, since I need to do on the order of n FFT/IFFTs. Optical computers (in a fancy way) can do the FT in constant time when built that way. So the complexity goes down to O(n) rather than O(n^2 ln n). If n is big, this can be quite a speedup.

The same thing can be said for normal complexity on plain computers. Assumptions matter. Many O() results may only count arithmetic operations, while these days getting a number from memory can in fact be much slower. Also, what is n? Factoring with \rho factorization is O(sqrt(n)) IIRC, but that’s where n is the number itself, not the number of digits that cryptographers may use; measured in digits it becomes exponential.
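The rho factorization mentioned above is easy to sketch. Its expected cost grows with a root of the number n itself, which is exponential in the bit length — exactly the distinction about what "n" means. A standard textbook version (my own sketch, nothing from the thread):

```python
import random
from math import gcd

def pollard_rho(n):
    """Pollard's rho with Floyd cycle-finding. Expected cost grows
    with a root of n itself, i.e. exponentially in n's bit length."""
    if n % 2 == 0:
        return 2
    while True:
        x = y = random.randrange(2, n)
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n      # tortoise: one step
            y = (y * y + c) % n      # hare: two steps
            y = (y * y + c) % n
            d = gcd(abs(x - y), n)
        if d != n:                   # d == n means a bad cycle; retry
            return d

print(pollard_rho(143))  # prints a nontrivial factor: 11 or 13
```

Fine for toy moduli like 143; hopeless for 1024-bit RSA numbers, which is the whole point.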

I could go on. Also, as I said before, we don’t even know whether factorization is in P or not.

Anyway, the wiki entry is not always the most accurate info, but a good place to start.


Clive Robinson October 17, 2009 2:41 PM

@ DoktorThomas,

“Lord Kelvin (1895) missed birds? Seems hard to believe…”

We are talking Victorian Britain, where Charles Darwin was too frightened to publish his ideas on evolution because what we now call the “creationist” viewpoint was the dominant one, and where science and politics trod timidly in the shadow of the Church.

Being a Presbyterian, his viewpoint would probably have been “birds are creatures of God” and “machines the work of man”, and the two were incomparable due to their respective creators.

As a matter of historical fact, man had made several “heavier than air” machines that had flown, as well as the “lighter than air” balloons etc., prior to Lord Kelvin’s (William Thomson’s) pronouncement.

He even repeated his pronouncement in 1902, saying he doubted they would be of practical significance. This was just five years before his death and, as we now know, within a decade of aeroplanes becoming practical and shortly thereafter changing the face of war forever.

This was not the first time William Thomson had made an incorrect pronouncement: he originally pronounced X-rays to be impossible, and just a short while later had it proved to him, as well as later having his own hand X-rayed.

These pronouncements from a man so venerable as to be considered the “father of physics” might well have been the thing that prompted the Arthur C. Clarke comment 😉

Pepe October 22, 2009 4:14 PM

Don’t worry, friends, we will very soon be back to using OTP encryption as the only secure choice!
Good luck to everyone

Paul October 22, 2009 4:31 PM


I look forward to your solution to the key distribution problem. If you have a secure channel to convey a secret key of n bits, why not just use that channel to convey the secret message it would otherwise encrypt?
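Paul's objection is easy to see in code: a one-time pad needs a truly random key exactly as long as the message, so the secure channel that delivers the key could have delivered the message instead. A minimal sketch (my own illustration, using Python's standard library):

```python
import secrets

# One-time pad: XOR the message with a random key of equal length.
msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))                     # key as long as the message
ciphertext = bytes(m ^ k for m, k in zip(msg, key))     # encrypt
recovered  = bytes(c ^ k for c, k in zip(ciphertext, key))  # decrypt
print(recovered)  # → b'attack at dawn'
```

The pad is information-theoretically secure, but only if the key is never reused and is distributed securely — which is the unsolved part.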

Mark Green February 1, 2010 10:32 PM

If quantum computing is based on multi-dimensionality, doesn’t there have to be a whole bunch of universes out there where the computer gets the answer wrong?
