## Why Is Quantum Computing So Hard?

Blog post (and two papers) by Ross Anderson and Robert Brady. News article.

Note that I do not have the physics to evaluate these claims.

Posted on February 6, 2013 at 12:21 PM • 16 Comments


Eitan Adler • February 6, 2013 12:33 PM

Related blog post from Scott Aaronson

http://www.scottaaronson.com/blog/?p=1255

"Yeah, this paper is pretty uninformed even by the usual standards of attempted quantum-mechanics-overthrowings. Let me now offer three more general thoughts."

Thomas Ferraro • February 6, 2013 1:07 PM

So quantum computing is hard because quantum mechanics is wrong? Hmmm...

Brian • February 6, 2013 1:41 PM

@ Thomas

Not so much that quantum mechanics is wrong, but doing calculations and simulations runs into huge limitations. The wavefunction describes the system and must obey several postulates (it must be an eigenfunction, etc.). In the simplest terms, we can solve the hydrogen atom and the He+ ion exactly. Once you toss in two electrons, we have no way of explicitly solving the electron-electron repulsion (the three-body problem). So theorists get creative with their models and build force fields and other best approximations. Even for something as simple as table sugar, coupled cluster theory, which gives excellent agreement with observed values, is just too expensive. What I am trying to say is: quantum mechanics is hard, quantum mechanics is real based on the physical observables, but our ability to simulate it is poor.
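The poor scaling the comment describes is easy to see in a back-of-the-envelope sketch (my own illustration, not from the comment): storing an exact state vector of n two-level systems takes 2^n complex amplitudes, so memory blows up exponentially long before you reach chemically interesting sizes.

```python
# Memory needed to hold an exact state vector of n two-level systems:
# 2**n complex amplitudes at 16 bytes (complex128) each.
def statevector_bytes(n: int) -> int:
    """Bytes needed for 2**n complex128 amplitudes."""
    return (2 ** n) * 16

for n in (10, 30, 50):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} two-level systems -> {gib:.3g} GiB")
```

At 30 two-level systems you already need 16 GiB; at 50 you need tens of millions of GiB, which is why exact simulation gives way to approximations like force fields and coupled cluster.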

Gweihir • February 6, 2013 2:10 PM

On a related note, Quantum Mechanics may still be wrong, or maybe just not precise enough ;-)

There is a huge difference between doing experiments on pairs of photons and suddenly wanting to do what is basically analog computation with a precision of 1:10^600 (e.g., for 2048-bit RSA). Conventional analog computers get no more than 20 bits of resolution, and only with extreme measures. Also, quantum computations cannot be divided into sub-problems; you always have to do the core algorithm in one gigantic step. If you have a 2048-bit problem, a 2047-bit quantum computer is useless.
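The 1:10^600 figure in the comment can be checked with a one-line estimate (my own sketch of the commenter's arithmetic): distinguishing all 2048-bit values naively would require roughly 2^2048 resolvable levels.

```python
import math

# Naive "analog precision" needed to distinguish all 2048-bit values:
# 2**2048 levels, expressed as a power of ten.
bits = 2048
digits = bits * math.log10(2)
print(f"2^{bits} distinct values ~ 10^{digits:.0f}")
```

This comes out to about 10^617, consistent with the comment's rough 10^600. (Whether quantum computing actually needs this precision is disputed; see the later comment citing the threshold theorem.)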

I predict this hype will be over in a few years.

Bob T • February 6, 2013 2:21 PM

The problem with quantum computing is that you can't observe what you're computing without changing the result.

Bob T • February 6, 2013 2:24 PM

Schrödinger received the blue screen of death. Schrödinger did not receive the blue screen of death.

Alan Bostick • February 6, 2013 3:45 PM

Bruce, why are you giving airtime to this nonsense? Ross Anderson may well have excellent computer security credentials, but his physics is sheer crackpottery.

Mitch P. • February 6, 2013 3:51 PM

I suspect that significant quantum computing is like fusion based energy production... it is 20 years in the future and always will be. :-)

Nick P • February 6, 2013 8:46 PM

I'm with Mitch on quantum computing. It's a nice dream and funding source. Nothing more so far and probably nothing for a while.

Winter • February 7, 2013 2:47 AM

I would say that quantum computing is possible in principle, for the simple reason that quantum interactions do everything you want from a computation. Whether it is practical is something else.

The problem, as I see it, is that you cannot simulate a quantum system on a non-quantum computer. So any result of your simulation will be an approximation.

Too bad that entanglements are destroyed in the margins of any approximation.

Nick Nolan • February 7, 2013 9:46 AM

I skimmed through the paper and it looks like crackpottery.

It looks like two enthusiastic amateurs wrote a paper on quantum physics.

BobD • February 7, 2013 12:27 PM

Has anyone heard about the analog computers DARPA is working on? Can they run Shor's algorithm, or is that not true?

Clive Robinson • February 7, 2013 3:26 PM

It appears that for some strange (spooky ;) reason quantum physics has hit the news twice in one week.

Prof. Brian Cox had a program on BBC television titled "A Night With the Stars" where he made some statements about the Pauli exclusion principle, indicating that rubbing his hands on a (roughly £1 million) diamond would add energy to the electrons in the carbon atoms instantaneously, causing them to change energy state, and that the PEP would cause this to happen to all the other electrons in the universe...

To find out more Google "Pauli Exclusion Principle Brian Cox".

Brian • February 8, 2013 1:57 PM

@ Clive

I watched your video. I do not think this person has a clue about the Pauli exclusion principle. First, the principle states that no two identical fermions may occupy the same quantum state at the same time. Consider a neon atom: 10 electrons. Luckily we have 5 different orbitals to put them in (5 different sets of spatial quantum numbers). Every orbital can now hold 2 electrons, but then those two electrons would share all the same quantum numbers... This is where Pauli stepped in and said: let's add another quantum number, call it spin. So we have spin +1/2 and spin -1/2; problem solved. Remember that these quantum numbers are our attempt to describe nature.

So what if we take two neon atoms? Say the 1s, spin +1/2 electrons of both then have all the same quantum numbers! Well, kind of. The problem is that they are not in the same place, which means they are not sharing the same quantum state, since their positions differ... any undergraduate physics or chemistry QM class will teach you this.

On a side note, as he "warms up" the diamond in his hand and the electrons supposedly jump to higher energy levels... well, he is adding energy, yes, in the form of vibrations. If the electronic energy levels were changing, the optical properties of the diamond would change; spectroscopy 101. I hate it when people use half-correct science to bamboozle non-science audiences.
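The neon bookkeeping in the comment can be written out explicitly. This is a hypothetical illustration of my own (standard aufbau filling of 1s, 2s, 2p): each of the ten electrons gets a distinct tuple (n, l, m_l, m_s), which is all the Pauli exclusion principle requires.

```python
# Enumerate quantum-number tuples (n, l, m_l, m_s) for neon's 10 electrons,
# filling 1s, 2s, then 2p. Spin doubles the capacity of each spatial orbital.
def neon_configuration():
    states = []
    for n, l in [(1, 0), (2, 0), (2, 1)]:  # 1s, 2s, 2p subshells
        for m_l in range(-l, l + 1):        # one spatial orbital per m_l
            for m_s in (+0.5, -0.5):        # Pauli's extra quantum number
                states.append((n, l, m_l, m_s))
    return states

states = neon_configuration()
print(len(states), len(set(states)))  # 10 electrons, all tuples distinct
```

Five spatial orbitals (1s, 2s, 2px, 2py, 2pz) times two spins gives exactly ten distinct states, matching the comment's count.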

Clive Robinson • February 9, 2013 2:46 AM

@ Brian,

Yup, and if it was anyone other than Prof Brian Cox we would just laugh politely and ignore it.

But amongst his many other activities Prof Cox is also involved with CERN and the LHC, where he is responsible for parts of ATLAS and worked on the R&D behind FP420.

Well his response to the argument is read the following,

http://www.hep.manchester.ac.uk/u/forshaw/BoseFermi/Double%20Well.html

(and a book he has coauthored with Jeff Forshaw who wrote the above web page).

And having read it, it is consistent with the two main arguments he presented in the earlier part of the show (the double-slit experiment and the probability of the diamond tunnelling out of the box).

In a grossly oversimplified way (I can hear the roar of the flames already ;-), what he appears to be saying harks back to the wave-versus-particle argument: on the wave side there is no locality for particles that are in some way connected.

What is not clear is how the connection between all similar particles in the universe was established (except perhaps at the time of the Big Bang).

Now the little I did on quantum mechanics was taught to me when Brian Cox was probably first getting to grips with Carl Sagan's more popular ideas, and thus can be considered a little prehistoric compared to modern QM ideas.

That said, when describing the wave effect of the double-slit experiment, he mentioned Richard Feynman and his thoughts and work relating to it, and yes, from what I remember, Feynman likewise argued against local effects.

Now IF (and some consider it a big if) Feynman, Cox and Forshaw are correct in their view, then yes, I can see why the security of quantum crypto can be viewed as standing on shaky ground, and likewise why quantum computing won't be of as much use as some would hope.

Look at it this way: if every electron can and does affect all other electrons in the universe because they are in effect connected (likewise other qubit candidates), then even if each effect is too small to measure individually, it contributes a tiny part to the overall signal that is effectively random, and thus equivalent to a noise barrier beyond which you cannot go in finite time. So while a quantum computer will calculate the result you want, it will be buried in the noise, and to get it out you will have to perform one of a number of statistical procedures that all require you to repeat the calculation over and over, or run multiple parallel computers (but...). Either way, the result only improves with the square root of the number of computations...
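The square-root law invoked here is just the statistics of averaging independent noisy readings. A minimal simulation sketch (my own, with made-up signal and noise parameters) shows the RMS error of an N-sample average shrinking like 1/sqrt(N):

```python
import random
import statistics

# RMS error of averaging n noisy readings of a known signal, estimated
# over many independent trials.
def mean_error(signal: float, noise_sd: float, n: int, trials: int = 500) -> float:
    rng = random.Random(42)  # fixed seed for reproducibility
    sq_errs = []
    for _ in range(trials):
        avg = sum(signal + rng.gauss(0, noise_sd) for _ in range(n)) / n
        sq_errs.append((avg - signal) ** 2)
    return statistics.mean(sq_errs) ** 0.5

e_small = mean_error(1.0, 0.5, 100)
e_large = mean_error(1.0, 0.5, 2500)
print(round(e_small / e_large, 1))  # close to sqrt(2500/100) = 5
```

Going from 100 to 2500 repetitions (25x the work) only cuts the error by about a factor of 5, which is the comment's point about diminishing returns against a noise floor.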

Oddly, we know this from other analog computers, where even with the best circuit design you were very, very lucky to get results reliable to one part in a million.

It looks like I'm going to have to get the Cox and Forshaw book and dig out my old Feynman texts and have a thoughtful read.

Danishcrows • February 13, 2013 2:36 AM

@Gweihir

Quantum computing is emphatically *not* basically analog computation. People have been promising an impossibility proof of quantum computing based on analog computing results since the 1990s, but not one has appeared, and the reason they haven't appeared is due to two uniquely quantum phenomena - entanglement and the measurement postulate. These allow the development of fault-tolerant techniques and the establishment of the threshold theorem.

Knill's paper *Quantum computing with realistically noisy devices* shows that (in your language) a 7-bit resolution would be sufficient.
