Yup, and if it were anyone other than Prof Brian Cox, we would just laugh politely and ignore it.
But amongst his many other activities, Prof Cox is also involved with CERN and the LHC, where he is responsible for parts of ATLAS and worked on the R&D behind FP420.
Well, his response to the argument is to read the following (and a book he has co-authored with Jeff Forshaw, who wrote the web page above).
Having read it, I find it consistent with the two main arguments he presented in the earlier part of the show (the double-slit experiment and the probability of the diamond tunnelling out of its box).
To put it in a grossly over-simplified way (I can hear the roar of the flames already ;-), what he appears to be saying harks back to the wave-versus-particle argument, and that on the wave side there is no locality for particles that are in some way connected.
What is not clear is how the connection between all similar particles in the universe was established (except perhaps at the time of the Big Bang).
Now, the little quantum mechanics I was taught dates from when Brian Cox was probably first getting to grips with Carl Sagan's more popular ideas, and thus it can be considered a little prehistoric compared to modern QM ideas.
That said, when describing the wave effects in the double-slit experiment he mentioned Richard Feynman and his thoughts and work relating to it, and yes, from what I remember Feynman likewise argued against local effects.
Now IF (and some consider it a big if) Feynman, Cox and Forshaw are correct in their view, then yes, I can see why the security of Quantum Crypto can be viewed as standing on shaky ground, and likewise why it won't be of as much use as some would hope.
Look at it this way: if every electron can and does affect all other electrons in the universe because they are in effect connected (likewise other qubit candidates), then even if the effect is too small to measure individually, it will contribute a tiny part of the overall signal that is effectively random, and thus equivalent to a noise floor beyond which you cannot go in a finite time. Thus, whilst Quantum Computing will calculate the result you want, the result will be buried in the noise, and to get it out you will have to perform one of a number of statistical procedures that all require you to repeat the calculation over and over, or to have multiple parallel computers (but...). Either way the result only improves with the square root of the number of computations...
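To make that square-root claim concrete, here is a minimal sketch (my own illustration, not anything from Cox or Forshaw, with a made-up "true result" and noise level) of averaging N repetitions of a noisy computation: the error of the mean shrinks only as 1/sqrt(N), so each extra decimal digit of precision costs roughly 100x the runs.

```python
# Hypothetical demonstration: averaging N noisy repetitions improves the
# estimate only as 1/sqrt(N). TRUE_RESULT and NOISE_SD are invented values.
import random
import statistics

random.seed(42)  # reproducible runs

TRUE_RESULT = 1.0  # the value our hypothetical computation should return
NOISE_SD = 0.5     # standard deviation of the random "background" noise

def noisy_run():
    """One repetition of the computation, corrupted by Gaussian noise."""
    return TRUE_RESULT + random.gauss(0.0, NOISE_SD)

errors = []
for n in (100, 10_000, 1_000_000):
    estimate = statistics.fmean(noisy_run() for _ in range(n))
    error = abs(estimate - TRUE_RESULT)
    errors.append(error)
    # The expected error of the mean is NOISE_SD / sqrt(n):
    # roughly 0.05, 0.005 and 0.0005 for these three values of n.
    print(f"n={n:>9}: estimate={estimate:.5f}  error={error:.5f}")
```

Note that going from 100 runs to 1,000,000 runs (10,000x the work) only buys about two more decimal digits, which is exactly the square-root wall described above.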
Oddly, we know this from other analogue computers, where even with the best circuit design you were very, very lucky to get results reliable to one part in a million.
It looks like I'm going to have to get the Cox and Forshaw book and dig out my old Feynman texts and have a thoughtful read.