Quantum Computing and Cryptography

Quantum computing is a new way of computing -- one that could allow humankind to perform computations that are simply impossible using today's computing technologies. It allows for very fast searching, something that would break some of the encryption algorithms we use today. And it allows us to easily factor large numbers, something that would break the RSA cryptosystem for any key length.

This is why cryptographers are hard at work designing and analyzing "quantum-resistant" public-key algorithms. Currently, quantum computing is too nascent for cryptographers to be sure of what is secure and what isn't. But even assuming aliens have developed the technology to its full potential, quantum computing doesn't spell the end of the world for cryptography. Symmetric cryptography is easy to make quantum-resistant, and we're working on quantum-resistant public-key algorithms. If public-key cryptography ends up being a temporary anomaly based on our mathematical knowledge and computational ability, we'll still survive. And if some inconceivable alien technology can break all of cryptography, we still can have secrecy based on information theory -- albeit with significant loss of capability.

At its core, cryptography relies on the mathematical quirk that some things are easier to do than to undo. Just as it's easier to smash a plate than to glue all the pieces back together, it's much easier to multiply two prime numbers together to obtain one large number than it is to factor that large number back into two prime numbers. Asymmetries of this kind -- one-way functions and trap-door one-way functions -- underlie all of cryptography.
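The multiply-versus-factor asymmetry is easy to see even at toy scale. A minimal Python sketch (with primes far too small to be secure): the forward direction is a single multiplication, while the reverse is a brute-force search.

```python
# Toy illustration of a one-way function: multiplying two primes is one cheap
# operation, while recovering them by trial division is a long search.
# These primes are far too small for real cryptography.
p, q = 1000003, 1000033
n = p * q                      # the "easy" direction: one multiplication

def factor_by_trial_division(n):
    """The "hard" direction: cost grows with the size of the smaller factor."""
    d = 3
    while d * d <= n:          # only need to search up to sqrt(n)
        if n % d == 0:
            return d, n // d
        d += 2                 # n is odd, so skip even candidates
    return None                # n is prime

assert factor_by_trial_division(n) == (p, q)
```

At real key sizes the search side of this asymmetry blows up far beyond anything trial division hints at, which is the whole point.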

To encrypt a message, we combine it with a key to form ciphertext. Without the key, reversing the process is more difficult. Not just a little more difficult, but astronomically more difficult. Modern encryption algorithms are so fast that they can secure your entire hard drive without any noticeable slowdown, but that encryption can't be broken before the heat death of the universe.
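As a sketch of "combine it with a key" -- a toy keystream construction for illustration only, not a vetted cipher -- consider XORing the message with a keyed stream:

```python
# Toy sketch of combining a message with a key: SHA-256 in counter mode
# generates a keystream, which is XORed with the message. This illustrates
# the idea only -- real systems use vetted ciphers such as AES.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_with_keystream(key: bytes, data: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

ciphertext = xor_with_keystream(b"secret key", b"attack at dawn")
# XOR is its own inverse, so applying the same operation decrypts:
assert xor_with_keystream(b"secret key", ciphertext) == b"attack at dawn"
```

With the key, both directions are cheap; without it, the attacker is back to searching the key space.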

With symmetric cryptography -- the kind used to encrypt messages, files, and drives -- that imbalance is exponential, and is amplified as the keys get larger. Adding one bit of key increases the complexity of encryption by less than a percent (I'm hand-waving here) but doubles the cost to break. So a 256-bit key might seem only twice as complex as a 128-bit key, but (with our current knowledge of mathematics) it's 340,282,366,920,938,463,463,374,607,431,768,211,456 times harder to break.
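That number is just 2^(256-128), and the arithmetic is easy to check:

```python
# Brute-force cost scales as 2**bits: each added key bit doubles the work.
def brute_force_work(bits: int) -> int:
    return 2 ** bits

ratio = brute_force_work(256) // brute_force_work(128)
assert ratio == 2 ** 128
assert ratio == 340282366920938463463374607431768211456
```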

Public-key encryption (used primarily for key exchange) and digital signatures are more complicated. Because they rely on hard mathematical problems like factoring, there are more potential tricks to reverse them. So you'll see key lengths of 2,048 bits for RSA, and 384 bits for algorithms based on elliptic curves. Here again, though, the costs to reverse the algorithms with these key lengths are beyond the current reach of humankind.

This one-wayness is based on our mathematical knowledge. When you hear about a cryptographer "breaking" an algorithm, what happened is that they've found a new trick that makes reversing easier. Cryptographers discover new tricks all the time, which is why we tend to use key lengths that are longer than strictly necessary. This is true for both symmetric and public-key algorithms; we're trying to future-proof them.

Quantum computers promise to upend a lot of this. Because of the way they work, they excel at the sorts of computations necessary to reverse these one-way functions. For symmetric cryptography, this isn't too bad. Grover's algorithm shows that a quantum computer speeds up these attacks to effectively halve the key length. This would mean that a 256-bit key is as strong against a quantum computer as a 128-bit key is against a conventional computer; both are secure for the foreseeable future.
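In cost terms, Grover's speedup looks like this (a back-of-the-envelope sketch, ignoring constant factors):

```python
# Grover's algorithm finds a key among N = 2**n candidates in roughly
# sqrt(N) = 2**(n/2) quantum steps, effectively halving the key length.
def classical_search_cost(bits: int) -> int:
    return 2 ** bits

def grover_search_cost(bits: int) -> int:
    return 2 ** (bits // 2)    # constant factors omitted

# A 256-bit key against Grover ~ a 128-bit key against classical search:
assert grover_search_cost(256) == classical_search_cost(128)
```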

For public-key cryptography, the results are more dire. Shor's algorithm can easily break all of the commonly used public-key algorithms based on both factoring and the discrete logarithm problem. Doubling the key length increases the difficulty to break by a factor of eight. That's not enough of a sustainable edge.
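The factor of eight comes from Shor's running time being roughly cubic in the key length (assuming schoolbook arithmetic; constants and low-order terms ignored):

```python
# Shor's algorithm runs in time roughly cubic in the key length n (with
# schoolbook arithmetic), so doubling n multiplies the attacker's work by ~8.
def shor_work(bits: int) -> int:
    return bits ** 3           # constant factors omitted

assert shor_work(4096) // shor_work(2048) == 8

# Compare with classical brute force, where the defender's edge is
# exponential: doubling a key length squares the work, it doesn't just
# multiply it by a small constant.
assert 2 ** 4096 // 2 ** 2048 == 2 ** 2048
```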

There are a lot of caveats to those two paragraphs, the biggest of which is that quantum computers capable of doing anything like this don't currently exist, and no one knows when -- or even if -- we'll be able to build one. We also don't know what sorts of practical difficulties will arise when we try to implement Grover's or Shor's algorithms for anything but toy key sizes. (Error correction on a quantum computer could easily be an insurmountable problem.) On the other hand, we don't know what other techniques will be discovered once people start working with actual quantum computers. My bet is that we will overcome the engineering challenges, and that there will be many advances and new techniques -- but they're going to take time to discover and invent. Just as it took decades for us to get supercomputers in our pockets, it will take decades to work through all the engineering problems necessary to build large-enough quantum computers.

In the short term, cryptographers are putting considerable effort into designing and analyzing quantum-resistant algorithms, and those are likely to remain secure for decades. This is a necessarily slow process, as both good cryptanalysis and transitioning standards take time. Luckily, we have time. Practical quantum computing seems to always remain "ten years in the future," which means no one has any idea.

After that, though, there is always the possibility that those algorithms will fall to aliens with better quantum techniques. I am less worried about symmetric cryptography, where Grover's algorithm is basically an upper limit on quantum improvements, than I am about public-key algorithms based on number theory, which feel more fragile. It's possible that quantum computers will someday break all of them, even those that today are quantum resistant.

If that happens, we will face a world without strong public-key cryptography. That would be a huge blow to security and would break a lot of stuff we currently do, but we could adapt. In the 1980s, Kerberos was an all-symmetric authentication and encryption system. More recently, the GSM cellular standard does both authentication and key distribution -- at scale -- with only symmetric cryptography. Yes, those systems have centralized points of trust and failure, but it's possible to design other systems that use both secret splitting and secret sharing to minimize that risk. (Imagine that a pair of communicants get a piece of their session key from each of five different key servers.) The ubiquity of communications also makes things easier today. We can use out-of-band protocols where, for example, your phone helps you create a key for your computer. We can use in-person registration for added security, maybe at the store where you buy your smartphone or initialize your Internet service. Advances in hardware may also help to secure keys in this world. I'm not trying to design anything here, only to point out that there are many design possibilities. We know that cryptography is all about trust, and we have a lot more techniques to manage trust than we did in the early years of the Internet. Some important properties like forward secrecy will be blunted and far more complex, but as long as symmetric cryptography still works, we'll still have security.
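The five-key-server idea above can be sketched with plain XOR secret splitting (a toy sketch; the function names are mine, and this is all-or-nothing splitting rather than a threshold scheme like Shamir's):

```python
# Sketch of the five-server idea: the session key is the XOR of one share
# from each server, so no proper subset of servers learns anything about it.
# Purely symmetric -- no public-key mathematics involved.
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n_servers: int = 5) -> list:
    # n-1 shares are uniformly random; the last one completes the XOR.
    shares = [secrets.token_bytes(len(key)) for _ in range(n_servers - 1)]
    shares.append(reduce(xor_bytes, shares, key))
    return shares

def recombine(shares: list) -> bytes:
    return reduce(xor_bytes, shares)

session_key = secrets.token_bytes(16)
assert recombine(split_key(session_key)) == session_key
```

Any four shares together are still uniformly random, so compromising four of the five servers reveals nothing.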

It's a weird future. Maybe the whole idea of number-theory-based encryption, which is what our modern public-key systems are, is a temporary detour based on our incomplete model of computing. Now that our model has expanded to include quantum computing, we might end up back to where we were in the late 1970s and early 1980s: symmetric cryptography, code-based cryptography, Merkle hash signatures. That would be both amusing and ironic.

Yes, I know that quantum key distribution is a potential replacement for public-key cryptography. But come on -- does anyone expect a system that requires specialized communications hardware and cables to be useful for anything but niche applications? The future is mobile, always-on, embedded computing devices. Any security for those will necessarily be software only.

There's one more future scenario to consider, one that doesn't require a quantum computer. While there are several mathematical theories that underpin the one-wayness we use in cryptography, proving the validity of those theories is in fact one of the great open problems in computer science. Just as it is possible for a smart cryptographer to find a new trick that makes it easier to break a particular algorithm, we might imagine aliens with sufficient mathematical theory to break all encryption algorithms. To us, today, this is ridiculous. Public-key cryptography is all number theory, and potentially vulnerable to more mathematically inclined aliens. Symmetric cryptography is so much nonlinear muddle, so easy to make more complex, and so easy to increase key length, that this future is unimaginable. Consider an AES variant with a 512-bit block and key size, and 128 rounds. Unless mathematics is fundamentally different than our current understanding, that'll be secure until computers are made of something other than matter and occupy something other than space.

But if the unimaginable happens, that would leave us with cryptography based solely on information theory: one-time pads and their variants. This would be a huge blow to security. One-time pads might be theoretically secure, but in practical terms they are unusable for anything other than specialized niche applications. Today, only crackpots try to build general-use systems based on one-time pads -- and cryptographers laugh at them, because they replace algorithm design problems (easy) with key management and physical security problems (much, much harder). In our alien-ridden science-fiction future, we might have nothing else.
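For concreteness, a one-time pad is little more than this; the code is trivial, and everything hard lives in generating, distributing, and never reusing the pad:

```python
# A minimal one-time pad: XOR the message with truly random pad material of
# equal (or greater) length, used exactly once. Information-theoretically
# secure; the key management is the hard part, not the algorithm.
import secrets

def one_time_pad(pad: bytes, message: bytes) -> bytes:
    if len(pad) < len(message):
        raise ValueError("pad must be at least as long as the message")
    return bytes(m ^ p for m, p in zip(message, pad))

pad = secrets.token_bytes(32)
ciphertext = one_time_pad(pad, b"meet at midnight")
assert one_time_pad(pad, ciphertext) == b"meet at midnight"  # XOR undoes itself
```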

Against these godlike aliens, cryptography will be the only technology we can be sure of. Our nukes might refuse to detonate and our fighter jets might fall out of the sky, but we will still be able to communicate securely using one-time pads. There's an optimism in that.

This essay originally appeared in IEEE Security & Privacy.

Posted on September 14, 2018 at 6:15 AM • 37 Comments

Comments

me • September 14, 2018 8:12 AM

@wooglyboogly
lmao for aliens

If I remember correctly, ZRTP (Linux encrypted voice calls), made by Philip Zimmermann, is quantum resistant.

Clive Robinson • September 14, 2018 11:56 AM

@ Bruce and the usual suspects,

You look away, blink twice, and somebody tugs at your coat tails and says,

    Hey Mister, you want a 128qbit quantum computer chip? We get some in real soon.

Yup, you read that right: 128 qbits on a chip,

https://medium.com/rigetti/the-rigetti-128-qubit-chip-and-what-it-means-for-quantum-df757d1b71ea

Though what you can do with it is far from clear for various reasons. @Bruce mentions one of them above with,

    Error correction on a quantum computer could easily be an insurmountable problem

But also says,

    My bet is that we will overcome the engineering challenges, and that there will be many advances and new techniques -- but they're going to take time to discover and invent.

One aspect of the errors is something called "noise". One of the reasons we can have 128-bit conventional computers is that each bit is either a one or a zero, and thus designing each bit to reject noise, whilst not trivial, is comparatively easy. This is not the same with "Quantum Bits" or "qbits": for various reasons each qbit can hold multiple values, thus they have a much, much greater susceptibility to noise. This problem is also multiplied by the way the qbits interact with each other, so getting the noise problem down is not just a much more difficult problem than putting more qbits on a chip, it's one that rises exponentially with the number of bits under various circumstances. One of which is temperature. Thus, as far as we can see for times to come, Quantum Computing is also going to involve cryogenics of some form, which is itself an awkward niche in both science and technology.

But one that there may be a solution for, based on a hundred-and-fifty-year-old "thought experiment" by James Clerk Maxwell in 1867[1]. Put simply, he imagined two containers with a common barrier. In this common barrier he put a "demon" whose job it is to operate a trap door. If a fast-moving molecule or atom comes towards the trap from the left chamber, it's let through into the right. Likewise, if a slow-moving molecule or atom in the right chamber comes towards the trap, it's let through to the left chamber. As the temperature of each chamber depends on the average speed of the molecules or atoms, it can be seen that, contrary to what the second law of thermodynamics implies, the left chamber will get colder whilst the right will get hotter.

Well, a few years ago it was proved that the "Maxwell's Demon" thought experiment was flawed and that the second law of thermodynamics did not apply in the way originally thought. So people started practical experiments to implement Maxwell's Demon, with not particularly encouraging results...

That is, until last year, when Prof. David Weiss at Penn State appears to have cracked it right open,

https://www.nature.com/articles/s41586-018-0458-7

http://science.psu.edu/news-and-events/2018-news/Weiss9-2018

Which whilst it is good news for the Prof and his team, it's not necessarily good news for the rest of us.

As I've mentioned in the past "technology is agnostic to use" and that it is "the directing mind that decides how to use it". Further that the decision as to if "the use is for good or bad" depends very much on "The point of view of the observer".

It's clear to most technologists that the Penn State results will be applicable to reducing noise in qbits, but how much improvement they could bring will come down entirely to practical experiments.

But there is another problem with Quantum Computers: you cannot get the results out of them without a lot of complications. Not only do these affect the "error" issues, they also require the use of specialised conventional computing hardware.

Put --over simply-- Quantum Computers need conventional computers to get out the results, which tends to slow things down a lot.

We already know of various ways of reducing "random noise" from certain types of signals via various DSP techniques. One such is to align a synthetic signal with the noisy signal in an adjustable matched filter. I won't go into the mathematics, but it is used in many people's homes right now in their Internet modems.

Another way is to get the same signal multiple times and average them together. Provided they are correctly aligned, the signal adds linearly whilst the uncorrelated noise only adds as the square root of the number of samples, so the signal-to-noise ratio improves by roughly the square root of the number of samples.
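That averaging behaviour is easy to simulate -- a rough sketch using a constant "signal" plus Gaussian noise, seeded for repeatability:

```python
# Simulating signal averaging: the mean of n noisy samples has its noise
# spread reduced by roughly sqrt(n). Seeded so the run is repeatable.
import random
import statistics

random.seed(42)
SIGNAL = 1.0

def averaged_measurement(n: int) -> float:
    """Average n noisy samples of a constant signal."""
    return statistics.mean(SIGNAL + random.gauss(0, 0.5) for _ in range(n))

# The spread of the averaged estimate over repeated trials shrinks with n:
spreads = {}
for n in (1, 16, 256):
    trials = [averaged_measurement(n) for _ in range(200)]
    spreads[n] = statistics.stdev(trials)

assert spreads[1] > spreads[16] > spreads[256]
```

With 16 samples the spread drops by about 4x, with 256 by about 16x -- which is exactly why a quantum computer that must average away noise pays a heavy price in repeated runs.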

This gives the interesting proposition that the Quantum Computer would have to repeat its search many, many times to actually get the results out beyond a small degree of precision, which kind of brings it back to being limited to what classical computing can do...

This ultimately may be the fate of Quantum Computing, which would relegate it to being "just another one of those theories that don't pan out in practice".

Something we should kind of be used to by now in cryptography. Our algorithms "are fine in theory", but in practice the practical systems cannot support the theory, because the theory is not practical or is based on invalid or inapplicable assumptions.

[1] https://en.m.wikipedia.org/wiki/Maxwell's_demon

TheInformedOne • September 14, 2018 12:01 PM

In order to instantly crack a 256 bit key with a quantum computer don't they have to first generate and maintain 256 stable (useable) qubits? From what I understand, they are a few years off still. I'm sure whoever gets there first (probably the Chinese) will not tell anyone for years to come. Kind of like what happened at Bletchley Park w/Enigma.

Fred P • September 14, 2018 12:11 PM

This paper: https://cr.yp.to/papers/pqrsa-20170419.pdf appears to claim that one can create 1 TB RSA keys that are very resistant to Shor's algorithm: "initial implementation results for RSA parameters large enough to push all known quantum attacks above 2^100 qubit operations."

Not particularly practical, but I'm not certain that "And it allows us to easily factor large numbers, something that would break the RSA cryptosystem for any key length." is accurate.

tz • September 14, 2018 12:52 PM

Quantum computing may giveth as well -- imagine a one-time pad transmitted by coherent states. There is already quantum cryptography.

Lawrence D’OliveiroSeptember 14, 2018 4:11 PM

Quantum computers have so far proven useless for number-theoretic problems. The only kinds of problems they have been able to tackle have been ones involving physical simulations, where they have proven faster than digital computers, but less accurate.

In other words, “quantum” computers are just the old analog computers reborn. Back in the day, they, too, could compute physical-simulation problems faster than the new-fangled digital machines, but with limited accuracy. As digital machines got faster and cheaper, the analog machines faded into obscurity.

With the demise of Moore’s Law, that may not happen this time. But they are not going to take over the computing arena by any means.

Impossibly Stupid • September 14, 2018 4:37 PM

    Today, only crackpots try to build general-use systems based on one-time pads -- and cryptographers laugh at them, because they replace algorithm design problems (easy) with key management and physical security problems (much, much harder). In our alien-ridden science-fiction future, we might have nothing else.

Well, then call me a crackpot. I think it's a good idea to act sooner rather than later to better solve the problems with OTP use so that we're more prepared to mitigate the risks associated with quantum computing should it come to pass that we do need that level of security. Let's not shy away from solving the hard problems.

rustyalSeptember 14, 2018 8:33 PM

    quantum key distribution is a potential replacement for public-key cryptography. But come on -- does anyone expect a system that requires specialized communications hardware and cables to be useful for anything but niche applications?

"Specialized communications hardware" like an array of tiny beamforming antennas, a chip that compares 5 clock signals from space, or some other magic that's in every portable phone? Wireless QKD over 25 km was already demonstrated. I'm not expecting that to be ubiquitous—but as was stated, it's a weird future, and I wouldn't be so quick to dismiss it.

Gweihir • September 14, 2018 10:53 PM

With the consistent, decades-long failure of quantum computing to make any meaningful progress towards a product that can be used for more than tiny toy demonstrations, my take is rather that those who work on "quantum resistant" crypto have more likely just run out of actually useful work and are now riding the hype express. It makes some sense with regard to getting research grants (though it is morally problematic, as that money did not grow on trees), but otherwise this seems to be no more than a pretty persistent shared delusion.

echo • September 15, 2018 12:58 AM

@Gweihir

I find most discussion about quantum computing to be very boring lately. The discussion is more "boys toys" often badly explained and neglects to engage with the variety of possible discussion.

I get the sense most security discussion is like this too. I have seen it before in other industries.

John Doe • September 15, 2018 1:23 AM

@Clive Robinson says: "would have to repeat it's search many many times to actually get the results out beyond a small degree of precision, which kind of brings it back to being limited to what classical computing can do..."

I'm not sure who thought infinite precision would ever work in the real world... If a "qbit" is stored in something physical, it can't have (input/output with) infinite precision without taking up at least one infinite dimension of space or time or something as well... Only in the theoretical can infinity exist (that is to say, that's only where it can be "measured" so to speak, because you can just declare it a finished concept/idea and move on as if it were real).

Concern of the mental health variety • September 15, 2018 1:30 AM

@echo

"The discussion is more "boys toys" often badly explained and neglects to engage with the variety of possible discussion.

I get the sense most security discussion is like this too. I have seen it before in other industries."

Nothing about this is anti-female and there's no reason to bring unrelated issues into this.
Get a grip, no offense.

echo • September 15, 2018 2:03 AM

@Concern of the mental health variety

Thanks for the gaslighting. Organisations and individuals have their biases and habits and stereotyped thinking. My observations are on the whole fairly level and generally fairly well documented and even admitted in some quarters. I have found people who aren't familiar with this or who aren't being professional tend to go into denial and become aggressive, and in some cases personally insulting as your nom de plume suggests, which rather proves the point.

I know the seasons are changing and the mating season is arriving but could you direct your energies somewhere else and more fruitfully please?

echo • September 15, 2018 4:49 AM

@Concern of the mental health variety

I addressed the industry issue sensibly in the appropriate topic. I perceive no need to continue discussing anything with you because judging by your responses you're not the intended audience anyway.

yet another Bruce • September 15, 2018 8:58 AM

Firstly, very nice writing. Kudos.

Secondly, anyone know anything about memcomputing? See for example, http://advances.sciencemag.org/content/1/6/e1500031.full

My understanding of this is that the authors claim to be able to run digital logic backwards and provide inputs consistent with defined outputs. Seems too good to be true, but it would probably have implications for cryptography.

Anders • September 15, 2018 9:27 AM

Superb essay!

I once had *almost* a quantum computer -- at least the hard disk was
a quantum disk: a Quantum Fireball.
It was full of qubits -- sectors of unknown state, understandable
to a normal computer as "bad sectors". So no, I'm not
a fan of quantum computing any more :)

Curious • September 15, 2018 1:06 PM

I wonder how well understood 'quantum entanglement' is by scientists; maybe it entails more than what is commonly thought about quantum entanglement today (not that I would personally be able to list such things). Presumably, anything "quantum mechanical" is relevant for any construction of quantum computers with their q-bits and related hardware.

Somehow -- and I am certainly not very knowledgeable about quantum technology, so take what I write with the proverbial grain of salt -- I can at least imagine how it might be problematic should quantum entanglement imply some kind of guaranteed backdoor structure. Should quantum entanglement also work as a trapped energy potential due to time reversal, however minuscule an effect, it might allow tiny but perhaps critical locks, switches, or gates that could act as a backdoor, one that presumably can't realistically be found out if it relies on a single tiny switch somewhere in a piece of hardware -- perhaps a small cluster of matter with quantum spin values hidden in time-reversal physics, lying latent until sprung, so to speak, by the precise manipulation of some system of q-bits, whether really small or even a larger system.

Clive Robinson • September 15, 2018 5:11 PM

@ Gweihir,

With the consistent and decades-long failure of quantum computing to make any meaningful progress towards a product that can be used for more than tiny toy demonstrations

Whilst true, it's more likely to be a resources and political-imperative issue.

Quantum Crypto, in the form of Quantum Key Distribution (QKD), being somewhat less costly and having not just commercial but military purposes, is moving forwards.

As @Bruce has observed above[1], and I've repeated over and over, QKD in optical fibers is not really much of a prospect as it's "point to point" only. That is, it suffers from the same problems monorail railways suffer from: switching is a mechanical issue that is mostly impractical due to, amongst other things, reliability. It is also known to suffer from a classical rate-distance limitation.

However, now that we have state-of-the-art timing and genuine single-photon emitters, the range has improved greatly, and thus narrow-beamwidth signals transmitted from satellites have overcome the P2P fiber impediment for what are currently "fixed locations" and some "mobile platforms" such as aircraft and ships. There is other ongoing work since around 2005 by Quintique and BBN to push this into wireless. Other developments are techniques to get around the classical QKD rate-distance limitations[2], which also have "cross over" with Quantum Computing (QC).

The point being that QKD has resources available and a political will thus progress is being made.

But there are other things to consider. Both nuclear weapons and liquid-fuel rockets would not have come to fruition if it were not for the political imperative of WWII making otherwise unavailable resources available. Likewise, man almost certainly would not have gone to the moon if not for the significant failings of the US missile development causing grave political embarrassment, thus the setting of a goal way beyond simple manned space flight and the considerable resources made available politically to make it happen.

Thus the question arises as to "what might give rise to a political imperative to divert resources to QC?". Rather than some other military or Intelligence Community projects or other "big budget" line item.

[1]

    But come on -- does anyone expect a system that requires specialized communications hardware and cables to be useful for anything but niche applications?

[2] https://arxiv.org/pdf/1705.02174

Clive Robinson • September 15, 2018 6:43 PM

@ yet another Bruce,

Secondly, anyone know anything about memcomputing?

Not under that name.

However, there has been "neural network" research into the implications of shared content-addressable memory for some time, originally for explaining how biological networks learn (the Hopfield model) as well as, more recently, digital neural networks.

The diagrams and text are very suggestive of an "analog neural network with shared content-addressable memory", which would be a halfway house between biological and digital neural networks.

RonK • September 16, 2018 1:58 AM

Interesting post, but I personally find it hard to believe in an opponent who is advanced enough to be able to break all possible symmetric encryption algorithms yet unable to gain access to our one-time pads (for instance, by remotely hacking everything, or monitoring everything using autonomous microdrones, or lots of other things I cannot imagine, just as I cannot imagine being able to break all possible symmetric encryption algorithms).

Clive Robinson • September 16, 2018 7:10 AM

@ RonK,

I personally find it hard to believe in an opponent who is advanced enough to be able to break all possible symmetric encryption algorithms yet unable to gain access to our one-time pads

Oh believe it, because it's fairly easily seen when you consider the difference between "covert mass surveillance" and "covert targeted surveillance".

Covert activity is best done at a distance, as it reduces your visible image. Ideally you want to be as fully out of sight as possible to your target(s) of interest.

Mass surveillance, or "collect it all", is best done at the center of a "star-configured switched communications network", not out at the leaf nodes. Whereas targeted surveillance is best done out at the individual leaf nodes, as "up close and personal" as remaining covert will allow.

If you as a potential target take even moderate precautions with your security and communications endpoints, mass surveillance fails for the attacker. Which means that the target either has the privacy/secrecy they desire and should legally have, or they force an "up close and personal" targeted surveillance operation.

Such "up close and personal" surveillance is very resource intensive[1] at the best of times, which alone would act as a limit on any attacking agency's power. Secondly, the human brain is very, very good at pattern recognition: once you've seen one supposedly covert operation you can get the feel of any others that might be around you, which makes the attacker's ability to remain covert harder by a very long way.

Another issue is people love to talk. Whilst it is moderately easy for an attacker to remain covert to a single isolated target, they are effectively "hiding in plain sight", and covert behaviour, when seen from anywhere other than the target side, looks darn suspicious in many ways, and thus people will see it and gossip. More than one covert operation has failed because the target chatted to the janitor or maintenance man on a regular and friendly basis, or the old lady who lives across the road, etc.

Another issue for an attacker is the "you cannot see what you are going into" problem. It's relatively easy for a prepared target to have "instrumented their environment" in ways that can only be seen by an attacker after they have been recorded, etc. Likewise, there are so many random "tells" a target can set up that an attacker cannot catch them all. In essence, the attacker is not just fighting Locard's Exchange Principle these days but also the "all-seeing eye" principle.

The work the attacker has to do to spot/avoid what the target can put in place is so damn suspicious in its own right that the attacker would be lucky not to have the police called on them and neighbours talking -- or worse, avoiding the target; nothing smells worse to an alert target than changes of routine or behaviour in those around them.

Oh, and technology is not the attacker's friend many mistakenly think it is, because even at the best of times it's not passive, and the attacker has to get it in and out of the target zone. To understand why it's an issue: firstly, it will use energy, and it cannot be efficient enough to stop heat being put into the environment. Secondly, it needs to communicate; in most cases that can also be detected by emissions or the "red eye" principle. Thirdly, it's not made from the same materials as its immediate environment, which means it's detectable in a number of ways. But for a target, it's an issue that can be dealt with "without looking". Put simply, the better something is hidden, the more static it is, which means that individually it's of little use against an agile target. Simple probability then works against the attacker, not just in "not seeing" but also in "being seen".

In the main, targeted surveillance fails more often than it succeeds, especially against an alert or, worse, prepared target. It's only in the movies that master villains talk about their evil plans close to the microphone in a concealed object the attacker put in place five minutes beforehand. It's why the likes of "spike mics" or the equivalent acoustic waveguides are more likely to be used (which can be detected with Differential Time Domain Reflectometry, D-TDR).

Even fairly rank amateurs can survive years of intensive targeted surveillance by not just LEOs but also SigInt agencies working in close cooperation. Look up someone called Kenneth Noye from the UK, who was involved with disposing of the Brink's-Mat gold, for which he is allegedly currently on a "hit list".

You can get a bare bones start with,

https://en.m.wikipedia.org/wiki/Kenneth_Noye

As you can imagine, after murdering a police officer, even after being acquitted, he was a very high surveillance priority for the police. The parallel construction around his time in Spain after his second murder is fairly obvious, as is the cover-up in court over what actually happened on the slipway; why it happened is now fairly common knowledge (a drugs deal gone bad).

I suspect various people are hoping that the underworld "offs him" now he is in open prison.

[1] Even in the best of circumstances, targeted surveillance comes at an eye-wateringly high cost. To get an idea, see the leaked figures for the fairly low-key Ecuadorian "Operation Hotel" surveillance of the very static target Julian Assange, which is a fraction of the acknowledged UK costs just for LEOs. What the UK and US SigInt costs are is not publicly known, but it will be several orders of magnitude more than the "Operation Hotel" costs.

Suresh Ponnusamy • September 16, 2018 11:45 AM

Well-written article without the technical details. No one can put it better than this. The last hope is the OTP, but without a proper key exchange and agreement, it's going to be difficult, isn't it?

Hmm • September 16, 2018 7:40 PM

Seconded on the well-written part, it filled in a lot of ground there on expansive topics.

A lot to think on and resist making unfounded sweeping prognosticating statements about.

Weather • September 16, 2018 10:18 PM

Just would like to bounce a cipher off you lot.

Say you start with encrypted data:
AAAA
The first key could be:
Door - AAAA = something
Say the first key equals:
Htre - AAAA = ... You can then use a second key:
Door - dhfd(htre) =
Rinse and repeat with as many keys as you want to find the plaintext ("door"), the number of keys you used, and maybe what they were. Sorry -- it would be easy to write this as a C program, but it's hard to explain.

Albert • September 16, 2018 10:30 PM

@ yet another Bruce & Clive Robinson

About memcomputing: I also discovered that the authors recently founded a company (http://memcpu.com/).

Moreover, they have published many papers explaining the tech; however, they mainly use frameworks from physics, so the papers are not easy to read.

From what I understood, they started from some abstract non-Turing machine that in principle can solve NP-complete problems efficiently. And they claim that this can be realized with some sort of self-organizing circuit. It looks more like a hardware solution; however, they say that they can simulate it efficiently, which I don't get, since if you simulate it, you are using a Turing machine... I need to read more. Does anyone have further information about this?

Nevertheless, I am skeptical about it; it looks too good to be true and very complicated to understand. However, since it looks like they are trying to commercialize it, I will definitely look into it and follow the evolution. You never know, maybe they really found something serious.

Chris mobley • September 17, 2018 4:15 AM

Excellent article, as always.

Niche applications -- like industrial control systems in the critical national infrastructure? They only talk to a very small number of controlled devices, they are installed and commissioned by *trusted* individuals, and they are left alone to do their work for tens of years... no need for PKI -- symmetric cryptography using PPKs (pre-placed keys) as an OTP is a reasonable answer. They can be built with high-quality key management and physical security built in. See Bruce's articles on physically unclonable functions.


There are quite a lot of niche applications I can think of where there are one-to-one or few-to-few predetermined comms channels that need protecting. Symmetric cryptography and PPKs used as an OTP are a perfectly reasonable solution.


Weather • September 17, 2018 4:17 PM

This explains it better

Encrypted data: jfxh

1)
Input: jfxh
Key: llll
Output (plain): door
Output (chain): wkyt

2)
Input: wkyt
Key: wvxd
Output (plain): door
Output (chain): nnxz

3)
Input: nnxz
Key: poll
Output (plain): door
Output (chain): tyhj

The plaintext is "door", plus other info depending on which key holder you ask.
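One possible reading of this chained-keys idea can be sketched in Python. The XOR construction, the helper names, and the way the keys are derived backward from random chain values are all my own illustration, not necessarily what Weather intends:

```python
# Toy sketch: every key, applied to the current chain value, yields the
# same plaintext plus the next chain value. Each key is a pair of pads,
# one recovering the plaintext and one recovering the next chain value.
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def make_keys(plaintext: bytes, rounds: int):
    """Pick random chain values, then derive a key for each round so that
    stepping chain[i] with key[i] gives (plaintext, chain[i+1])."""
    n = len(plaintext)
    chain = [os.urandom(n) for _ in range(rounds + 1)]
    keys = []
    for i in range(rounds):
        pad_p = xor(chain[i], plaintext)      # recovers the plaintext
        pad_c = xor(chain[i], chain[i + 1])   # recovers the next chain value
        keys.append((pad_p, pad_c))
    return chain[0], keys

def step(chain_value: bytes, key):
    pad_p, pad_c = key
    return xor(chain_value, pad_p), xor(chain_value, pad_c)

ciphertext, keys = make_keys(b"door", rounds=3)
value = ciphertext
for k in keys:
    plain, value = step(value, k)
    assert plain == b"door"   # every key yields the same plaintext
```

Note that in this reading the keys carry all the information, so it is really a (doubled) one-time pad in disguise rather than a new cipher.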

Gweihir • September 17, 2018 8:14 PM

@Curious:

Quantum entanglement is actually not well understood at all. It is pretty well _described_ for simple set-ups with low numbers of entangled particles, short distances, and simple, short chains of manipulations. It is, however, entirely possible that the whole thing behaves even worse when scaling up than is already observable in practice (my take is it definitely scales sub-linearly at this time), and it may have a hard limit too, where entanglement cannot go beyond a certain, not very high, number of entangled particles at all, or not with what is physically possible in this universe.

The thing is that physical theory is still badly incomplete at this time (no GUT), and what we have is inconsistent, i.e. it is known that either Quantum Theory or Gravity or both are wrong (to simplify things grossly), because as they are modeled today they do not fit together. Yet both are exceptionally well supported by measurements. This just means there are some not-yet-known effects at play, and they may well kill Quantum Computing entirely, because they would have to get pretty strong under circumstances we do not know at this time.

Making grand predictions from theories known to be incomplete is a rather unscientific thing to do, but classic "hype" behavior. We see this all the time when people confuse "technology" and "magic", and the Quantum Computing hype is no different. Now, we cannot rule out that Quantum Computing, strong AI, flying cars, etc. will eventually manifest in a meaningful way, but it seems rather unlikely, and it will certainly not happen anytime soon.


@Clive Robinson:

Well, I certainly agree on the political aspect, and the hype aspect makes it very hard for experts to stay out of it, because staying out means less visibility, fewer opportunities, and less grant money. Something similar is currently happening with "AI" (where actually meaningful "strong AI" is decidedly more than 50 years in the future and may well never materialize). But if you want a research grant or to sell your product, you'd better write "AI" in the description, not "fuzzy logic" or "statistical classifier" or "automation".

Hence I am not actually blaming Bruce for writing on this, and I note that he does not actually make any predictions about when or whether this will become really relevant as a threat.

Weather • September 17, 2018 11:17 PM

@ Gweihir,
The numbers aren't limited, just the distance.
Imagine a flat surface; above that you have things like motion, gravity, and EM waves. All those things poke a hole through the surface; they go down away from the surface, then decay to the left and right. Any place within that decay gets that information; the distance is limited by the strength at a point. Having multiple qubits that are in sync can transfer information below at a strong enough value to reach a second layer, which can transfer the information even further. More qubits will make a stronger signal at distance, but the distance is limited, so you lose focus or detail. But it makes it clearer, because the information about the qubit states swamps out the gravity, EM, and motion stuff. But a cool thing: you could, say, increase the temperature of the external qubits, or any other value, and make it an RF transmitter, etc.

Clive Robinson • September 18, 2018 10:26 AM

@ Suresh Ponnusamy,

The last hope is the OTP, but without a proper key exchange and agreement, it's going to be difficult, isn't it?

I'm far from sure the last hope is the OTP.

In effect an OTP is a stream cipher with a keystream generator that is "unpredictable to an observer" of either the keystream alone or the keystream and plaintext mixed together as a ciphertext stream (with in-band signalling after authentication and synchronization).

Its entire strength rests on this "unpredictability" to any observer, but importantly to any third party observing the first- and second-party communications. It is effectively a random oracle that only the two parties can directly access as they choose. A third party only gets access to what the first and second parties allow them to, which, if the first and second parties follow the rules, gives the third party insufficient information to get access to the plaintext.
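The OTP-as-stream-cipher view can be sketched minimally in Python. Here `os.urandom` merely stands in for a true random pad source, which is an assumption; a real OTP needs pad material that is unpredictable to any observer and is never reused:

```python
# Minimal OTP sketch: the "keystream" is a truly random pad, mixed with
# the plaintext by XOR. Decryption is the same operation, because XOR is
# its own inverse. os.urandom stands in for a true random source here.
import os

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(plaintext), "pad must cover the whole message"
    return bytes(p ^ k for p, k in zip(plaintext, pad))

pad = os.urandom(32)                 # pad material: use once, then destroy
ct = otp_encrypt(b"attack at dawn", pad)
pt = otp_encrypt(ct, pad)            # XOR is its own inverse
assert pt == b"attack at dawn"
```

The security argument is exactly the "unpredictability" above: given `ct` alone, every plaintext of the same length is equally likely.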

Interestingly, in military and diplomatic systems, except under rare circumstances, the OTP is only used for "superencipherment": the plaintext is first encrypted with a high-strength cipher before being put through the OTP system. There are a number of reasons for this, which I've mentioned in the past.

Also, the OTP system is generally only used between an out-station and a central hub home-station, with all OTP traffic going through the home-station, not out-station to out-station. That is, it is generally only used in a star-configuration communications system. Again there are several reasons for this, not least because it minimizes KeyGen and KeyMan issues, with their complex auditing. But more importantly, it puts control of information at the home-station, which for Mil/Dip traffic is desirable.

This, importantly, makes the home-station a single point of failure and, in the case of the Internet, a single point of attack for various criminal entities, including Government Agencies.

Thus the general use model for OTP systems is a very bad fit for the Internet.

Thus to make use of it for the Internet, three basic problems have to be solved:

1, The authentication of parties that are unknown to each other.

2, Giving only the first and second parties access to the stochastic process that is the keystream generator / random oracle.

3, Removing the issues of a central hub.

Apart from the first, which is currently susceptible to Quantum Computers and needs some kind of central hub (a CA etc.), the other two are currently open issues under the Internet model, where nearly all communications are "one time" between parties unknown to each other and there is no "trust" between them.

I'm not saying that the OTP is unworkable on the Internet, but under the current Internet model it is.

But then so is the use of symmetric cryptography under the current Internet model, because you still have both the first and third problems.

My honest feeling is that we need to look at methods to replace the current Internet model with one that has two major changes:

1, No centralized or hierarchical security required.

2, No Quantum Computer sensitive methods.

Neither of which is a problem if the two parties have a secure side channel prior to first communications over the Internet.

It is this first problem we need to solve, but to most this looks like a "turtles all the way down" or "lesser flea" problem. But nobody has proved this is the case under all considerations...

Alain • September 20, 2018 7:04 AM

"[...] but we will still be able to communicate securely using one-time pads. There's an optimism in that."

Yes, optimism, because there is no way to prove that any one-time pad is really random. There are just too many ways to generate apparently random white noise to exclude them all, even with quantum computers. In other words, nature may not generate true random data, but data shaped by yet-unknown new physics -- of which Aliens might well be aware... ;)

Dr. I. Needtob Athe • September 20, 2018 8:18 AM

"So a 256-bit key might seem only twice as complex as a 128-bit key, but (with our current knowledge of mathematics) it's 340,282,366,920,938,463,463,374,607,431,768,211,456 times harder to break."

This is incorrect. It's 340,282,366,920,938,463,463,374,607,431,768,211,456 times AS HARD to break, but only 340,282,366,920,938,463,463,374,607,431,768,211,455 times HARDER to break.
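The figures are easy to check with Python's arbitrary-precision integers; the only editorial assumption is the pedantic reading that "N times as hard" equals "(N - 1) times harder":

```python
# Brute-forcing a 256-bit key takes 2**256 trials versus 2**128 for a
# 128-bit key, so the ratio of the two workloads is 2**128.
ratio = 2**256 // 2**128
assert ratio == 2**128
assert ratio == 340282366920938463463374607431768211456
# Pedantically, "N times as hard" means "(N - 1) times harder":
assert ratio - 1 == 340282366920938463463374607431768211455
```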

Clive Robinson • September 20, 2018 12:08 PM

@ Alain,

Yes, optimism, because there is no way to prove that any one-time pad is really random.

There is, as I've pointed out, no useful definition of "random", let alone "really random", and many arguments are in effect circular.

So in effect the word means whatever you want it to mean.

As an observer of the output of an alleged "random generator", there are three things you can tell "within the bounds" of your observation:

1, It appears deterministic.
2, It has deterministic characteristics.
3, It appears non-deterministic.

In other words, the best you can hope for is to show that it is deterministic in some way, within the bounds of what you can measure, if your measurement assumptions are correct.

It is very easy for me to design and build two generators, each using a long-sequence deterministic process and a thermal noise generator, mixed together by an analog switch.

The switch has two inputs and one output. One input is the signal input, the other is the control input. The control input opens or closes the switch, so the input signal either appears (closed) at the output or does not (open). Thus it acts as a modulator or half mixer (or an AND gate with digital signals).

This means I have two basic options: in one generator the long sequence drives the control input; in the other, the thermal noise generator drives the control input.

However, what you see at the output also depends on the frequency range of the two input signals. If the control input switches faster than the signal input, then the output will reflect the control input short term (the high-frequency component) but the signal input long term (the low-frequency component).

From this you should be able to see how the way you measure affects your perception of the generator.

Depending on how you "band" your generators into N frequency segments, you have N^2 output combinations to consider, which with bounded measurements covers all three observable conclusions.
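A digital toy version of this switch arrangement can be sketched as follows. The slow square wave and the seeded `random` bits are my stand-ins for the long-sequence generator and the thermal noise, not Clive's actual hardware:

```python
# AND gate as the "half mixer": the control input switches much faster
# than the signal input, so the output shows the control's structure
# short term and the signal's envelope long term.
import random

random.seed(42)
N = 64
slow_signal = [(i // 8) % 2 for i in range(N)]             # low-frequency square wave
fast_control = [random.getrandbits(1) for _ in range(N)]   # "thermal noise" bits

# Closed switch passes the signal bit; open switch passes nothing.
output = [s & c for s, c in zip(slow_signal, fast_control)]

# While the slow signal is 0 the output is forced to 0 regardless of the
# noise; while it is 1 the output is exactly the noise. A bounded observer
# looking at short windows sees noise; a long window reveals the envelope.
assert all(o == 0 for o, s in zip(output, slow_signal) if s == 0)
assert all(o == c for o, c, s in zip(output, fast_control, slow_signal) if s == 1)
```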

Have a look at "Gold codes" and "JPL ranging codes" to see how even very short deterministic generators can be coupled together such that the resulting output has a sequence too long to measure in a human lifetime and shows no real autocorrelation over any measurement period a human might make. But it will fail other tests if you know which ones to use; so your determination is based not just on measurement time but on the type of measurement.
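A toy illustration of the coupling idea (not an actual Gold code, which uses a "preferred pair" of equal-length registers): two short maximal-length LFSRs with coprime periods, XORed together, repeat only after the product of their periods. The tap choices below correspond to the primitive polynomials noted in the comments:

```python
# Two small maximal-length LFSRs, periods 31 and 127 (coprime), combined
# by XOR. The combined stream only repeats every 31 * 127 = 3937 bits,
# far longer than either register alone.

def lfsr_stream(taps, nbits, seed=1):
    """Fibonacci LFSR: emit the low bit, shift right, feed the XOR of the
    tapped bits back in at the top."""
    state = seed
    while True:
        yield state & 1
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))

gen_a = lfsr_stream(taps=[0, 3], nbits=5)   # x^5 + x^3 + 1, period 31
gen_b = lfsr_stream(taps=[0, 1], nbits=7)   # x^7 + x + 1, period 127
combined = [next(gen_a) ^ next(gen_b) for _ in range(2 * 3937)]

# Repeats only after 3937 bits, not after either short period:
assert combined[:3937] == combined[3937:]
assert combined[:127] != combined[127:254]
```

Scaling the same trick up to registers with periods of 2^31 - 1 and beyond gives sequences no human-length measurement will ever see wrap around, which is exactly the point above.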

You can go on and show how each measurement type you use will only detect a very, very limited subset of deterministic sequences. It's why we have so many tests, and so many algorithms that pass all of the tests...
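As a small illustration of that point, here is a fully deterministic generator (SHA-256 in counter mode, my choice for the sketch) sailing through a simple monobit frequency test; passing a test never proves non-determinism:

```python
# A deterministic bit source: hash an incrementing counter. Anyone who
# knows the algorithm can reproduce every bit, yet a frequency test sees
# nothing suspicious.
import hashlib

def deterministic_bits(n_bytes: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < n_bytes:
        out += hashlib.sha256(counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n_bytes])

data = deterministic_bits(4096)
ones = sum(bin(byte).count("1") for byte in data)
total = len(data) * 8

# A balanced source should be close to 50% ones; this deterministic one is too.
assert abs(ones / total - 0.5) < 0.02
# And it is perfectly reproducible, unlike true noise:
assert deterministic_bits(64) == deterministic_bits(64)
```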

An argument can be made --via Cantor's diagonal argument-- that there will always be deterministic sequences for which no test exists unless you know the exact algorithm and state.

So actually I'm not overly fussed about "Aliens", because they are unlikely to have the time to find the algorithm in use or the method to detect it, except by the "blind luck" of probability.

As our host @Bruce should know from his work on random generators, you can mix many generators together such that the Aliens will never get the time...

Unless of course they have not Quantum Computers but Tachyon Computers ;-) The problem being that such Aliens would not be part of our universe, as no entity in our universe could see the results... So we are never going to be able to interact with them, and they have no relevance for us...

If you want me to take the tongue-in-cheek further, I can B-)

But... it would get to the point where, like the joke about the cat with too little mu, it would not be funny to anyone but a physicist, and thus would create friction :-S

Cincinnatus__SPQR • October 17, 2018 9:56 PM

@ Bruce Schneier

Bravo! That was the most positive statement you have ever made about the "meme" of the one-time pad, as you once called it.

Those of us crackpots who are absorbed by true random number generators and cherish the STASI Tapir will take your comments as good news. We knew you would come around!

As Whitfield Diffie said, if you can generate truly random numbers, then you can have private communication. Yes, that is true. But the problem is that it will perhaps not be allowed in the future, just as we do not see a fully robust form of PGP publicly available today.




Photo of Bruce Schneier by Per Ervland.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.