Factoring 2048-bit Numbers Using 20 Million Qubits

This theoretical paper shows how to factor 2048-bit RSA moduli with a 20-million-qubit quantum computer in eight hours. It's interesting work, but I don't want to overstate the risk.

We know from Shor’s Algorithm that both factoring and discrete logs are easy to solve on a large, working quantum computer. Both of those are currently beyond our technological abilities. We barely have quantum computers with 50 to 100 qubits. Extending this requires advances not only in the number of qubits we can work with, but in making the system stable enough to read any answers. You’ll hear this called “error rate” or “coherence”—this paper talks about “noise.”
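
A toy sketch of the classical half of that reduction may help make the threat concrete. In Shor's algorithm only the period-finding (order-finding) step is quantum; turning the period into factors is ordinary number theory. The order() function below does that step by brute force, which is exactly the part a large quantum computer would replace:

```python
# Classical reduction used by Shor's algorithm: factoring N reduces to finding
# the multiplicative order r of a random a mod N.  Here order() is brute force
# (exponential time); that is the step a quantum computer would do efficiently.
from math import gcd
from random import randrange

def order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_order_finding(N):
    while True:
        a = randrange(2, N)
        g = gcd(a, N)
        if g > 1:                  # lucky guess: a already shares a factor
            return g, N // g
        r = order(a, N)            # the quantum step in the real algorithm
        if r % 2:
            continue               # need an even order
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue               # a^(r/2) = -1 mod N gives trivial factors
        p = gcd(y - 1, N)
        if 1 < p < N:
            return p, N // p

print(factor_via_order_finding(15))          # (3, 5) or (5, 3)
print(factor_via_order_finding(101 * 103))   # (101, 103) in some order
```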

Advances are hard. At this point, we don’t know if they’re “send a man to the moon” hard or “faster-than-light travel” hard. If I were guessing, I would say they’re the former, but still harder than we can accomplish with our current understanding of physics and technology.

I write about all this generally, and in detail, here. (Short summary: Our work on quantum-resistant algorithms is outpacing our work on quantum computers, so we’ll be fine in the short run. But future theoretical work on quantum computing could easily change what “quantum resistant” means, so it’s possible that public-key cryptography will simply not be possible in the long run. That’s not terrible, though; we have a lot of good scalable secret-key systems that do much the same things.)

Posted on October 14, 2019 at 6:58 AM • 46 Comments

Comments

Frankly October 14, 2019 7:34 AM

Almost all cryptocurrencies, including Bitcoin, are not quantum secure. We are headed for a crypto-financial disaster, whose pace is determined by a select group of scientists at only two or three companies.

TimH October 14, 2019 8:54 AM

Perhaps the trick is to make the plaintext of a secret document not look like plaintext. If you reverse a hash, you know that you got the original data sequence, and a brute-force password attack simply goes on until a try works. So in those cases you know you got the result.

Marcos October 14, 2019 9:04 AM

We lack a good secret-key algorithm for public signatures. We also cannot create anything similar to a PKI using only secret keys.

Secure person-to-person communication is the easiest of the problems to solve; there are plenty of important uses of cryptography that require public keys.

Bruce Schneier October 14, 2019 9:16 AM

@Marcos:

"Secure person-to-person communication is the easiest of the problems to solve; there are plenty of important uses of cryptography that require public keys."

Such as?

Peter S October 14, 2019 10:33 AM

While I agree that computers with 20 million qubits are still a ways off, notice that there are qubits and then there are qubits.

The standard approach to using Shor’s algorithm requires a fully-entangled quantum computer — one where every qubit is connected to every other qubit. This paper suggests that a (large) quantum computer with qubits only connected to their neighbors could still make productive use of Shor’s algorithm.

These partially-connected quantum computers are making progress; the Canadian firm D-Wave currently sells a 2,000-qubit computer and will launch a 5,000-qubit computer next year.

Andy October 14, 2019 1:21 PM

Public keys are required for non-repudiation as part of digital signatures. Using a shared key guarantees integrity against an untrusted third party, but not when one of the parties later says "it wasn't me". How about signed timestamps over a hash from a trusted third party?
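
A minimal sketch of that idea using only symmetric primitives, with made-up names purely for illustration: the trusted third party MACs the document hash together with a time under a key only it holds. The obvious trade-off versus a public-key signature is that only the TTP can later verify (or repudiate) its own tokens, so it has to stay honest and available:

```python
# Toy sketch of timestamping with only symmetric primitives: a trusted third
# party (TTP) binds a document hash to a time using a MAC under a key only the
# TTP holds.  Real services would also need logging, freshness and key rotation.
import hashlib, hmac, json, time

TTP_KEY = b"\x00" * 32   # in reality: a well-protected random key

def ttp_issue_token(doc_hash_hex: str) -> dict:
    token = {"hash": doc_hash_hex, "time": int(time.time())}
    msg = json.dumps(token, sort_keys=True).encode()
    token["mac"] = hmac.new(TTP_KEY, msg, hashlib.sha256).hexdigest()
    return token

def ttp_verify_token(token: dict) -> bool:
    msg = json.dumps({"hash": token["hash"], "time": token["time"]},
                     sort_keys=True).encode()
    expect = hmac.new(TTP_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(token["mac"], expect)

doc = b"the contract text"
tok = ttp_issue_token(hashlib.sha256(doc).hexdigest())
print(tok, ttp_verify_token(tok))
```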

MarkH October 14, 2019 2:23 PM

According to my reading (I don’t pretend to understand this stuff), the principle used in D-Wave products — called quantum annealing — is not compatible with Shor’s algorithm. In that case, it doesn’t matter how big they make them …

As far as I’m aware, the customers who have bought D-Waves only run tests to try to ascertain whether they do anything at all you can’t do with a conventional computer. The last time I read about progress, the answer was “no.”

What's interesting about the new paper is that the kind of quantum computer people have been trying to make would need fewer than 5,000 qubits, and perhaps less than an hour, to perform the computation — so instead they ask, "what's possible without solving these very stubborn problems of QC design?"

It’s an approach which might yield practical results before large fully-entangled systems with coherence times on the order of 1000 seconds. I’m skeptical whether the latter will ever be achieved.

Jesse Thompson October 14, 2019 4:00 PM

@TimH:

While a nice idea, making the encrypted plaintext “look unusual” will not be enough to prevent crackers from rapidly confirming that they have hit upon the right decryption solution.

To demonstrate, I can encrypt a kilobyte of CMB noise with — say — a small enough symmetric or asymmetric cipher that you can crack it on today's hardware, and you'll know you got the right answer not because the output plaintext "looks like German troop movements", but because you can feed whatever you got into the inexpensive "encrypt" process to arrive at the exact same ciphertext that you cracked. Any false input would lead you elsewhere.

Notwithstanding all that, your plan does also suffer from being “security by obscurity”.

Clive Robinson October 14, 2019 4:04 PM

@ Bruce,

Such as?

To which of the two statements,

1, Secure person to person,
2, Important uses of cryptography that require public keys

Whilst the first is moderately simple to solve with an appropriately secure second/side channel for key exchange, finding such a channel is by no means a simple problem, which is why public-key crypto came to be a privacy and e-commerce stalwart.

I remember years ago you making a statement that we had sufficiently secure crypto algorithms, and it was time for cryptographers to move on to the much harder Key Management (KeyMan) issues.

From my viewpoint little has happened on general KeyMan; most efforts have gone towards building and pushing "hierarchical KeyMan", often through PubKey. The big issue with that is, as many know, corruption rises like scum to the top of hierarchical systems. They appear to be an endemic failure in most Western thinking, as I've remarked before.

Another issue with hierarchical systems is the perversion by those at the top of such structures insisting on giving each and every one of us a "serial number", which is a very major and extremely dangerous thing to do. As people who have suffered identity theft in the US have found out, simple use of the Social Security number is all it requires. Thankfully one of the few ways people can partially defend against this is that the system is so broken that they can ask for proof that they carried out the transaction, which, although fought by those responsible for the mess, can be a successful defense. If crypto IDs are introduced, as has happened with Chip-n-Spin from EMV, then defending yourself becomes difficult at best, as judges usually fail to rule that "proprietary financial systems" should be made available for independent expert examination.

Further, as Stella Rimington, once head of MI5 (and now a successful author), has pointed out, it is impossible for the general populace to prove who they are. That is, a birth certificate, which is the root of all ID, is just a piece of paper with no way to link it to an individual. In fact, when people think about it, there are very good reasons why this should be the case, and the list is quite long.

So we desperately need a system that can provide not just a "root of trust" that cannot be abused for other things, but also a way to transfer cryptographic secrets that is not traceable explicitly to an individual.

As far as I'm aware nobody has proved that such a system is not possible without using the now-assumed-to-be-weak PubKey systems. I know it's a tough ask, but PubKey as an idea was thought impossible until Clifford Cocks showed it was actually possible…

Clive Robinson October 14, 2019 4:29 PM

@ TimH,

Perhaps the trick is to make the plaintext of a secret document not look like plain text.

This is an old problem and you can read more in the history of “Super Encipherment”.

The real problem is that recovering the plaintext in the presence of noise means that you have to have some kind of structure in either the plaintext or the resulting obfuscating first ciphertext.

To see why: with block ciphers you either have to have a ridiculously large and impractical block size, or reduce the block size to the point it is the same as the alphabet character size. To avoid the ECB issue of simple substitution you need to add some kind of mode, which often involves chaining, which encourages error propagation. If you use a stream cipher then whilst you protect against error propagation you face synchronization issues due to drop-outs. Generally, whatever you do, you have to have some kind of error protection / correction added to either the first obfuscating ciphertext or the second super-encipherment ciphertext.

In the past a chaining or synchronised obfuscating cipher has been used, and this has then been super-enciphered with a fixed short-block-size one-time pad (see the UK Diplomatic Wireless Service "Rockex" super-encipherment machine).

Unless you have a very low noise, very low drop-out communications link, super-encipherment gives rise to increased "resends" of traffic, which introduces a whole load of other problems as well.

SpaceLifeForm October 14, 2019 4:56 PM

@Clive

Do we really need any ‘root of trust’?

If there is any ‘root’, then we have the hierarchy problem, where the ‘root’ can be compromised.

How about a distributed system with no root at all?

Where one only trusts the immediate contacts they meet in person with to exchange a crypto-ID.

Clive Robinson October 14, 2019 5:01 PM

@ MarkH,

It’s an approach which might yield practical results before large fully-entangled systems with coherence times on the order of 1000 seconds. I’m skeptical whether the latter will ever be achieved.

History shows us that applying a "binary chop" to a problem significantly reduces the difficulty, thus increasing our ability to solve larger problems. So if it is possible in some way, it would be the sensible way to progress.

As for the "coherence" issue, the paper referring to it as noise is actually a sensible way to assess it, because "noise" appears to be as fundamental to the physical universe as the speed of light, and a direct consequence of its formation. Further, the current belief in science is that the universe is not analog in nature; that is, at a fundamental level the universe is in fact discretely quantized, with some believing that those fundamental components are actually "vibrating", thus generating noise from the fundamental lack of coherence.

When we look at what is practical with current microwave cryogenically cooled systems, there is still very much a noise floor that we cannot get below. You will find that it is above the 4kTBR level (often quoted as -174 dBm in a 1 Hz bandwidth) by a couple of dB. Modern receivers only have a linear dynamic range of a little over 100 dB. You can take the noise level down either by reducing the bandwidth or by putting 2^N receivers in parallel. The former is not really practical in fixed-time processing; the latter only reduces the noise by 6 dB for each increase in N, which is likewise quickly impractical, especially at high bandwidths or high frequencies (due to speed-of-light issues).
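
As a quick sanity check of that figure, here is the bare kTB arithmetic (the classical formula only; it ignores the quantum corrections that start to matter at millikelvin temperatures where hf becomes comparable to kT):

```python
# Thermal noise floor P = kTB, usually quoted as about -174 dBm/Hz at room temperature.
import math

k = 1.380649e-23                      # Boltzmann constant, J/K

def noise_floor_dbm(T_kelvin, bandwidth_hz):
    p_watts = k * T_kelvin * bandwidth_hz
    return 10 * math.log10(p_watts / 1e-3)   # dB relative to 1 mW

print(noise_floor_dbm(290, 1))        # ~ -174.0 dBm  (room temperature, 1 Hz)
print(noise_floor_dbm(290, 1e6))      # ~ -114.0 dBm  (1 MHz bandwidth)
print(noise_floor_dbm(0.02, 1))       # ~ -215.6 dBm  (20 mK fridge, 1 Hz, kTB formula only)
```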

Now if you work out the equivalent noise voltage for 20 million qubits, it is very easily going to exceed the 100 dB dynamic range by a very long way, as well as having equivalent voltages that would destroy any qubit we can currently design, or are ever likely to.

Thus the “divide and conquer” route of the “binary chop” may actually be the only way QC will be of any real use as a processing speed up…

Clive Robinson October 14, 2019 5:49 PM

@ SpaceLifeForm,

How about a distributed system with no root at all?

A root of trust is in many ways equivalent to an indicator of who trusts you.

Back in the early PGP days we had “Key signing parties”. In essence you turned up at a party with your passport and drivers licence and a public key. People would verify your documents and sign your key as well as adding it to their key ring.

It was an attempt at “distributed trust” and with a few changes it might well have worked.

The important point about it was the key could be anonymous but trusted, and you could have multiple anonymous keys.

Thus trusted organisations just like legal notaries could sign keys.

Lawrence D’Oliveiro October 14, 2019 7:31 PM

All practical "quantum"† computers so far have been analog computers. They have been handy for solving certain kinds of physical-simulation problems, but completely useless for anything number-theoretic (such as factoring large integers) of the kind digital computers can do.

I have said before that there is something fundamentally wrong with the assumption behind quantum computing, anyway: getting an exponential increase in processing power for a linear increase in processing elements sounds too much like getting “something for nothing”. Which has never worked anywhere else in human technology.

This fact, combined with the incurable flakiness of the current machines with their small numbers of qubits, points clearly to the conclusion that the idea of building something workable with millions of qubits is sheer fantasy.

†I never understood what was so peculiarly “quantum” about them anyway. Consider that the transistors (and even the vacuum tubes before them) that drive “conventional” computers cannot work without quantum effects.

Rodney Van Meter October 14, 2019 7:50 PM

Ah, you’re in my wheelhouse here; design of a quantum multicomputer for factoring was my 2006 Ph.D. thesis. For more on the current state of quantum computer architecture, see my recent tweetstorm either here or in PDF here.
My background is originally in computer systems (architecture, OS, networks, and especially networked storage), but about fifteen years ago I moved into quantum. I was at a conference recently at which I was asked about the details of rekeying IPsec in the context of quantum computing and QKD. I went home, fell into a rabbit hole on crypto, and have begun a series of blog posts I'm tentatively titling "What Every Quantum Scientist and Engineer Should Know about Classical Cryptography". As I'm not originally a cryptographer, doubtless there are naive and perhaps outright wrong statements or emphasis. Comments welcome; my desire is to improve it and make it as useful as possible.

Rodney Van Meter October 14, 2019 7:52 PM

(FYI, when I click on the “subscribe to comments on this entry” button, I get raw XML. Is this a known bug, e.g. w/ Chrome?)

Rodney Van Meter October 14, 2019 8:04 PM

(Ugh, typos in my comment above. I hate that, but it’s apparently not possible to edit posts?
s/original/originally/
s/wrong/wrong statements or emphasis/
)

Clive Robinson October 14, 2019 9:17 PM

@ Rodney Van Meter,

but it’s apparently not possible to edit posts

The software allows it, but as this site allows anyone to use whatever handle they please (apart from Admin accounts), an "edit function" after posting would in effect allow anyone to edit someone else's post. Which would be undesirable.

However, if you are ready to post but want to see how your post would be displayed, hit the "Preview" button and the site will refresh the page in reverse time order, with your post at the top, followed by the entry window, followed by everyone else's posted comments. You can obviously edit and then refresh with the "Preview" button as often as you want; when done, hit the "Submit" button. This will then post it and in effect lock it.

Weather October 15, 2019 2:06 AM

@RVM
The D-Wave, from what I understand from Ars, is you map out what you want, say 5+5, you then throw in random (by just checking and seeing an unknown answer), then you pick ones that are 3+4 that the links between tunnels allow the code to come forward.
Is having all parts interlinked the best idea?
Why call it quantum, when any object close to another affects both?
What coding objects are you using for D-Wave?
Is quantum tunneling understood, like in fusion, which it is not?
Can it work out AA with the first A being 256 different volts, with the second A having the same or another 256?

Weather October 15, 2019 4:36 AM

@joker123
It's a filter; if you say 5=3a, then ... sorry, lost what I was going to say.

No, not all linkage, trying... You have to have a mark to aim for, if anything is the target… Back too, if you have four satellites you can get an accurate picture of where you are; to be able to link that, you have contains and wires; it can be worked out with 7f but you need -+/* if statements are out of bounds; binary needs filters others can work out.
Sum it to the four, and the size byte you start with you can end with.
If I knew the filter aes

I'm called weather because I'm drunk

Weather October 15, 2019 4:47 AM

I accidentally listed AES as using a 256-bit block; this is not the case. AES uses a 128-bit block size.

[Quoted from an earlier post by Andy on another forum, Mon Oct 13 2014:]

Hi DaJJHman
Thanks for the good reply. The theory is based on one 128-bit input for the key and IV set to zero; the block isn't relevant, as 7fff8000 is an array of 0x00-0xffff, which is 256 arrays * 256-byte chars per array cell (sorry, might have to check if it's 256*256*256, my bad). The ECB and CBC modes, key wrapping etc. would be a modified filter value and ending value that gets added before the filter; saying that, it shouldn't matter if it's CBC, as the next block still gets added to the encryption 7f80 (round about plus size), which covers all the possibilities, so it shouldn't matter as "7fffffffffffffffffffffffffffffff80000000000000000000000000000000" = 16^256, but there might be some changes in the addon that could affect it.

Edit: sorry, wrong way around; these numbers don't relate to the AES algo, they are just examples of the parallel logic.
Test
7f80 = 1 byte(0x00 || 0x01 || 0x02 || 0x03 || 0x04, as loop(0xff(x)(i=i+x))
4141 + 7f80 = C0C1
C0C1 – 5367 = 6D5A
6D5A – 8888 = FFFFFFFFFFFFE4D2
FFFFFFFFFFFFE4D2 *52 = FFFFFFFFFFF74B44
Apply filter, which shouldn’t have one with basic maths functions, just sub
FFFFFFFFFFF74B44 – 7f80 = FFFFFFFFFFF6CBC4
FFFFFFFFFFF6CBC4 = 0x00 with all the maths above
FFFFFFFFFFF74B44 – 7f00 = FFFFFFFFFFF6CC44
FFFFFFFFFFF6CC44 = 0x80 with all the maths above

0x7f00 = 0x7f80 – 0x80(this is the start we want to check so remove all others)
0x7f80 = 0x7f80 – 0x00
0x7f10 = 0x7f80 – (0x20& 0x50) both value we want to check

Edit 2
I think IVs are just xored with the input, have to check, but that would be a matter of generating the filter for a xor function, then adding the 7fffffffffffffffffffffffffffffff80000000000000000000000000000000 (size of IV input block) value xored with the step before in the AES code.

Karsten October 15, 2019 5:07 AM

“… so it’s possible that public-key cryptography will simply not be possible in the long run. That’s not terrible, though; we have a lot of good scalable secret-key systems that do much the same things.”

"Good scalable secret-key systems"? What are those? I have never heard of any system using symmetric crypto that could replace public-key cipher systems.
If exchanging data with others without having a secure channel is the task, how can we use symmetric ciphers for that (without exchanging session keys via public-key crypto)?

- October 15, 2019 5:25 AM

@ Moderator,

"Joker123" is unsolicited link advertising for a betting service that is illegal to use in the US.

@ Weather,

You've fallen into a simple "spammer-trap".

Whether that has anything to do with your "I'm called weather because I'm drunk" statement I'll leave to others to decide.

Who? October 15, 2019 5:48 AM

@ TimH

But we are talking about RSA here, not hashes. There is only one factorisation for a given positive integer. So you do not really need to have an idea how the plain-text message looks.
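
A toy illustration of that point with textbook-sized numbers: once N is factored there is no guessing about what the plaintext "looks like", because the private exponent, and hence the message, follows mechanically (Python 3.8+ for the three-argument modular inverse):

```python
# Toy RSA with tiny primes: factoring N hands you the private key directly.
p, q, e = 61, 53, 17
N = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # knowing p and q gives phi(N)
d = pow(e, -1, phi)            # private exponent via modular inverse (2753)

m = 42                         # the "message"
c = pow(m, e, N)               # encrypt with the public key
assert pow(c, d, N) == m       # decrypt with the derived private key
print(N, c, pow(c, d, N))
```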

Cassandra October 15, 2019 7:27 AM

I will chime in here and, as usual, support @Clive Robinson's thoughts. He nearly always writes things more clearly and concisely than I can.

As far as I am aware, Key Management is an unsolved problem, especially for the general user; which is to say, few in the general public have any idea about how public and private keys work and how they should be handled, or how collections of symmetric keys should be managed (or indeed, collections of private keys).

I am intrigued by Clive’s aside about key signing parties and how a ‘web of trust’ almost worked. I would appreciate it if he had time to go into more detail about what went wrong and how it could have been, or still could be, improved. Certainly, turning up to such parties with documents that resembled, to an inexpert eye, genuine passports and driving licences could have interesting consequences. I know I am not qualified to determine if I am being offered a skilful forgery or not. Trusting in everybody else’s expertise might not be justified.

Marcos October 15, 2019 9:18 AM

Can key management ever be "solved"? It is a huge set of problems, full of trade-offs, and in each context the least-bad options look nothing like the ones from other contexts; it's more of a human than a mathematical problem… except for the fact that most viable solutions rely on public keys, and it would be a huge loss in most contexts if they went away.

We are only just starting to use cryptographic non-repudiation in legal systems; it would be a shame to lose that capacity now.

Non-repudiation is also very useful for distributing capabilities. I have never seen that put to good use, but it’s put to bad use all the time by software and media companies in DRM systems. There are very likely many good uses for this, but applied infosec moves slowly, we would lose those before we even know what they are.

There are some options for electronic money transactions that could work. Except that you can’t get those without non-repudiation either.

Public keys allow one to receive secure communication from random strangers, and to create N-to-N communication channels. Those are much harder to create with secret keys alone.

Tatütata October 15, 2019 10:55 AM

Van Meter:

when I click on the “subscribe to comments on this entry” button, I get raw XML. Is this a known bug, e.g. w/ Chrome?

Neither Chrome nor Opera can handle RSS feeds directly, or even just display nicely formatted XML. You need to take the RSS URL and shove it into an appropriate application (all of them are clunky, I use Liferea), or install a plugin.

Firefox used to be able to digest RSS feeds natively, but this is no longer the case.

To look at raw XML, I often end up typing something like :

curl $URL | xmllint --format /dev/stdin | less

For podcasts, I roll my own cr*pware that handles the feeds exactly to my taste. Inter alia, I recode the audio on the fly using ffmpeg to a low-rate AAC monaural track in an MP4 container, which takes a whole lot less space than MP3.

@Clive:

that means an "edit function" after posting would in effect allow anyone to edit someone else's post.

Normally, sites that allow comment editing place an authorising cookie on the posting browser. That’s perhaps one of their rare good uses.

The alternative is to log into the comments section using Tweetor, Fessbuck, or some other wretched site that follows and tracks you all over the place. Thanks, but no thanks.

Advances are hard. At this point, we don’t know if they’re “send a man to the moon” hard or “faster-than-light travel” hard. If I were guessing, I would say they’re the former, but still harder than we can accomplish with our current understanding of physics and technology.

Personally, my gut feeling tends towards the second alternative, supraluminous hard. In any case, current messages ought not to be stored longer than necessary, but I’m not optimistic. Venona intercepts kept cryptographers busy for something like 30 years…

Thunderbird October 15, 2019 10:55 AM

While a nice idea, making the encrypted plaintext “look unusual” will not be enough to prevent crackers from rapidly confirming that they have hit upon the right decryption solution.

To demonstrate, I can encrypt a kilobyte of CMB noise with — say — a small enough symmetric or asymmetric cipher that you can crack it on today's hardware, and you'll know you got the right answer not because the output plaintext "looks like German troop movements", but because you can feed whatever you got into the inexpensive "encrypt" process to arrive at the exact same ciphertext that you cracked. Any false input would lead you elsewhere.

Maybe I’m just not getting it since I am certainly not a crypto person. For any key K, you can decrypt a ciphertext and get SOME plaintext, and if you then encrypt it again you get the same ciphertext. You’re proposing to test your choice of K by verifying that A^{-1}(K,A(K,C)) == C? Isn’t that an identity?
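
For what it's worth, a toy sketch of how a brute-force search usually confirms a candidate key in practice: by redundancy in the recovered plaintext, not by re-encrypting it (which, as noted, always reproduces the ciphertext). Deliberately weak repeating-XOR "cipher" with a 16-bit key, purely illustrative:

```python
# Brute force against a toy cipher: the winning key is recognized because the
# decrypted bytes have structure (mostly printable ASCII), not via re-encryption.
import itertools

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = bytes([0x13, 0x37])
ciphertext = xor_cipher(b"ATTACK AT DAWN -- signed, HQ", secret_key)

def looks_like_plaintext(candidate: bytes) -> bool:
    printable = sum(32 <= b < 127 for b in candidate)   # crude redundancy check
    return printable / len(candidate) > 0.95

hits = []
for k0, k1 in itertools.product(range(256), repeat=2):   # all 2^16 keys
    guess = xor_cipher(ciphertext, bytes([k0, k1]))
    if looks_like_plaintext(guess):
        hits.append((bytes([k0, k1]), guess))

print(len(hits), "candidate key(s); first hit:", hits[0])
```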

Dave C. October 15, 2019 2:48 PM

A quick look at Shor’s algorithm (1995) suggests only thousands of qubits are needed to factor a 2048 bit number. This is deduced as follows:
Let N be a 2048-bit number, choose Q such that N^2 <= Q < 2N^2, and do a Quantum Fourier Transform (QFT) of size Q on a special function (x^a mod N) that Shor came up with. So Q is a number between 4096 and 4097 bits in size, and hence 4096 or 4097 qubits are needed. There are also some ancillary qubits to hold the appropriate function values and to perform the calculations. So, assuming a QFT of this size can be performed on this many qubits, the "perfect" Shor algorithm is in the thousands-of-qubits range, not 20 million. Clearly, this also assumes no noise. As Peter S. pointed out in an earlier comment, it also assumes a fully entangled state can be achieved. Similarly, Shor showed that a QFT is also involved in breaking discrete-log problems (DH or ECDH).

Dave C. October 15, 2019 3:10 PM

A quick look at Shor’s algorithm (1995) suggests only thousands of qubits are needed to factor a 2048 bit number. This is deduced as follows:

We need to perform a Quantum Fourier Transform (QFT) of size Q, where Q is a 4096- or 4097-bit number. This QFT requires the calculation of a special function x^a mod N. So we need about 4097 qubits plus some ancillary qubits to help with the function calculations. As Peter S. indicated earlier, this assumes all qubits are fully entangled and there is no noise. So this suggests only thousands of qubits are needed, not millions, on a perfect universal quantum computer. Something similar applies to breaking DH or ECDH.
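
A quick check of that register-size estimate (ancilla qubits for computing x^a mod N are not counted here):

```python
# Size of Shor's QFT register for a 2048-bit modulus: Q satisfies N^2 <= Q < 2N^2.
n = 2048
N_max = 2**n - 1                              # largest n-bit modulus
print((2 * N_max * N_max).bit_length())       # 4097 -> 4096 or 4097 qubits
```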

MarkH October 15, 2019 3:14 PM

@Dave C:

Yes, a perfect quantum computer — fully entangled, with long-term stable coherence and a negligible error rate — would need only 4096 qubits, and less than an hour.

Despite years of “gee whiz” techno-optimism about progress in quantum computing, the best public results are still orders of magnitude short of this lofty goal.

The point of the paper cited in Bruce’s post is, since nobody knows how to build the kind of quantum computer people have been fantasizing about, what might be possible with a really miserable crap quantum computer, which is perhaps more practical to build?

The authors’ answer is, a LOT more qubits might enable a shabby (but achievable) quantum computer to perform a useful computation.

SpaceLifeForm October 15, 2019 3:59 PM

@Clive

Sorry. I mis-parsed your point.

I realize you meant web or net of trust.

And I understand key-signing parties.

I was just trying to point out that we do not want any central authority.

Ex: PGP/GPG still rely upon DNS and CA’s.

Both allegedly de-centralized.

But DNS and CAs, neither trustable.

So, even via key-signing, the users are still using untrustable network components.

I want zero dependency on DNS or CAs.

Weather October 15, 2019 4:07 PM

X+Y
X52
Y
2
X-Y
Y+88
X/2
Y+Z
X*Z
Z-33
X-Z
X+Y
Out=X

Start
Tx=7f80
Ty=7f80
Tz=7f80
Tx+X
Tx+Y
Tx+(7f8052)
Ty+(7f80
2)
Tx-Ty
Ty+88
Tx-(7f802)
Ty+Tz
Tx
Tz
Tz-33
Tx-Tz
Tx+Ty

Out=Tx
Select byte to check 41
7f80-41. A
(7F80-41)52. B
(7f80-41)
2. C
(7f80-41)*2. D

A+B+C+D=E
Out-E
If you start with 41 for X it will give the answer Out

Sorry for the spam

SpaceLifeForm October 15, 2019 5:28 PM

@Marcos

"Public keys allow one to receive secure communication from random strangers, and to create N-to-N communication channels. Those are much harder to create with secret keys alone."

Very true.

But ask yourself, why should one trust a random leaker?

They may be legit, or they may be dumping misinformation.

And, N-to-N is not secure because metadata will reveal connections.

Clive Robinson October 15, 2019 8:21 PM

@ Bruce and others,

… so it’s possible that public-key cryptography will simply not be possible in the long run. That’s not terrible…

Hmmm… I’m not certain on that, it kind of depends on not just your point of view but in which direction you are looking…

There are three directions that are relevant: past, present, and future. I would suggest, from your slightly optimistic prognosis, that it is because you are looking into the "future", which leaves the issues of "past" and "present".

The old saying of "what is past is past" no longer applies; with the SigInt agencies' "collect it all" policies it now has the rider "and live with the future consequences" attached. This is especially relevant given the recent news about the FBI getting hundreds of thousands of search records from the so-called "702 databases" the NSA and others put together in a very unconstitutional way.

The FBI are obviously in the “no stone left unturned” mode for “easy pickings” where people have been incautious in what they did about protecting their communications. We have seen similar in the past where the FBI have taken over or gained illegal access to websites or gateway sites and used them to harvest user data prior to going after the users[1].

Apart from becoming a citizen of another country without extradition links to the US, and never returning to the US nor ever going to countries that have extradition arrangements with the US, I cannot honestly say that even quite minor misdemeanors are not going to come back and haunt people in some way. In theory the statute of limitations should apply, as would other limitations for those who were minors at the time. But recent FBI cases have shown that they will find some way of punishing people. The problem being that once the FBI start, as they run on public money, they see no reason not to keep doubling down until they have a conviction for something, anything, just to prove they were right to those that hold their purse strings[2]. Often their game plan appears to be bankrupting their chosen victim in the process, thus stripping them of their rights to a defence. Thus it appears that the FBI and DoJ are not about justice in anything other than the "show trial" manner of "might is right", and only back off when their victim has greater resources and every intention of using them publicly in what becomes in effect a game of high-stakes poker (Apple v FBI/DoJ).

However, the FBI does have limited manpower resources, and whilst politicians talk a good game on being tough on crime, generally that does not translate into money for more manpower, which some politicians regard as a funding black hole and as big a political hobby horse as crime, if not bigger.

Thus I suspect quite a few very minor indiscretions will not get prosecuted, simply because it’s going to be a “target rich environment”. But not prosecuted does not mean unpunished…

But what of the current "here and now", and how to protect yourself from the future attentions of the likes of the FBI? Whilst many public-key algorithms are very susceptible to Shor's algorithm, it's likely that many symmetric-key crypto systems will not be greatly affected by quantum computing unless some mathematician or engineer comes up with a new wriggle on Grover's algorithm. But what does "not be greatly affected" mean at the nuts-and-bolts level? Grover's algorithm does mean quite a significant speed-up of attacks against symmetric ciphers. But that can be offset by doubling the key size in certain ways, thus effectively countering the speed-up Grover's algorithm gives whilst only increasing the encryption algorithm's work factor linearly. Thus post-quantum symmetric cryptography may well not need to differ significantly from current symmetric cryptography except in the issues to do with key management (KeyMan).
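
As rough arithmetic only (it ignores the very large constant factors and the long serial runtimes real Grover circuits would need):

```python
# Grover searches a k-bit keyspace in roughly 2^(k/2) iterations, versus
# ~2^(k-1) expected classical trials, so the effective key length is halved.
for k in (128, 192, 256):
    print(f"{k}-bit key: ~2^{k-1} classical trials vs ~2^{k//2} Grover iterations"
          f" (roughly {k//2}-bit strength against a quantum attacker)")
```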

There are three basic ways of increasing key lengths:

1, Design a new algorithm.
2, Modify an existing algorithm.
3, Chain existing algorithms.

AES is regarded as "128-bit"; however, the original AES competition entry could work with 256 bits. Block ciphers designed around Feistel rounds can have the number of rounds increased without changing the individual round structure, but require the key-expansion process that makes the individual round keys from the master key to be modified appropriately. To get around the issue of modifying round-key generation, the notion of chaining ciphers together using different keys naturally arose. However, it does have issues, which is why we have 3DES made from the 1970s DES algorithm rather than the simpler idea of 2DES.

Importantly, the ciphers you chain together do not need to be the same. They only require that the ciphertext block size passed along the chain remains the same. This allows different algorithms to be used not just as a serial chain, which is a 1D vertical structure, but in a parallel 1D horizontal structure, or both, to give a 2D structure, and obviously in higher dimensions as well. The idea of 2D combinations has actually been seen in cipher designs where the Feistel rounds divide the data horizontally not just into two but into four or more parts.

Thus one thing people can do to increase their security is to chain ciphers.

However, as I've noted for several years, symmetric ciphers based on similar constructs can be fragile and lead to new attack methods (see the history of FEAL and differential cryptanalysis). Thus I personally would use ciphers that have fundamentally different base functions and structures, as it reduces the likelihood that an advance in cryptanalysis is going to have a significant impact on both cipher algorithms. You could further use multiple cipher algorithms in multiple structures, thus allowing which structure and which algorithm in which position to become part of the Key Material (KeyMat).
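
A minimal sketch of that kind of chaining, assuming the third-party Python "cryptography" package: AES (an SPN design) in CTR mode layered inside ChaCha20 (an ARX design), each under an independently generated key. Toy code only; no authentication and no KeyMan:

```python
# Chain two structurally different ciphers with independent keys.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def chained_encrypt(plaintext: bytes):
    k_aes, n_aes = os.urandom(32), os.urandom(16)      # AES-256-CTR key + nonce
    k_cha, n_cha = os.urandom(32), os.urandom(16)      # ChaCha20 key + nonce
    inner = Cipher(algorithms.AES(k_aes), modes.CTR(n_aes)).encryptor()
    outer = Cipher(algorithms.ChaCha20(k_cha, n_cha), mode=None).encryptor()
    ct = outer.update(inner.update(plaintext) + inner.finalize()) + outer.finalize()
    return ct, (k_aes, n_aes, k_cha, n_cha)

def chained_decrypt(ct: bytes, keys):
    k_aes, n_aes, k_cha, n_cha = keys
    outer = Cipher(algorithms.ChaCha20(k_cha, n_cha), mode=None).decryptor()
    inner = Cipher(algorithms.AES(k_aes), modes.CTR(n_aes)).decryptor()
    return inner.update(outer.update(ct) + outer.finalize()) + inner.finalize()

ct, keys = chained_encrypt(b"attack at dawn")
assert chained_decrypt(ct, keys) == b"attack at dawn"
```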

However, people need to remember that all you are building by chaining ciphers, whatever the structure, is a block cipher with a bigger key space. That is, all the failings of a block cipher still remain; thus you still would not want to use it in ECB mode, but in one of the other standardized modes for block ciphers (or other cipher types; see Ross J. Anderson's work on BEAR and other algorithms).

Which brings up the question of synchronisation. Cipher modes have issues when the encryption and decryption get out of step; thus you need to take precautions to prevent this, or ensure that any effects are not propagated far or indefinitely, requiring a restart or resend, which brings up all sorts of issues and attack vectors as well as leaking information.

The fact that synchronization has to be done opens up other possibilities. In the design of DES there is a "whitening" stage where the data is simply XORed with, in the case of DES, a fixed value. As you have to synchronize both ends, there is no reason why the whitening value should be fixed. That is, in effect you "stream cipher" not at a bit or character size but at a block size (it's also in effect what some of the block cipher modes do). You can take this idea further by not having a "fixed key" for the block cipher but an "evolving/changing key" for the rounds in the block cipher.

What you do, and how, again becomes part of the KeyMat. Thus simply by increasing your KeyMat size and the complexity of the resulting cipher, you lift yourself above other users. Thus they become the low-hanging fruit for the likes of the FBI to choose in preference to you, unless you give them reason to make you a "Person of Interest" (which is a more general OpSec issue).

Importantly though, as long as each individual instance of key use is unique, non-deterministic and kept secret, the symmetric algorithms, if chained sensibly, will remain secure over the likely period of a human life or that of most companies/corporations.

The problem with this is how you ensure KeyMat is unique, non-deterministic and kept secret. This job falls under the little-understood "Key Management" (KeyMan) process and its sub-processes of:

1, Key Generation (KeyGen).
2, Key Distribution (KeyDist).
3, Key Destruction (KeyDest).
4, Key Storage.
5, Key Audit.

Because each of these is actually a very hard process to get right in and of itself, even when all the pieces are in place, they are a subject not much talked about for systems that do not involve in any way the now-assumed-to-be-vulnerable asymmetric / public-key systems.

The primary problem is working out how to transfer an initial secret between two parties that wish to communicate. It needs some kind of "secure channel", but to get one of these you need to transfer a secret… Which is "turtles all the way down" if you only have a monitored communications channel available, which is generally the default case these days.

Thus you need another, or side, channel other than the monitored communications channel. Obviously this side channel has to be one that can be secured in some way. In the past this was done by person-to-person contact, where a code book or cipher schedule would be transferred on easily destroyed and not-easy-to-copy paper etc.

The problems with this can often be seen when people talk about “One Time Pad” (OTP) systems. Unfortunately what these discussions rarely make clear is that the only difference between OTP KeyMan and Symmetric Cipher KeyMan is the physical quantity of KeyMat involved.

It's why asymmetric / PubKey systems, for all their problems with multiple CAs and hierarchy security issues, were leapt on, because they fundamentally made e-commerce and "no contact" secret communications over a monitored communications channel possible.

Making oneself secure currently means losing the "no contact" element of PubKey systems. Thus anything you need to do privately requires a secure side channel for the initial secret transfer. Which, due to the "collect it all" policy of the SigInt agencies, cannot be done by electronic communications, or for that matter any method assisted by electronic communications. Which rules out getting on a plane or many forms of modern transport, including your private car. Likewise you cannot use a courier service like DHL, TNT, etc. or any other service that involves some form of "tracking".

Whilst the details are unclear, it appears that even postal services are now recording the destination details of letters and parcels against the entry point in their systems, to help them better optimize their service. These "third-party records" have no legal protection, and the length of time they are kept is unknown.

And in this age of "that will do nicely", where credit cards are seen as preferable to cash, this makes more third-party records we know get traded and kept effectively indefinitely, thus making maintaining yourself on a journey longer than a quarter of a day difficult unless carefully planned beforehand.

All of this kind of makes transferring secrets secretly more than a little difficult. Which is the sort of advantage the FBI and DoJ want over people[3], as do other governments and their entities, as it gives them power in various ways, most of which would be considered illegal if you or I were to do the same to others. We might complain about what the government of China is doing in the way of overt surveillance, yet we don't complain when global corporations willingly do the same and various Western "Five Eyes" government entities benefit covertly for free… So for our own safety we have to assume we are under significant surveillance all the time, irrespective of where we are, and more importantly we have no control over the product of that surveillance or when it will be revealed, if at all. Because the old adage of "what you don't know can't harm you" now belongs to an era long, long gone, if it ever really existed, and our futures can be decided by others. An obvious example is the "oh so secret" "No Fly" lists: if you are not allowed to fly, many advancements in occupation stop dead and whole career paths are closed off to an individual.

Thus how an individual gets a secret secretly to another person is a much wider security issue. One that is actually less difficult for criminal and similar networks that have already got their own “trusted courier” networks in place and have done since before “collect it all” became inflicted on the average citizen.

But I do know of technological ways to do these things, but they require both parties to participate. Which means either they have to be given a secret secretly, or there have to be communications channels in common use that cannot be monitored by others.

Tor was an idea to implement publicly available communications channels that could not be monitored. Unfortunately it was predicated on a couple of assumptions that are not true, and were not true when originally thought up. Also, due to the fact it needs PubKey systems to work, it is not going to work for people now either; it just delays the inevitable disclosure. That is, it will in no way be "post quantum computing secure", and as "collect it all" actually focuses on such traffic and its preservation, it will almost certainly be amongst the first communications looked at and stored in a "702 database" or equivalent. Thus in turn it will be looked at by the likes of the FBI and similar entities in all governments who can get their hands on either QC or the results of using QC on recorded Tor traffic.

Thus we really need post-quantum-computing-secure algorithms in place NOW. Not just to replace symmetric crypto algorithms, but asymmetric / public-key algorithms that give us the "no contact" ability to secretly transfer secrets, which enables us to have the privacy that society as we know it needs to have to survive.

We have seen what we will get without them, in those old East European nations and current tyrannies and police states that are proliferating and getting closer not just step by step but by the leaps and bounds of “Surveillance Capitalism”.

[1] I'm very definitely not using the term "illegal" here for the users of the systems, just for the FBI operatives and those they have coerced. Because in the case of many sites, for the likes of "gambling", they are quite legal where they are located (i.e. not in US jurisdiction) and, for by far the majority of the world's population, quite legal (if ill-advised) to use. Thus the FBI are gaining unauthorised, thus illegal, entry into these systems in sovereign nations where neither the US judicial system nor US government entities have jurisdiction. Further, under the DoJ and FBI justifications of locality, the FBI are breaking the laws of that sovereign nation. Thus if taken to court in those countries they would become criminals and subject to the tariffs the judiciary of that sovereign nation chooses to impose.

[2] It's hard to work out which is actually worse: a show-trial-seeking FBI/DoJ, or hard-on-crime politicians on appropriations committees. One thing is certain though: one heck of a lot of money is involved, and it's hard to work out from which pocket to which pocket it goes, or why. None of which gives any confidence in the "justice process" being equitable or for all.

[3] Importantly, the one thing people must get out of their heads is the idea that there is anything wrong with,

    Doing secret things secretly.

It is, after all, a good definition of what our "private lives" are, that is, "doing private things privately". Like technology, secrecy is agnostic to use, and without it society as we know it cannot and will not survive. Everyone has secrets, even if it's the fact they fart when getting up in the morning and their pet dog gives them a sideways look. If you think something has the potential to embarrass you or somehow diminish you in the minds of others, then it's something you wish to keep private, thus unknown to others, which means you are keeping secrets, not doing anything wrong.

Clive Robinson October 15, 2019 9:17 PM

@ Rodney Van Meter,

Time to pop up to your wheel house and ask a practical question.

A couple of decades ago I used to frequent a select part of UCL where a Quantum Computing group was setting itself up.

Back then they were talking about the support mechanisms for qubits occupying a little under a square centimetre of real estate each.

Obviously time has moved on, but the question of just how much real estate 20 million qubits and all their support mechanisms would occupy still remains. So what would be a fair estimate these days?

Are we still talking impossibly large aircraft-hangar size, or something you might just be able to squeeze through an office doorway if you took the door off its hinges first?

Because as an engineer I know it's the rather drab details of physical size, power consumption, environmental control and other areas of "ease of use", including that modern notion of UI, that are ultimately going to decide if quantum computing ever really becomes practical in the ordinary human sense rather than a laboratory curiosity.

Which leads into another aspect that can kill a technology before it's even got going, and that's "press stories".

In most press reports when they talk of QC devices, they mention them working near absolute zero, or liquid helium, or suchlike… The reality is the only time the average person is going to come into contact with cryogenic gases is when visiting the doctor to have a wart removed or similar cryo-surgery. Also, the only time they knowingly see it on TV is with shattering bananas or making instant ice cream. Thus I will predict they are not likely to want cryogenic gases, such as the increasingly scarce liquid helium, sitting under their desk just waiting (in their minds) to cryo-burn their toes off etc.

Or, as has allegedly happened with MRI machines in hospitals, where cryo-gases released in quantity have caused everybody's new expensive iPhones to stop working due to the gas getting into the hermetically sealed MEMS and other electromechanically active components[1].

Surprising to many these days, it was at one point touch and go as to whether cellular phones would happen outside of city business districts. In part it was the "microwave your brain" stories and similar, but it was also about the horrendous, industrial-disaster-looking base station equipment. Which, some will remember, gave rise to such equipment being "hidden" in fake trees, flagpoles and all sorts of other "out of sight, out of mind" techniques.

[1] https://hackaday.com/2018/10/31/helium-can-stop-your-iphone-maybe-other-mems-too/

Rodney Van Meter October 16, 2019 12:57 AM

@Clive excellent question.

The earliest superconducting and quantum dot systems required about a 19″ rack full of gear to control each qubit, using very general-purpose AWGs (arbitrary waveform generators, like an oscilloscope or digital signal analyzer in reverse). Nowadays, you can do it with a single FPGA plus a little analog hardware and filtering, about one 1U unit per qubit, and still improving.

SpaceLifeForm October 16, 2019 4:54 PM

@Clive

"But I do know of technological ways to do these things, but they require both parties to participate. Which means either they have to be given a secret secretly, or there have to be communications channels in common use that cannot be monitored by others."

This.

Human networking.

Alice knows Bob. They have personally exchanged cryptoids.

Bob knows Charlie. They have personally exchanged cryptoids.

Bob introduces Alice to Charlie.

Bob gives to Alice the Charlie cryptoid.

Bob gives to Charlie the Alice cryptoid.

All ‘gives’ over secure channel.

No forgeries.

Bob tells both Alice and Charlie that there will be an ‘intro’.

It will be up to Alice and Charlie to decide whether they comm or not.
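
A toy sketch of how such an introduction could be mechanized with symmetric primitives alone, assuming Alice-Bob and Bob-Charlie already hold pairwise secrets exchanged in person (the field names are made up for illustration; a real protocol also needs freshness and replay protection):

```python
# Bob introduces Alice and Charlie using only pre-shared pairwise keys and MACs.
import hashlib, hmac, json, os

k_ab = os.urandom(32)   # pre-shared in person by Alice and Bob
k_bc = os.urandom(32)   # pre-shared in person by Bob and Charlie

def make_intro(shared_key: bytes, cryptoid: dict) -> dict:
    body = json.dumps(cryptoid, sort_keys=True).encode()
    return {"cryptoid": cryptoid,
            "mac": hmac.new(shared_key, body, hashlib.sha256).hexdigest()}

def check_intro(shared_key: bytes, intro: dict) -> bool:
    body = json.dumps(intro["cryptoid"], sort_keys=True).encode()
    good = hmac.new(shared_key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(good, intro["mac"])

charlie_id = {"name": "Charlie", "fingerprint": hashlib.sha256(b"charlie-key").hexdigest()}
alice_id   = {"name": "Alice",   "fingerprint": hashlib.sha256(b"alice-key").hexdigest()}

to_alice   = make_intro(k_ab, charlie_id)   # Bob -> Alice: "this is Charlie"
to_charlie = make_intro(k_bc, alice_id)     # Bob -> Charlie: "this is Alice"

print(check_intro(k_ab, to_alice), check_intro(k_bc, to_charlie))   # True True
```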

Clive Robinson October 18, 2019 10:31 AM

@ SpaceLifeForm,

Human networking.

All the rage for Serious Organised Crime, Drug Lords, Terrorists, CIA and a few others such as those who are just plain cautious these days.

No, by

    But I do know of technological ways to do these things, but they require both parties to participate.

I was thinking of a work still in progress; have a search on this site for "Fleet Broadcast".

Doing anonymity from observers as well as privacy in message contents and message transmission/reception is fairly easy…

However it’s coming up with the initial anonymous “Rendezvous Protocol” without a nexus or root of trust that is causing me problems still…

SpaceLifeForm October 18, 2019 4:46 PM

@Clive

I did not say it is easy.

“All the rage for Serious Organised Crime, Drug Lords, Terrorists, CIA and a few others such as those who are just plain cautious these days.”

Those named players are obviously an issue. Likely all the same cartel.

So, who does one believe in?

Those they have met, or the mysterious groups?

Better to trust people one has met than to believe in fake news.

MarkH November 27, 2019 3:49 PM

@Jeff Hall:

Thanks for the link to an interesting article.

The notion of a hidden breakthrough is wonderfully seductive. Consider Roger Grimes’ delicious use of language: “Is it possible that a secret quantum computer team has obtained more stable qubits and has successfully accomplished the quantum crypto break and we just don’t know about it?” [my emphasis added]

I mean, a secret quantum computer team? How cool is that? Without effort, I visualize a vast underground laboratory dug into the bowels of a mountain, in the classic style of a James Bond villain.

The best thing about a secret-break hypothesis, is that it’s practically impossible to falsify at the time it is asserted. Years afterward, a survey of history may show that very likely there was no such breakthrough … but how can one prove that it isn’t happening right now?

Grimes offers “3 reasons the quantum crypto break might have happened.” With the disclaimer that my understanding of the physics and its application is nearly zero, here’s why I suspect that each of them doesn’t hold water:


Newer algorithms reduce the number of qubits needed — Grimes refers to alternatives to Shor’s algorithm which trade time for computer capacity. But time may well be more costly than size. Grimes offers the example of about 8200 qubits to solve a 4096 bit public key. I believe that this is estimated to need at least an hour for a flawless universal quantum computer; the alternative algorithms needing fewer qubits might need days to function.

If I understand correctly, it’s very difficult to get the small quantum computers of today to maintain the condition (coherence) needed to complete a computation for even a few seconds. If someone offers, “you can do this with less qubits, but you must maintain coherence for tens of thousands of seconds,” this may not be of any practical use.


Making more qubits should be getting easier — Grimes actually cites the example of Henry Ford!

If I understand correctly, making more qubits isn’t a big problem: the difficulty is that all of the qubits making up a quantum computer must be mutually entangled, maintaining a kind of intimate connection with one another, while being sufficiently isolated from about 10^80 other particles which make up the universe. Of course, isolation is always imperfect, causing the entangled system of qubits to decohere after a while (see above).

The challenge isn’t so much getting more qubits, as it is enabling a group of qubits to interact as if they were in their own separate universe, without disturbance from outside. It’s scaling up stable entanglement, which seems to be an extremely hard problem.


Experts are mysteriously silent on quantum computing progress — “In June 2018,” Grimes ominously warns us, all of the quantum computing scientists from whom he is accustomed to getting news “suddenly went silent.”

Grimes even saw “the public website showing progress on qubit counts,” in his words, “go quiet.” What does it mean, when a website goes quiet? Did it stop accepting IP connections? Fail to answer HTTP GET requests? Show 404 pages? Or did its reporting of the maximum number of achieved qubits simply stop rising?

He says that “some” of his sources have been “working on secret government projects.” It’s plausible to me, that those individuals are respecting some gag order from their employers. In World War II, I’m sure scientists who weren’t already working on secret projects were willing to be quiet about research they understood to be sensitive.

But in 21st century America, that academic researchers would “clam up” because of some secret government breakthrough seems impossible to me. (For one thing, imagine going to dozens or hundreds of professors and saying, “we’ve made an awesome secret breakthrough! You’ve got to shut up about your unclassified research which failed to achieve our earth-shattering milestone!”)

What the silence of Grimes’ sources means, I don’t know. Could they be tired of his pestering them for help in getting his new book completed?

But the absence of further increases in qubit counts probably has one or more very prosaic explanations. Here are a couple of suggestions:

  1. Scaling up quantum computers is extremely difficult.
  2. Maybe the present lines of development are looking like a dead end. If with great difficulty you can assemble a few dozen qubits which will decohere in less than 0.001 second, will adding 10 or 20 qubits more — with an even shorter coherence time for the more complex system — accomplish anything at all?

Here’s a modest perspective: perhaps before the white-coated quantum scientists in the underground lab start routinely factoring 2048-bit semiprimes, somebody will factor one of 1024 bits. Or maybe 128? Do I hear 64 bits? Forty anyone, who will bid 40 bits?


Grimes writes for us, “I’m not normally a conspiracy theorist.”

But maybe when you’re hawking a book titled “Cryptography Apocalypse” (for realz, I didn’t invent that title), a little conspiracy theorizing will sell a few copies more.
