Why Is the NSA Moving Away from Elliptic Curve Cryptography?

In August, I wrote about the NSA’s plans to move to quantum-resistant algorithms for its own cryptographic needs.

Cryptographers Neal Koblitz and Alfred Menezes just published a long paper speculating about the government’s real motives for doing this. These range from new cryptanalysis of ECC, to a political need after the Dual_EC_DRBG disaster, to the stated reason of quantum computing fears.

Read the whole paper. (Feel free to skip over the math if it gets too hard, but keep going until the end.)

EDITED TO ADD (11/15): A commentary and critique of the paper by Matthew Green.

Posted on October 28, 2015 at 2:11 PM

Comments

Lisa October 28, 2015 2:39 PM

In order to get the required superpositions of qubit states, will this be possible in linear or exponential time/power relative to the number of qubits? I am not aware that physicists have properly answered this question yet.

If it takes more energy than the sun, or trillions of years, to break a single RSA or ECC key with a quantum computer, then this exercise of migrating to alternative public-key algorithms is pointless.

One might as well work on designing cryptosystems that take into account time travel. (Not PFS, but dealing with brute-force methods whose results, after countless years, can be transmitted back to the past.)

r October 28, 2015 3:26 PM

@lisa, I think one might be hard pressed to label any EPR based phenomena as ‘not time travel’?

Am I mistaken about the known or unknown limitations of entanglement here? I mean, just because we haven’t found a way to measure or make proof backwards shouldn’t discredit the potential to find it – we do have seemingly instantaneous transmission of ‘data’ in these instances, don’t we? Plus, for all intents and purposes, the forward transmission of ‘data’ is standard fare in classical physics.

Scott October 28, 2015 3:37 PM

Could it be something more prosaic than ER = EPR bridges?

I’m guessing that, with the advent of free and easy certificates, the NSA is trying to move people away from using technology that would overwhelm their data storage capabilities. My guess is that if it becomes commonplace, they’re going to have to retain an exponentially larger amount of data for their totally-not-TIA-related activities.

Even the NSA is subject to funding issues, and the honorable Mr. Ledgett is preparing for the future.

Anura October 28, 2015 4:57 PM

@Jacob

They’re talking about fewer than 10 qubits. From what I’ve heard from random people on the internet, each qubit you add is exponentially harder to achieve than the previous one. If that is true, it sure doesn’t sound like we should expect quantum computers that can break ECC in the next 20 years. It’s possible the NSA is just being extremely conservative (they should be); we will be able to break 384-bit ECC with quantum computers long before we can break 2048-bit RSA or DH.

blake October 28, 2015 5:24 PM

As a reader of Neal Stephenson’s Cryptonomicon, I’m disappointed that section 5 skipped a possible motivation:

The NSA can break ECC but have just recently been breached by a hostile and competent nation state who, now having the keys, can also break ECC. Now the NSA wants to discontinue ECC in a manner that doesn’t admit they could break it all along.

The paper does say that the NSA having broken ECC is unlikely, but part of the justification for that is the “why now?” question about the changed recommendation.

Jacob October 28, 2015 5:43 PM

@ Anura

10 qubits today, but scientists appear to have cracked the scaling issue now. Ms. Marchenkova states that the science is done and now it is just engineering.
The Australian team’s breakthrough she referred to is about building a quantum logic gate in silicon, with related scaling efficiencies. That team believes it can have a commercial quantum computer in 5 years.

Nick P October 28, 2015 6:12 PM

[repost]

It’s a pretty straightforward situation for anyone who’s done high-assurance security long enough. I’ve already posted about problems and solutions in asymmetric crypto on many forums, including here. The risk was obvious even before anyone encouraged uptake of ECC, despite the laughs and counterpoints I got. So, here it is again.

Most important: watch what the NSA uses itself for the most critical stuff and what it has put into it. That was usually Suite A or B algorithms run through rigorous Type 1 requirements focusing on elimination of all protocol weaknesses (IPsec vs HAIPE), careful (often hardware) RNGs, minimal TCBs, enumeration of every state of the design with error states proven fail-safe, repeated at each layer, thorough analysis of the implementation for flaws, and EMSEC. That already says most stuff on the market is insecure, helped me predict many TAO attacks, and tells us what to focus on. But what about ECC itself?

They didn’t publish enough to reverse engineer that. So, here’s the possibilities.

  1. Classical algorithms are safe from classical attacks.
  2. Classical algorithms are safe from quantum attacks.
  3. “Post-quantum” algorithms are safe from classical attacks.
  4. “Post-quantum” algorithms are safe from quantum attacks.

At any point, their recommendations will reflect their beliefs on these for Type 1 at the least, and for Suite B if they’re not running a BULLRUN scam. Their best recommendation was Suite A or B symmetric ciphers with the FIREFLY protocol (a Photuris variant) in the product. That means they trusted both types of crypto when the implementation and configuration had the correct properties. This implied they believed 1 and 2 were correct, while giving us no data on 3 or 4.

Note: I didn’t trust ECC specifically at this point because assessing the security of the math wasn’t as clear as for RSA, etc. Plus, we knew from their fight with strong crypto that they couldn’t beat the prior stuff any better than public cryptographers could. So, why not use what’s proven? My motto is “tried and true beats novel or new” for high assurance. I recommended against ECC wherever possible, as the risk was unknown from my vantage point.

Recently, the NSA has advised against ECC (surprise!) and talked of the need for post-quantum algorithms. That means ECC is not safe from classical attacks, quantum attacks, or both. Carefully note that their saying it’s about post-quantum doesn’t really mean the risk is quantum: it could be a classical risk that existing or proposed post-quantum systems don’t have, with knocking out future quantum issues being icing on the cake. They’re known for misdirection on both defense and offense.

What we know:

  1. Classical algorithms, especially ECC, are potentially broken now or within some time frame.
  2. They’re neither really pushing nor fighting existing post-quantum schemes. They’re also researching more.

Note: the post-quantum schemes carry a risk of classical attack, as not enough analysis has been done in general and there have been some negative results. One needs both classical and quantum security. Their seeming to sit on the bench counts as a risk, unless someone has specific recommendations from them that I haven’t seen yet.

  3. They encourage strong symmetric ciphers.
  4. They’re still heavily funding mathematical methods and tooling that find flaws in protocols, algorithms, implementations, microcode, and hardware.

The security recommendations follow naturally from that and corroborate my older ones. The oldest (and best) was to use symmetric crypto (e.g. PSK, TTP, HSMs) wherever possible: the most security across the board. The next was obfuscation and computational complexity added to a proven, classical scheme if performance or cryptosystem choice were issues. Next was secret-splitting the key to be exchanged/signed across one or more each of classical-resistant and quantum-resistant algorithms. Most recent was splitting among as many algorithms as feasible with different paths of attack (e.g. RSA, NTRU, McEliece). Newest is a hybrid that uses asymmetric crypto that’s strong against classical attack, immune to quantum, and fast. I’ll probably try putting it into a journal or something instead of the failed strategy of forums, etc. It just combines some proven stuff, if you’re wondering: nothing new, as usual for real security.

In any case, there’s no reason for conspiracy theories. Just look at what depends on what in their claims. They’ve been worried about quantum attacks on the main algorithms, with evidence they might be achieved. There have been classical issues with at least one post-quantum algorithm. There’s also the general principle of assuming things are insecure until thorough peer review. Further, their defensive arm already told us (and contractors) to prepare to ditch the classical algorithms. They also reinforced that the symmetric stuff works.

Result: (a) using symmetric is strongest, (b) using main asymmetric is weak 1-2 ways, (c) using post-quantum might be weak 1-2 ways.

Easy conclusion: combine the three for best results, or at least the latter two with one classical and one quantum-resistant algorithm. Use their key sizes or higher, with a highly assured implementation of each. Looking at the foundation of their claims, one doesn’t need tons of pages of math or argument to understand the situation or solve it. It’s a “known unknown” where you can infer enough of the problem to solve it without further details. So, we should just try to implement the only known solutions while cryptographers work out the rest of the details over time and maybe invent asymmetric stuff that will last. I have my doubts, but wish our talented cryptographers the best…
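
For readers who want to see the shape of that last idea, here is a minimal sketch of combining a classical and a post-quantum shared secret into one session key, so an attacker must break both exchanges. This is my illustration, not Nick P’s actual construction; the placeholder secrets and the HKDF-style derivation are assumptions for the example.

```python
import hashlib
import hmac

def combine_shared_secrets(classical_ss: bytes, post_quantum_ss: bytes,
                           context: bytes = b"hybrid-kex-v1") -> bytes:
    """Derive one session key from two independent shared secrets.

    Both secrets enter the KDF in full, so recovering the session key
    requires breaking BOTH key exchanges, not just one.
    """
    # HKDF-extract style: concatenate the secrets, key the HMAC with a
    # context label, then do a single-block HKDF-expand to 256 bits.
    prk = hmac.new(context, classical_ss + post_quantum_ss,
                   hashlib.sha256).digest()
    return hmac.new(prk, b"\x01", hashlib.sha256).digest()

# Hypothetical inputs standing in for real outputs of, say, ECDH and NTRU:
ss_classical = b"\x11" * 32
ss_post_quantum = b"\x22" * 32
print(combine_shared_secrets(ss_classical, ss_post_quantum).hex())
```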

Gweihir October 28, 2015 6:28 PM

Fascinating reading. It shows off some of the fundamental uncertainties in evaluating the capabilities of an attacker and the cryptographic strength of mechanisms.

Incidentally, from following something like 25 years of quantum computing research, my impression is that either scalability will never manifest (breaking the idea), or maybe even the physical models will not hold up when taxed to the accuracy needed. So far, when measuring very exactly, all physical models have turned out not to be accurate.

When measuring very accurately in physics, we are talking about a resolution and accuracy of maybe up to 50-70 bits, as more is usually impossible due to noise and other factors. Even getting that far is a significant challenge. So, let’s say entangled particles only behave as quantum mechanics suggests up to 64 bits of accuracy; then any quantum computer powerful enough to threaten crypto goes right out the window. Do not forget that this is essentially an analog computer, and while analog computers’ speed can be impressive, their accuracy is not and cannot be.

MarkH October 28, 2015 6:31 PM

On QC:

I remain skeptical. I don’t know that there won’t be a useful quantum computer in 2025, but I wouldn’t bet more than a few bucks on it.

Consider the nuclear fusion power plant, which is fundamentally a much simpler problem than building a useful quantum computer. Its fundamental physical principles were well understood 76 years ago (that’s right, if the theory of fusion were a man, he would be elderly; based on US life expectancy, he would have died in 2001.)

For six decades now, with billions upon billions invested in fusion research, fusion optimists have been telling us that within a generation, fusion power plants will be providing us electricity at cost efficiency at least as good as technologies already in place.

The first fusion power system demonstration (neglecting cost, longevity, etc.) yielding net energy (creating more joules of heat than the joules needed to trigger the reaction) was achieved in … 2014! It took sixty-one years of costly effort to reach this point, and it is still far removed from practicality.

As of today, some with insight into the field suggest that fusion may NEVER reach the point of cost competitiveness.


Of course, QC doesn’t require astronomical temperatures and pressures. But it does require large numbers of particles to remain entangled for extended periods of time.

According to my reading, a QC running Shor’s algorithm on an n-bit factoring or discrete-log computation needs about 2n qubits. For n = 1024, the required coherence time was estimated at about 1000 seconds. Presumably, for recommended public key lengths of 2048 or 4096 bits, the needed coherence time will stretch into hours or even days.
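
A quick back-of-envelope check of those numbers (taking the ~2n-qubit figure and the 1000-second estimate for n = 1024 as given, and assuming, purely for illustration, that the needed coherence time grows with Shor’s roughly cubic gate count):

```python
# Assumptions: ~2n logical qubits, 1000 s of coherence at n = 1024,
# and coherence time scaling with Shor's ~O(n^3) gate count.
BASE_N, BASE_SECONDS = 1024, 1000.0

for n in (1024, 2048, 4096):
    qubits = 2 * n
    hours = BASE_SECONDS * (n / BASE_N) ** 3 / 3600
    print(f"n={n}: ~{qubits} logical qubits, ~{hours:.1f} h coherence")
```

That crude scaling lands at roughly 2 hours for 2048-bit keys and most of a day for 4096-bit keys, consistent with “hours or even days”.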

Researchers are claiming big strides in these areas, and who knows … maybe they will be a reality in a few years.

anonymous October 28, 2015 6:36 PM

What if it’s simpler than all that: they want “everyone” to upgrade to their post-quantum algorithms that have a nice hard-to-discover backdoor that’ll be ready “real soon now” (but not quite yet).

BTW, D-Wave’s latest machine has 1000+ qubits, but it’s not a general-purpose computer.

Ralph October 28, 2015 7:05 PM

My guess is that the motivation is mundane. They are simply being realistic about how long it takes people in the real world to change cryptosystems.
People are still deploying MD5 and god knows what other things that were broken in the ’90s.
Public infrastructure like the WWW is, I think, improving, but there is all the nasty one-off proprietary crap that will still be in use in 50 years’ time.

Daniel October 28, 2015 7:52 PM

Gweihir

I agree. I find the claim that a quantum computer is now “just an engineering problem” laughable. That’s like saying that swimming across the Atlantic Ocean is “just a biomechanical problem”.

So I’m skeptical until I see solid evidence otherwise.

MarkH October 28, 2015 8:22 PM

@anonymous 6:36 pm:

As you perhaps hint at, the D-Wave machines CANNOT run Shor’s algorithm.

Also:

a. D-Wave qubits have not been shown to be entangled, in a manner that convinces the research community. Even worse, it remains a debatable question whether their computations exhibit quantum characteristics AT ALL (the annealing process can be done by non-quantum systems).

b. D-Wave machines (so far) can perform only ONE computation, which essentially is to model their own structure … to offer a very unfair but nonetheless genuine metaphor, I could make a bi-metal spring and advertise it as “a computer that accurately computes the response of a bi-metal spring to changes in temperature”.

c. The ONE computation that D-Wave machines can do (annealing) is faster than a desktop PC computing a simulation of annealing, for sufficiently big optimization problems … but not by a very large factor, at the present D-Wave scale. When D-Waves get big enough, the speed-up will become significant. However, two caveats on this: (1) it isn’t known whether ordinary computer simulation algorithms can be greatly improved, thereby closing the gap; and (2) it’s possible that classical annealing systems (making no exotic “quantum” claims) can do just as well as a D-Wave.

Harry Johnston October 28, 2015 8:40 PM

@Lisa: according to Wikipedia, setting up the initial superposition “can be done by applying Hadamard gates to all qubits in the input register”, and it notes that “another approach would be to use the quantum Fourier transform”.

I have no idea what either of those actually mean, but the implication is that this is at most linear in the number of bits. 🙂
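
A tiny numerical illustration of why the cost is linear: one Hadamard gate per qubit (n gates total) turns |00…0⟩ into a uniform superposition, even though the state vector itself has 2^n entries. This is just a classical simulation sketch, not real quantum hardware:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard gate

n = 4                        # number of qubits
state = np.zeros(2 ** n)
state[0] = 1.0               # start in |0000>

# One Hadamard per qubit: n gates total, i.e. linear in the qubit count.
for target in range(n):
    op = np.array([[1.0]])
    for i in range(n):
        op = np.kron(op, H if i == target else np.eye(2))
    state = op @ state

print(state)  # every amplitude is 1/sqrt(2^n) = 0.25 for n = 4
```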

Harry Johnston October 28, 2015 8:46 PM

@r: no, if we assume QM to be correct it can be proven mathematically that there is no time travel or faster-than-light communication involved in EPR phenomena. It isn’t just “we don’t know any way that can do it” it’s “we know there is no way to do it”.

See also xkcd on Bell’s Theorem.

Jonas October 28, 2015 9:55 PM

MarkH, re: “I don’t know that there won’t be a useful quantum computer in 2025, but I wouldn’t bet more than a few bucks on it”–QC even by 2050 could be a problem for NSA, because they sometimes need to protect classified data decades into the future. If the adversaries of that time could break crypto on classified circa-2025 documents, it would be considered a security failure.

And I can reach a different conclusion from your fusion anecdote: that scientific progress is extremely difficult to predict on that timescale. Compare our 6 decades of progress on transistors to the early predictions.

Nicholas Weaver October 28, 2015 10:12 PM

One equally interesting thing is that the same update to Suite B added RSA and DH (3072-bit) as approved under Suite B.

I don’t know if this was just to remove the need to transition to the Suite B elliptic-curve algorithms or some other reason, but it’s interesting.

Nick P October 28, 2015 10:13 PM

@ MarkH

I think you’re mapping one problem onto another that doesn’t match it. The issue with quantum computers is whether one can be built at all. The issue with fusion is whether a process that consumes extreme amounts of energy can be made to produce more than it consumes. Quite different.

The better comparison would be between people describing fusion and building it in useful ways. Personally, I don’t think there’s any real comparison between the two, but that would be closer. We’d probably do better looking at the difficulty of getting computers to work versus the effort invested, and then comparing that to what’s being put into QC right now.

MarkH October 29, 2015 4:20 AM

@Nick P

Well, this is getting into quite a tangent — but I think they have more in common than you suppose.

Your characterization, that the “issue with fusion is if they can make a process that consumes extreme amounts of energy produce more than it consumes,” applies only to recent years.

This is a place of comparative ease and luxury, which fusion R&D reached by way of milestones that were conceptually very simple, but enormously difficult to reach.

It took decades of costly research and development to get any Tokamak (toroidal magnetic confinement) to produce even relatively low temperature plasma currents.

The problem of keeping the plasma stable for more than a fraction of a second (the highly unstable plasmas are extremely prone to diverge and hit the walls of the toroid) took an enormous amount of R&D to solve. I remember a time when it seemed unclear whether this hurdle could be surmounted at all.

It took decades of costly research and development to reach fusion ignition temperatures with both magnetic confinement and laser approaches.

Fully forty-three years elapsed between the invention of the Tokamak and the first time fusion occurred inside of one. I can promise you that net energy production was not the big worry during those decades.

These tall mountains had to be climbed first, before these systems achieved any fusion at all, no matter how tiny.

Many Very Hard problems had to be solved before the one at the top of the list became the generation of a reasonable net surplus of energy.

To my mind, the enormously difficult problems of generating super-high temperatures and pressures, or stabilizing plasmas, are somewhat kindred to the problems of creating large networks of entangled qubits, and achieving the super-long coherence times required for many practical quantum computations.

Conceptually, getting a plasma to millions of degrees, or stabilizing a multi-million ampere current inside a magnetic doughnut, or getting an assemblage of thousands of qubits to cohere for hours at a time are all supremely simple.

Making them happen in the Real World (TM) seems to be kinda hard. In theory, theory and practice are the same. In practice, they are different.

Jacob October 29, 2015 6:02 AM

@ MarkH

It’s a bit unfair to compare QC to Fusion development without considering the allocated budget.

Some baseline data:

  1. The ITER project is estimated to cost $20B over its 25-year or so life. (Wikipedia)
  2. The NSA’s annual budget is estimated at about $10B. The NSA’s Bluffdale center alone is estimated at $4B for infrastructure + HW + SW. (Wikipedia)
  3. The Apollo program cost $109B-$180B (various formal estimates per Wikipedia, in 2010 dollars).

Two observations:

  1. If QC is labelled “necessary for national security”, the budget allocated for developing a working machine could easily surpass the expenditures for the ITER project (per recent posts, it is now just an engineering issue…) and get one produced in a few years.
  2. Had J.F. Kennedy declared a run for commercial fusion instead of a “man on the moon”, I think we would have seen some commercial fusion reactors by now.

Also, look at the effort and money invested in the Manhattan Project (5 years’ duration, $26B in 2015 dollars) for a reference. Put the same into QC development and see the results.

z October 29, 2015 8:08 AM

My guess is that this has nothing to do with ECC. I think that the NSA wants to backdoor quantum-resistant crypto and believes it can get away with it now. QCs and quantum-resistant cryptography are still very new fields and it is safe to assume that the NSA has a significant head start on civilian researchers. Now is the time to subvert standards.

Who? October 29, 2015 8:15 AM

This raises a lot of good and proper questions, indeed.

On one hand, a goal of the National Security Agency is breaking codes, so they would like the (unclassified) world to move away from nearly unbreakable crypto. What happened to its Suite B program may be just this. On the other hand, they know the math is strong and that it is usually easier to attack implementations. So the first step in breaking a secure communication channel is attacking the endpoints (by means of software bugs, backdoors or, as a last resort, physical implants), not the cryptography itself.

My fear is that ECC is secure and they are trying to get the world to withdraw it, in the hope that weaker, less-studied cryptography is adopted. It would not be the first time the NSA has tried to get more than one concurrent open door to the information it wants (see for example Google collaborating with the NSA under the PRISM program while, at the same time, the NSA was listening to the unencrypted communications between Google datacenters [1]).

[1] Another good example of how careless Google is about security.

Curious October 29, 2015 8:51 AM

Thinking about this as an initiative-based kind of problem, with the NSA trying to tilt conditions to favor them:

Not being a scientist or anything, I wonder whether it could be argued that an industry-wide discontinuation of ECC would make it unlikely that the industry keeps researching ECC to perchance find flaws in today’s EC crypto, so that, if there ever were a damning flaw in EC crypto, such a catastrophic flaw would never become known.

That in turn makes me wonder if there is anything about elliptic curve crypto, the math or otherwise, that is perhaps such an offshoot (from other things) that nothing will likely ever be based on ECC again (whatever that might be; I don’t know much about ECC, or the math).

Greg October 29, 2015 10:45 AM

Sigh… Saying a quantum computer is “just engineering” ignores the fact that the engineering is the hard part in the first place. It really is 2x harder for every qubit: 2x more likely to have decoherence issues, while at the same time you need to do more operations on the larger qubit register. Yes, I did work in the field, and my friends still do. It is more likely that the NSA has been visited by aliens that can crack ECC than that they have a 100-qubit computer or are 10 years away from one.

QCs just are not like classical computers in any way. On an 8-bit classical computer I can easily multiply and divide numbers much larger than 8 bits. Not so with a QC: 100 qubits is totally useless for factoring a 101-bit number. The entanglement is very hard to do without decoherence. Currently there are quantum error-correcting codes you can use, but then you need 3-100x more qubits (yes, really, up to 100x more, depending on who you believe).
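
To make that overhead concrete, a toy calculation using the ~2n logical-qubit figure quoted earlier in the thread and Greg’s 3x-100x error-correction range (both rough assumptions, not engineering estimates):

```python
# Physical-qubit estimate = logical qubits (~2n for Shor, per the
# thread above) times an assumed error-correction overhead.
n = 2048                 # RSA modulus size in bits (assumption)
logical = 2 * n
for overhead in (3, 100):
    print(f"{overhead:>3}x overhead: ~{logical * overhead:,} physical qubits")
```

Even at the optimistic end, that is roughly 12,000 physical qubits, against the single digits demonstrated at the time of this thread.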

QC at the scale needed may in fact be impossible in this universe. We aren’t about to start putting that on grant applications now, are we?

D-Wave is not a quantum computer in this sense. The bits are not entangled.

It has been shown and proven countless times: information cannot be transmitted faster than light. No spooky non-locality changes that (really, Einstein just didn’t like quantum physics more than anything), and this has been shown many, many times.

The most likely reason is that perhaps there is a new class of weak curves or something like that. Perhaps the class of weak curves is rather large, i.e. making ECC no better than DH, etc.

Curious October 29, 2015 11:15 AM

Oh, lol, I think I’ve done it again. With regard to my last comment: I did some thinking before I decided to write it, and totally forgot to state that I was simply assuming that the NSA had somehow undermined the security of elliptic curve crypto.

Hideous October 29, 2015 12:40 PM

Koblitz and Menezes do not mention my favorite hypothesis, which is:

The NSA is more interested now in attack than defense and believes that more ECC deployment will strengthen defenders against NSA attack. In this hypothesis, the NSA is not actually aware of any particular or pending weakness in ECC, but it does have many attacks against deployed RSA-based systems. After all, there are weak keys for various reasons (like bad PRNGs), along with usage, protocol-design, and implementation flaws (like weak RSA padding), and the NSA has trojaned, subverted, backdoored, and sabotaged many deployed devices and cryptosystems. Upgrading to ECC means replacing a lot of stuff. The NSA doesn’t want to lose any breaks it has now and doesn’t want the effort of creating new ones. By deprecating ECC it extends the status quo, which favors its current goals.

Anura October 29, 2015 1:31 PM

Dropping 256-bit curves in favor of 384-bit and adding 3072-bit RSA is not consistent with the NSA looking to weaken security.

Henry Hoyt October 29, 2015 1:31 PM

May I offer for your consideration that a sense of perspective might be useful?

Every social interaction has a maturity curve associated with it. It isn’t static. It isn’t immutable.

Linear algebra in a social setting that is largely illiterate appears to be “magic” to those without similar educational opportunity.

Caesar’s cipher migrated from wonderment to pedestrian as math abilities matured to the point that our modern 6th graders can handle it with their limited resources today. (Even with our US 6th graders, after that scathing article on further competence regression compared to the rest of the first- and second-world nations!)

Every technology has a life.

And the assumptions about the world that were made at the time of any technology prove to be sufficiently incomplete or inaccurate in retrospect. (Look at the recent mission results showing our understanding of the formation of the solar system to be nearly orthogonal to reality, with the discovery of oxygen at comet 67P by Rosetta.)

I would proffer that it is the ability to continually adapt, improvise, and overcome that creates the opportunity. The immutability of any solution has been consistently shown to be erroneous. We need to be assessing whether a technology or capability has, or will have, a useful life, and then whether that life, and the risk, are worth the investment.

Rome built fortress walls (twice) in Britain. Nearly nomadic people found ways through them. Rome abandoned Britain. The English built castle fortress walls. Then the Welsh dug underneath, and fortress castles were abandoned. Tally sticks and currency turned out to be superior, for a time.

Once, the US telephone system used a series of frequency inversions and randomized transformations that were considered “magic” for a couple of decades. By the end of WWII, it was clear new methods would have to re-establish the isolation that SIGSALY was supposed to have accomplished: a room full of equipment that couldn’t do what a multicore-processor cell phone with gigabytes of memory can do today.

Practical application work was going on with quantum technology in the early ’80s. (Primary research precedes “first articles” of application research.) Entanglement looked like “magic” back then. We quickly began to discover limits to its use, as with the string theory that was initially used. MagiQ tried an early commercialization; it was simply not competitive against the other alternatives.

Moore’s law gave us a perception set about which we all have a nearly subliminal understanding today. And that was just on the silicon-based technology that Intel’s Moore knew.

Only ambitious public servants, pursuing limited funding in hopes of having an infrastructure that shows a return within a career (their) lifetime, persist. Keeping something protected for 50+ years is ambition. One technology that will accomplish that protection across that large a life-cycle is simply not a reasonable expectation, let alone at a scale of yottabytes.

Good fun for the NSA to think they will have a repository of data to mine for 100+ years. But their assumptions about geologic stability, semantic integrity, media bit rot and more show little reflection on the wisdom Heisenberg illuminated for us.

Look how long the library in Alexandria lasted.

Never follows a blood line. Never follows wealth. Never follows exclusive access to repositories.

Mesoamerica had base-20 math and made models of solar-system and universal mechanics that we, with our base-10 math, had yet to “discover”, though it was already known. Fortunately… the conquistadors and their priests burned every codex possible so that that wisdom could not be built upon.

Let’s first set reasonable expectations. Alchemy didn’t work before. Nor will the conflation of technology into a magical process of transformation, creation, or combination.

Cynical October 30, 2015 6:02 AM

The NSA has always attacked crypto. Somewhere on this site you can find the weird seeding of NIST standards that would seem to create a backdoor. At the same time, the NSA was pushing everyone to use the intentionally weakened random number generator.

When was the last time the NSA did something to strengthen crypto? Oh, yeah, they tweaked DES … so that everyone would trust it and then when it looked weak they said “just run the same math 3 times and everything’s good” (because that will protect you from everyone but US).

Not to mention all the anti-crypto/pro-backdoor nonsense floating around Washington.

According to the above, RSA is broken and ECC is safe. Of course, for the first time ever, maybe the NSA isn’t lying, which would mean that ECC is busted and RSA is safe.

Either way, the only safe path is to use neither: there’s tons of alternatives to both RSA and ECC. But hey, it’s up to you… how much do you trust your government?

Greg October 30, 2015 6:36 AM

You can’t bust ECC and not RSA, since both rest on discrete-log-type problems. We just know a better way to solve the DL problem over the fields used in DH than over elliptic curves. There are weak curves that can be “mapped” to those same fields, and that makes such ECC as easy to break as DH.

Also, solving DL can factor numbers as well. So again, RSA, DH and ECC are linked.

And we have never proven any of these problems to be hard. In fact, we have not even proven the existence of trapdoor functions. We mostly believe that they are hard problems and that P != NP.

So yeah, plenty of space for good work. The NSA comes to university conferences these days, and not just to recruit, either.

Also, the public-key systems that don’t use factorization have always suffered from large key sizes. However, these days that is just not the problem it once was.

In the Shadow of October 30, 2015 1:01 PM

Greg wrote:

Yes, I did work in the field, and my friends still do. It is more likely that the NSA has been visited by aliens that can crack ECC than that they have a 100-qubit computer or are 10 years away from one.

Thank you for sharing. I know what that is like. My field is very different from that one, however, so your comments are very interesting.

It has been shown and proven countless times: information cannot be transmitted faster than light. No spooky non-locality changes that (really, Einstein just didn’t like quantum physics more than anything), and this has been shown many, many times.

What is your take on this recent experiment?

http://science.slashdot.org/story/15/10/21/197224/quantum-theory-experiment-said-to-prove-spooky-interactions

I believe you, but just curious.

MarkH wrote:

On QC: I remain skeptical. I don’t know that there won’t be a useful quantum computer in 2025, but I wouldn’t bet more than a few bucks on it.

And a few posters, besides the authors of the paper, pondered “what is really going on in their minds”…

Devil’s Advocate now writes:

I would not be surprised if the NSA had evidence that some secret information protected by powerful encryption was stolen, and could not confirm that no manner of hack was involved. That would leave far-flung possibilities open to them, like “maybe they have a quantum computer”, in such a way, and with such seriousness, that this literally was the only conclusion they could come to.

My two cents.

I realize that to 99.999999% of the readers that is preposterously implausible.

MarkH October 30, 2015 1:42 PM

@Cynical:

An apt commenter-name! At the risk of getting factual:

The NSA has always attacked crypto.

Yes, that is one of NSA’s core missions.

intentionally weakened random number generator

Yes, with this caveat: the backdoor was designed in such a way that NSA could exploit it, and nobody else. So this is not the same as weakening-in-general.

When was the last time the NSA did something to strengthen crypto?

Hard to answer, because in any specific case it depends on the existence or absence of secret NSA attacks. In general, there seem to be many examples of NSA strengthening crypto. Their OTHER core mission is to protect US national secrets, and in this role NSA promulgates standards that must protect against best-available attacks in order to fulfill that mission.

Oh, yeah, they tweaked DES … so that everyone would trust

Absolutely 100% flat wrong. NSA tweaked DES to make it stronger: this is established beyond doubt. The tweak was based on an attack (differential cryptanalysis) that was not publicly known at the time, and NSA did not disclose it, so the change was an unexplained mystery. This did NOT make everyone trust it; instead it engendered widespread skepticism and suspicion that the change had weakened DES. Before Snowden, this was perhaps NSA’s worst trust crisis in its history.

When DES looked weak they said “just run the same math 3 times and everything’s good” (because that will protect you from everyone but US).

The cryptography behind 3DES is sound, and does not offer any known opportunity for a backdoor. The best publicly known attacks are still infeasible, even with state-scale resources. The security level (112 bits, due to meet-in-the-middle attacks) is below current recommendations, but it will probably remain secure for decades to come.
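
Rough arithmetic behind “infeasible”, assuming (hypothetically, and generously) an attacker performing 10^18 keyed operations per second:

```python
# A 112-bit security level means on the order of 2**112 operations.
ops = 2 ** 112
rate = 10 ** 18                       # hypothetical exascale attacker
years = ops / rate / (3600 * 24 * 365)
print(f"~{years:.1e} years")          # about 1.6e+08 years at that rate
```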

there’s tons of alternatives to both RSA and ECC.

According to my surveys of the field, there are not many alternatives for public-key cryptography. These alternatives have various limitations:

• some of them are also insecure if a practical quantum computer ever comes into being

• none of them has had the kind of intensive crypto-community vetting required to ensure that there isn’t some built-in weakness — this is very important

• some of those that are hoped to be ‘QC-proof’ are heavily encumbered by intellectual property constraints

The last limitation means that anybody who makes an implementation without paying the license fees to the IP owner is breaking the law, and that it will never be adopted by the GNU/Linux community, which is such a large part of Earth’s communication infrastructure.

Clive Robinson October 30, 2015 2:04 PM

@ in the shadow of,

If you are referring to this,

http://m.phys.org/news/2015-03-quantum-einstein-spooky-action-distance.html

I would be thinking about what the possibility of “splitting a single photon” would mean for Quantum Key Distribution rather than Quantum Computing.

I have a friend, whom I very occasionally see these days, who knows rather more about this than I do. Some years ago I had a thought experiment about sending one of a pair of entangled particles into a black hole to get information out.

My friend pointed out that, depending on your viewpoint, either information must be able to travel faster than c or it must effectively travel backwards in time, and the loophole was that information in effect has neither energy nor mass in our classical and quantum viewpoints. That is, it’s like the issue with the double-slit experiment…

It’s one of the reasons I’ve said for some time that information is only constrained by the laws of physics when it is impressed/modulated onto matter/energy for communication, storage or processing by our systems, which are constrained by the laws of physics.

MarkH October 30, 2015 2:28 PM

@Greg:

“Solving DL can factor numbers as well. So again RSA, DH and ECC are linked.”

For the sake of clarity (though it doesn’t affect your main point):

It’s true that solving DL can factor numbers … but not in general. A couple of years ago in a comment thread, I got “spanked” by a mathematician for my misunderstanding of this.

The most efficient known algorithms for computing DL depend on the group in which the problem is set. RSA moduli can be efficiently factored, given an efficient algorithm for computing discrete logs in the multiplicative group modulo a semiprime.

Where the math guy corrected me (in the context of discussing news in discrete logs), is that because DL algorithms depend on the group, a “break” in one group may be meaningless for a different kind of group.

That being said, there may be substantial overlap between EC DL and semiprime DL; I just don’t know. If anybody has insight on this, please chime in!
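
To illustrate the reduction in miniature: if you can learn the multiplicative order of elements mod N (which an efficient DL algorithm in that group would give you), you can split a semiprime. The toy below brute-forces the order, standing in for the oracle, which is of course not efficient; it only shows the algebra of the final step.

```python
from math import gcd

def order(a, n):
    """Brute-force multiplicative order of a mod n (stand-in for the
    discrete-log / order-finding oracle; NOT efficient)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_semiprime(n):
    for a in range(2, n):
        g = gcd(a, n)
        if g > 1:
            return g, n // g            # lucky hit: a shares a factor
        r = order(a, n)
        if r % 2 == 0:
            y = pow(a, r // 2, n)       # a square root of 1 mod n
            if y != n - 1:              # nontrivial root => factors
                p = gcd(y - 1, n)
                if 1 < p < n:
                    return p, n // p
    return None

print(factor_semiprime(3233))           # (61, 53), since 3233 = 61 * 53
```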


PS I am truly grateful for your perspectives as a “QC insider”.

MarkH October 30, 2015 3:42 PM

On faster-than-C signalling …

I don’t know how this relates to quantum computers, because I don’t pretend to understand them.

However, it is a basic theorem of quantum theory that although widely separated particles can be instantaneously linked by entanglement — implying an infinite “speed of propagation” of the relationship between the particles — this effect can never be used to communicate information faster than c (speed of light in a vacuum).

Falsifying this theorem would require falsifying at least one of several quantum laws or relations. Until an experiment disproves one of these laws, we can be sure that superluminal signalling is impossible.

https://en.wikipedia.org/wiki/No-communication_theorem

Wael October 30, 2015 4:24 PM

@MarkH,

The topic was discussed briefly here. @Dirk Praet replied on the same thread here

@Anura,

Dropping 256-bit curves in favor of 384-bit and adding 3072-bit RSA is not consistent with the NSA looking to weaken security.

A valid point (but maybe only on the surface).

Harry Johnston October 30, 2015 4:44 PM

@In The Shadow Of: the results of that experiment were exactly as expected, i.e., they provided additional evidence that QM really does work in the same way as our models of it. It can be proven mathematically that this does not mean that we can use entanglement to send signals faster than light or backwards in time.

In some (unpopular) interpretations of QM, it may mean that the Universe itself sends some sort of information (but only in the broadest possible sense of the word) around faster than light. It still isn’t possible for us to make any use of it. (You might put it this way: God can travel faster than light, but that doesn’t oblige him to allow us to do so!)

MarkH October 30, 2015 11:08 PM

@Wael:

Thanks for the links.

I had forgotten the appalling ignorance and brutality of Dirk’s ideas concerning Russia’s aggression against Ukraine, as expressed in the same thread.

Clive Robinson October 30, 2015 11:31 PM

@ Wael,

I’m guessing that in general you take an astronomical view of the speed of gravity, but you forgot to add a link to this article,

http://www.metaresearch.org/cosmology/speed_of_gravity.asp

Which you have given in the past…

@ Mark H,

You might want to read the above, especially what it says about the “Speed of Gravity” and the experimental evidence that goes along with it.

@ Harry Johnston,

You might be interested as well, because you might have to change your view on “God”-like delegation 😉

Wael October 31, 2015 2:17 AM

@Clive Robinson,

but you forgot to add a link

I gave a link to a link to a link, but I should have given the direct link to avoid thread crosstalk and contamination. Lesson learned, I “guess”.

@MarkH,

Dirk’s ideas concerning

Dirk has a view that differs from yours; that’s all there is to it. I think he’s an intelligent and well-educated fellow. I see a lot of opinions here that are diametrically opposed to what I perceive to be “correct”, but I also understand that people have different backgrounds, education, culture, different ways of thinking, different experiences, different foods they eat, …

Gerard van Vooren October 31, 2015 2:56 AM

@ Wael, MarkH,

Dirk’s ideas concerning

I think the major difference here is whether you were raised and live in the EU or the US. People in the US don’t know what it is to actually undergo war, yet for ages the people of the US have been subjected to the politics of foreign aggressiveness, which requires “patriotism” and unquestioned loyalty (Christianity helps a lot), thanks to the lucrative and influential MIC racket.

Wael October 31, 2015 3:14 AM

@Gerard van Vooren,

I think the major difference here is whether you have been raised and live in the EU or the US.

It’s one of the differences, because people raised in the same region have varying opinions as well. It also depends on how Europe is run! Is it heaven or is it hell? 🙂

MarkH October 31, 2015 5:14 AM

I don’t want to clog the thread with off-topic material, so I’ll make this my last comment about international affairs.

  1. Within living memory, most of Europe’s states have suffered hostile invasion by foreign armies bent on imperial conquest. Who then is better equipped than Europeans to understand the irretrievably depraved criminality of such actions?
  2. Dirk’s own homeland used the protection of its “racial/cultural brethren” from fictional danger and oppression, as a pretext to wrench territory from its peaceful neighbors (including staged “votes” on annexation), also within living memory. It is terribly disturbing to me that some Germans are eager to frame a precise replay of one of their own national crimes, in any way that minimizes the ghastliness of the act, or deflects responsibility from the military aggressor. The Germans I have known are relentless in their condemnation of any revival of Germany’s crimes from that era.
  3. Reasonable people can disagree about anything and everything. However, making statements whose factual falsity can easily be ascertained is not reasonable disagreement — it is ignorance and irresponsibility. I have no doubt of his extraordinary intellect, which makes the regurgitation of crude Kremlin lies even more grievous.
  4. Not wishing to itemize errors — they are too numerous — I offer one example. Some have suggested that a state freely entering into treaties in accord with the will of its people, is somehow equivalent to the annihilation of state sovereignty by the threat or actual use of armed force. Is that a reasonable equivalence? Does it show any respect for human rights and dignity — or is it a brutal instrumental calculus that reduces human beings to the level of ball bearings?

God help us!

  5. If I started opining on Dirk’s academic papers, anything I might say would be complete drivel, because I wouldn’t know what the fck I was talking about. Intelligent people have the capacity to assess whether we know what we’re talking about. I am absolutely not an expert on any subject at all … even so, I have devoted some time to trying to learn and understand recent conditions in the former Soviet countries. I have visited both Ukraine and the Russian Federation more than twenty times each. I don’t mindlessly parrot pants-shttingly stupid claims about these countries originating from ANY political bloc, government or organization.

We debate security topics all the time — reasonable people disagreeing.

If someone posts that he’s just invented a cipher which nobody can break, we can be very confident that he is (a) deeply ignorant, and (b) culpably ignorant, because he is almost certainly intelligent enough to learn and understand his foolishness. This crosses the line of reasonable disagreement.

Well, the foregoing are the opinions of one fool, anyway. I’m sure that reasonable people will disagree.

MarkH October 31, 2015 5:18 AM

@Clive:

When I followed that link, my “crackpot” alarm bells rang more loudly with each paragraph. (I dunno, do Brits use the term “crackpot” for a crank making scientific-sounding claims?)

The author was a genuine professional astronomer, who became famous for the eccentricity of his beliefs.

Whatever merits his ideas on gravitation may have, they certainly have not been accepted by the community of physicists. Analyzing them is beyond me …

Clive Robinson October 31, 2015 7:39 AM

@ Gerard, Wael,

It’s quite a bit more than just which side of the puddle you are on.

Firstly, as has been noted before, “You cannot give people democracy, they have to take it” and “All nations have to go through their middle ages to forge a nation”. Whilst it might sound cruel and heartless, history shows there is a degree of truth in it.

Cecil Rhodes and his cronies divided up Sub-Saharan Africa with a very deliberate intent. He drew up a map of the existing tribal borders and then formed countries of 2/3 of one tribe and 1/3 of a tribe they were hostile to. He then put the 1/3 in power and controlled them by providing them with just enough arms etc. to keep the other 2/3 subjugated. I’ll let you look up the history to see what effect that had and is still having.

Ukraine was not and is not a nation of one people with a common outlook. It was and still is an artificial construct, not unlike what Cecil Rhodes was up to. The people are very far from a common outlook. History indicates there are two ways it can go: if left alone it can become a nation of one people; if not, it will become two or more separate nations.

Either way can happen by civil insurrection or peacefully, but it has to be the choice of the people living there; they are not children and do not need to be led by the hand. They need to do it their way or no nation will be forged.

Ukraine unfortunately has a number of issues. Firstly, it has an uneven distribution of natural resources, some of which other nations hunger to control, such as energy resources. It also has the misfortune to be strategically placed with respect to two superpowers and directly adjacent to a nascent superpower federation.

The problem the US has is that it’s now effectively an isolated nation. It forged itself by fighting its southern neighbour, and forged a new and initially hostile nation to its north by attacking those who were mostly its children. Having eventually arrived at effectively peaceful relations with its neighbours a century and a half ago, it had little use for the rest of the world and turned inward into “splendid isolation”.

Unfortunately, two things happened that changed its outlook. The first was the discovery of oil in Texas. This made a fundamental change to the way the US worked and boosted industry and commerce to very high levels compared to its previous state, and to that of the rest of the world. It had the resources, for its then small population, for everybody to have a fair chance of becoming wealthy by their own means.

Back in Europe, things were different. People had little chance of becoming wealthy by their own means, wars were frequent, and kings lusted after their neighbours’ resources and frequently went to war on any pretext to get them. Some nations had empires and used them to obtain wealth rather than fight with their neighbours. For a while Europe became more peaceful and started to develop industry and commerce. Unfortunately, the industries set up to enable empires to be built did not want to die out, so they started selling their premium product to those with the money to buy. One big area of this was armaments manufacturing. With hindsight it becomes obvious that if you give people the tools of war, they are going to pick them up and use them, especially if they are sold the idea not just of an easy victory but of a profitable one. From the manufacturing nations’ point of view, selling armaments abroad subsidized their own armed forces and kept them at the leading edge of technology.

Then things started to go wrong. Treaties, not just of mutual protection but of arms limitation, were signed, but too little, too late. Over in the East, where the wealth difference between rulers and ruled was getting ever greater, political ideals and harsh repression caused a civil war. Other nations, out of self-interest, started to prepare for civil insurrection, and other nations saw opportunity. The pressure cooker burst; what was a minor incident became an excuse, warfare started, and what became a major European war quickly spread out into the empires and came knocking at the Americas. The US found it could not ignore the world any longer, because the world would not let it. It became the First World War, and the US had the upper hand due to having a thriving industry and the raw resources to drive it; but most importantly, it was not at risk of invasion or damage to any of its onshore infrastructure.

Likewise with the Second World War: the US people by and large did not want to be involved, but some could see where things were going; many were not politicians, but a few were. The US was once again dragged into a war which again did not directly affect its industrial base, but its population was hit hard. At the close of the war the US initially retreated back into isolationism, but wiser heads realised that the same would happen again unless steps were taken. There was quite a lot of support from US citizens for the plight and strife of a war-torn Europe, much of which was still under occupation. Various charities started sending food and aid parcels, and the London Olympics brought home the message of just how desperate things were compared to the standard of living in the US. The wiser heads prevailed, and the likes of the Marshall Plan started the rebuilding of the western half of Europe.

Unfortunately, the East was occupied, and much had changed over in the Far East. Nearly all industrial nations had had their infrastructure destroyed and had seen significant political change, unfortunately replacing one set of tyrannical rulers with new tyrants in many places; the old order was gone and a new one threatened the peace. The big difference was that science had added significantly to the arsenal of weapons, and what were once considered weapons of attack became weapons of defence-by-threat, which the RAND think tank named Mutually Assured Destruction. The stalemate that was the Cold War settled in as each side improved its weapons out of fear that the other side would get a sufficient advantage that the MAD doctrine would fail.

But MAD had other effects: it damaged the US in ways that did not become apparent to many for quite some time. The US was forced into a position that scared it. It had not seen a hostile boot on its home shores in a hundred and fifty years; now other nations could rain down terror and destruction that the US had never in any way experienced in many generations. Put simply, it was scared, but it was not people or nations that scared it; it was technology. It’s a fear the US still suffers from.

But other issues were happening. People came back from war to discover they were not heroes for more than a day or so; many were unemployed nobodies, and disparity soon arose, which gave rise to civil unrest. It came close to civil war, and the old political guard were all for making the same mistakes that the old European despots had made. It’s not entirely clear how repeating the past mistakes did not get to the point of civil war, but be it by new political views, social change, self-belief or more available wealth, it was avoided.

Unfortunately, the US had another hidden problem: the resources it needed could no longer be met by its own domestic supply, and its population was rising dramatically. The US could no longer be inward-looking; it had to deal with the world as a state even though the nation of its people wanted to remain inward-focused.

The problem of enemies within and without, and the need to change the national focus, caused a horrendous change of outlook to occur. The US allowed the war hawks a long leash; thus the MIC became a juggernaut by applying science to develop the technology for the weapons the US feared it needed. The US was spending twenty times the amount of any other nation on arms, and the only way it could sustain that was to fall back on the same mistake the late-19th and early-20th-century European states and their arms manufacturers had made. The US was effectively swapping weapons for resources and subsidising its own military development. The US was making the toys of war, and as with most toys, they had to play with them to keep the game going.

The problem is that the side that’s winning does not want to stop playing, especially when it’s not their doorstep they are crapping on. The problem is the game requires enemies both without and within…

So yes, parts of the US are actively seeking out conflict one way or another; the trouble is that those chickens come home to roost. Which we are now seeing with the likes of China, which is now having to turn from inward-facing to outward-facing. In many respects the Chinese grievances are the same as those of Japan a century or so ago. The problem is the US no longer has distance as a defence. Both China and its wayward satellite (which the US war hawks hate so much), North Korea, can put payloads in space. These payloads can be of peaceful intent, or not…

Either the US needs to change its game or face the consequences, because it’s not just little old Russia rattling the bars; they have to think not just about China and North Korea, but India as well. And I suspect it will not be long before Pakistan develops long-range or space-capable delivery systems.

The US is not “the world’s policeman”; it never was. To be in that position you need to be respected by those you are there to protect, which also means suffering their oversight. The rest of the world has little respect for the US these days; most view it as morally, fiscally and politically bankrupt, and it’s clear that the US will in no way suffer the oversight it inflicts on others. Thus, if not policemen, then what: bullies, tyrants, warmongers? You take your pick from personal and national prejudice.

Clive Robinson October 31, 2015 9:32 AM

@ Mark H,

First look at the date on the paper, then check out what his argument is about.

From his arguments and practical experiments, it has been shown that the assumption that the effects of gravity are, in effect, applied instantly on objects near and far holds, in line with an earlier form of relativity that is not Einstein’s.

Thus the argument is: is LR or SR nearer the truth? Both produce the same major results and many of the same minor results. So is one right and the other wrong, or are they both wrong, and do we have to take the next step towards what we hope is an accurate model of our universe?

We have been at this scientific crossroads before, when we adopted the quantum view of the world. To slightly misquote some physicists, “the quantum guys belong to the ‘shut up and calculate’ union”. For some time now, some of the more exceptional thinkers have been trying to step outside the quantum box, because it limits the viewpoint.

If you make the assumption astronomers do, of gravity acting effectively instantly so that your calculations agree with the experimental results, then you get into an awkward place with regard to what you mean by “information”.

For instance, we believe that our universe is finite but expanding; thus the number of ways you can represent information with physical objects is likewise finite, but we have no reason to believe information is finite or limited to encoding on a physical object. To see this, consider two physical objects a known distance apart: the way we choose to measure that distance is both information itself and defines the information encoded in the distance. Change either the measure or the distance and the information changes (which raises the question of whether either is continuous). The number of distances between n particles is 0.5(n^2 − n), which goes up by n for each particle added.

Therefore I find it helps, when thinking about information, to assume that it “exists” outside the limits of our physical universe; thus view it, outside the laws of physics, as some kind of uniform entity which occupies every place at the same time, that is, with no locality. Another reason to think this is the independent discovery of information, where two entities come up with the same idea without knowledge of each other, and it is only later that they are found to have had the same idea. There is no reason to suppose that any human knowledge is unique to humans; thus the observations, and the information, that gave rise to the characterisation of gravity could have happened at any time, at just about any place in our universe.

You then consider that information can be impressed or modulated onto physical objects or forces that are constrained by the laws of physics, for the purposes of communication, storage and processing, by which it becomes useful knowledge within our physical perspectives, underlying axioms and models.

7 Truisms October 31, 2015 10:02 AM

7 Truisms of the NSA:

After Snowden, all NSA leaks have been eliminated. The spies have spies spying on the spies.

None of NSA’s actual progress with QC is now or will ever be publicly known, though smoke will continue to be blown through carefully planned back channel “leaks”.

The actual total budget of the NSA is known only by itself, and is greater than the GDP of most 3rd world countries.

NSA, as it has in the past, will continue to steer mainstream users away from secure algorithms into “new” secure ones it has had a hand in creating or manipulating.

The NSA currently has employees in key positions, working to slip back doors or novel vulnerabilities into all "open source" projects of relevance at this moment, including important commercial ventures as well.

At least 1 of the preceding comments to this article has been written by an employee of the NSA.

NSA has and will continue to infinitely justify its "we are above the law" actions, based in no small part on the fact that it has been given de facto permission ("the wink") by every congressional session and president.

I rest my case.

Figureitout October 31, 2015 10:49 AM

7 Truisms
None of NSA’s actual progress with QC is now or will ever be publicly known
–My understanding was that the NSA was actually behind scientists in Europe (Germany I believe; it was discussed here), so they're probably spying on them and copying them (pathetic). Never a good position to be in, where you have to spy and copy to stay relevant (cf. China); it means you're at the mercy of people who know what they're doing.

In the Shadow of October 31, 2015 11:10 AM

… to all and/or anyone, on the Unknown, theories, odd present, past, futures… and on implausible secrets…

@Clive Robinson, Harry Johnston

Thank you for your responses. Very much not my field, and to a lesser degree not a field which I have looked into much yet, but it is looking like about time to do so. I did get into theoretical physics as a youngster but decided against that life course to go into hacking and intelligence instead. (Family, you know. Parents and their will.)

@Henry Hoyt, Blake

Thank you for throwing some good doubt and open mindedness onto the fire. Whenever there seems to be a strong consensus over an Unknown, this is most healthy until that Unknown is fully made “tangible”, known.

@all

Enormously fascinating thread. Excellent posts all around. This certainly brought some interesting folks out of the forest.

@MarkH

Especially your posts. Exceptional intellect.

Though, :-), it is a bit of a head scratcher when one places a TM after "Real World", lol, i.e., "Real World (TM)"… I hope whoever holds a patent on the Dream of the "Real World" does not put us in this situation:

The last limitation means that anybody who makes an implementation without paying the license fees to the IP owner is breaking the law, and that it will never be adopted by the GNU/Linux community which is such a large part of Earth's communication infrastructure.

On this:

We debate security topics all the time — reasonable people disagreeing.

If someone posts that he’s just invented a cipher which nobody can break, we can be very confident that he is (a) deeply ignorant, and (b) culpably ignorant, because he is almost certainly intelligent enough to learn and understand his foolishness. This crosses the line of reasonable disagreement.

Well, the foregoing are the opinions of one fool, anyway. I’m sure that reasonable people will disagree.

I think you yourself, like most of the posters on here, are an exceptionally intellectually talented individual. So you know what it is to live a very rarefied existence. Put another way, that very rarefied existence is a highly improbable existence.

All such people also know what it is to be, therefore, quite separated from the "mainstream" or "masses". Throughout your entire life you have been privy to many matters which you are aware are very rarefied. For intellectuals, this often means matters which one simply thinks of or understands. Though, in reality, this is little different from the experience of others with very rarefied lives.

You know what it means for people to be wrong. That they are so certain, and so certainly incorrect. That so much of what you know to be true, you could not even begin to explain.

There are other classes of human beings with such rarefied lives, of course. As lonely as such existences can seem. There are the super wealthy, the super powerful, the very famous… there are those who have had lives of unparalleled and unusual adventure… those who have had just the briefest brushes with impossible fortune or fate. The win of an ever so rare lottery ticket.

It is true, one can never know for sure what goes on in the lives of others, for one can never know for sure even one's own life. Not from a distance anyway. Though, coming close, one does see and gain understanding. Not unlike approaching a strange planet, which looks so different from afar.

How often is a person truly astonished? How often is a person able to say that they have been overcome with surprise, or known deep shock? Not the little wonder of coming across something interesting, but the sort of great wonder and awe of the vastly impossible. The extraordinary kept from the eyes of all evidence of possibility? Scour the history books, scour the news: no trace of it anywhere, hidden from all eyes.

For the intellectual, everything should be reasonable, and we can often approach that which can not be explained reasonably as if it is foolish. We can become very accustomed to the foolish. People operating irrationally, without purpose, without conscious deliberation. Coming to strongly held beliefs which they have never challenged in the least. All so often they believe according to what they want to believe, by emotion.

And they have an arrogance about that, which is primely frustrating for those who actually bother to try and work so very hard to come to strongly held beliefs only by reason.

It is work. It is diligence. It is due diligence required. It provides strong reward for those who do it. Yet all about us we see those who skip it all. We continue at it regardless, because we know we are as susceptible as they. We know the reward, and we know the punishment for failing it. Some may cheat and take their reward from foolish delusion, but sooner or later they will hit their wall and come to terms with reality. Then they will have to pay their debt, substantial as it may be, for not having done their homework.

But.

There are times, plenty of times, when answers simply are not forthcoming.

When the reasoning simply is not going to be explained.

While it is certainly admirable and chivalrous to explain one’s reasonings, it is not a required necessity in all settings and in all times.

The wise, or those who consciously reason out matters others ignore, might chafe at this. They might chalk it up as foolishness then. And quite often they will be correct. But very often they will also be incorrect in this surmise. They will have fallen into the error of failing to suspend judgment. Of forgetting the very fact that just as they so often simply cannot explain things to those beneath them, intellectually, so too are there surely ample matters which cannot be explained to their own selves.

For this can be the error of being in such a rare situation. As rare as one's life can be, there can be rarer. As unknown as one's life is to others, so unknown may others' lives be to one's own self. As high as one's intellect may be over so many, so too may there be intellects higher than one's own. And as often as one comforts oneself with the "knowledge" that one already sees this, truth be told, one would not. It is an illusion to see peers and those in other fields and say, "They are smarter than I", or "they are as smart as I", when you know you could just as easily have their knowledge if you tried.

And then there is the frustrating unknown. The error of having answers for so much, and not being able to understand the little things for which there is no answer. Or grappling with just how many answers have strong reasoning behind them… but realizing that confidence in those answers being correct, and not a fool's errand, an edifice of plausible but incorrect answers, is, too, an impossibility. For all that we know, we never know enough.

MarkH October 31, 2015 2:36 PM

@Clive:

I did note with particularity, the date of the article.

I also noted that one of the arguments it put forth — in my judgment, the most elementary — is that if gravity propagation is as slow as c (or indeed, not effectively instantaneous), then orbits must be unstable.

While it is beyond my analytic skills, probably thousands of physicists around the world could readily confirm or invalidate this claim.

Think on it for a moment — proof that finite propagation velocity for gravitation means unstable orbits would be a (literally!) Earth-shaking result, and requires no costly experimental apparatus or procedures, only a chalkboard.

If it were true, it would be headline news throughout the world of physics … n’est ce pas?

MarkH October 31, 2015 3:31 PM

@Shadow:

I blush to read your compliment — it is generous, and indeed disproportionate.

The main reason I read comments on this blog, is that there are plenty of Very Smart People who participate. Dirk exemplifies this — an academic conducting cutting-edge research in computational methods. I don’t pretend I could do the kind of work he does. I would need long and serious study to understand the questions his work addresses — let alone his answers to them!

In rooms like this, I’m never the smartest guy, or even notably beyond the mean.

When you’re not the smartest guy, you can make some compensation by intensity of focus. My nature is to focus on what is genuine — what is authentic. In my profession of engineering, this is massively useful … but also expresses my deep respect for Truth.

So mainly, I ask questions.

Suppose that some speaker X asserts a proposition P.

• How does X come by that knowledge?
• What do I know about X’s personal integrity?
• … cognitive biases?
• … personal stake in distorting the subject of P?
• … previous record of authenticity in the domain to which P belongs?

• Does P make sense?
• Is P as simple as possible, or needlessly complex?
• How is P consistent or inconsistent with well-established knowledge and principles?
• If P seems inconsistent, can I find a logical resolution for the inconsistency?

You can think of other questions to add to this list.

To the extent P is a general rule or principle, counterexamples prove its falsity. If P is asserted as a “law”, then a single counterexample annihilates it.

The central work of Gottlob Frege's career was the construction of an elaborate systematization of mathematical logic — one of the great intellectual projects of the turn of the previous century. Just as Frege was about to publish the second volume of his system, he got a letter from Bertrand Russell, a younger fellow working in the same field, who essentially asked, "what about the class X of all classes that don't contain themselves?" (If X is not in X, then X must be in X …) This paradox essentially demolished Frege's system (though Frege's work remained a very important contribution to mathematical foundations).
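
In symbols, Russell's construction is:

    R = \{\, x \mid x \notin x \,\} \quad\Longrightarrow\quad R \in R \iff R \notin R

Either answer to "does R contain itself?" entails the opposite, so no consistent system can admit the class R.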

In the 1930s, a conference of physicists from the length and breadth of Germany convened to denounce the wrongness of Einsteinian relativity (for Nazi scientists, “Jewish” science must be wrong). Reportedly, when a colleague told him about this gathering, Einstein said, “if my theories were wrong, one physicist would be enough.”

So, I go about with my little prospector’s hammer, slowly and patiently tapping on the rocks, trying to distinguish iron pyrite (unsound assertions) from gold-bearing ore (assertions with explanatory power that correspond to discernible reality). With a little luck and God’s blessing, maybe I get it right a little oftener than wrong.

ianf October 31, 2015 6:51 PM

@ Mark H, you wrote: […] “In the 1930s, a conference of physicists from the length and breadth of Germany convened to denounce the wrongness of Einsteinian relativity (for Nazi scientists, “Jewish” science must be wrong).

Neither a physicist nor a historian of physics, but with a keen interest in the history of science, I must nevertheless confess that I have never heard of any such Deutschland-wide gathering in the 1930s. There were plenty of local "events" at universities, when Jewish scientists were purged after passage of the Nürnberg Laws and other edicts, but one spanning the length and breadth of Das Reich? Not mentioned in the top two online hits of a Google search for "german scientists denouncing einstein jewish science conference" keywords, which dovetails with my earlier knowledge of it.

    There was one such conference in 1922, when "Aryan Physics" was promoted as an alternative to "Jewish theories" in the fervor of those chauvinistic and revanchist times. But 10+ years later, with the Nazis securing their power, and Einstein and others already out the door (read: freeing up academic positions that "young scientific Turks" gladly occupied), what would be the point?

I hate to have to go down that road, but let's not get carried away: German physicists were not stupid to that degree, which is what made their work on fissionable materials under the Nazis so despicable. (Conversely, we should be grateful that the Nazi state overseers were so narrow-mindedly stupid when it came to funding projects that would materialize only after 1943, by which time victory in Europe was as good as assured… hence, e.g., the denial of a very modest sum to the electronics pioneer Konrad Zuse for his reed-switch-based binary Z-1 computer in 1942, much faster, far more reliable (no overheating valves!) and cheaper to run than anything built prior to circa 1950. Go see it in the Munich technical museum, the fortuitously unfulfilled promise of Nazi science. Yes, there was a Plankalkül formal programming language to go with it, not for the faint-hearted.)

Back to you, Mark H: what's the source for that claim of yours, if other than anecdotal?

Nathan October 31, 2015 6:55 PM

Firstly, regarding faster-than-light communication (superposition based or otherwise): If you can do it, you don’t need to break cryptography. You can already get very rich, very quickly and legally, by arbitraging small movements on financial markets.

Secondly, I’m curious why there hasn’t been more discussion of NTRU for post-quantum public-key cryptography. I just spent an hour or two reading up. The encryption component of NTRU is apparently quite solid, and has been standardized. For signatures, there’s been a persistent series of attacks based on leaking information over repeated signatures, and ways to “patch” them by introducing noise to the signatures haven’t had a good track record.

One solution could be to use a hierarchy of progressively shorter-term keys, so that no one key makes “too many” signatures, but all are traceable to a root key (which could then be used to get a certificate).
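
A minimal sketch of that hierarchy, assuming a generic signature scheme; Ed25519 from the Python `cryptography` package is used purely as a stand-in for whatever post-quantum scheme one actually trusts, and names like `ROTATE_AFTER` and the bundle layout are made up for illustration:

    # Two-level signature hierarchy: a long-lived root key certifies
    # short-lived "leaf" keys; each leaf signs at most ROTATE_AFTER
    # messages before being retired. Ed25519 is a stand-in scheme.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey, Ed25519PublicKey)
    from cryptography.hazmat.primitives.serialization import (
        Encoding, PublicFormat)

    ROTATE_AFTER = 100  # retire a leaf key after this many signatures

    class Signer:
        def __init__(self):
            self.root = Ed25519PrivateKey.generate()
            self._new_leaf()

        def _new_leaf(self):
            self.leaf = Ed25519PrivateKey.generate()
            self.leaf_pub = self.leaf.public_key().public_bytes(
                Encoding.Raw, PublicFormat.Raw)
            # The root key signs rarely: only to vouch for a new leaf.
            self.cert = self.root.sign(self.leaf_pub)
            self.count = 0

        def sign(self, msg: bytes):
            if self.count >= ROTATE_AFTER:
                self._new_leaf()
            self.count += 1
            # Bundle: (leaf pubkey, root's cert over it, leaf's signature)
            return (self.leaf_pub, self.cert, self.leaf.sign(msg))

    def verify(root_pub: Ed25519PublicKey, msg: bytes, bundle):
        leaf_pub, cert, sig = bundle
        root_pub.verify(cert, leaf_pub)  # root vouched for this leaf
        Ed25519PublicKey.from_public_bytes(leaf_pub).verify(sig, msg)

Verification raises an exception if either link in the chain fails; a real design would also bind expiry times into the certificate and handle revocation.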

Signatures also don’t need so much future-proofing, since in many applications they’re verified shortly after being made.

MarkH October 31, 2015 8:05 PM

ianf:

You got me there … I have no evidence of such a conference. I wrote above from my memory of an article I read in the Bulletin of the Atomic Scientists more than 30 years ago, in which a scientist recounted anecdotes to illustrate Einstein’s sense of humor.

So (a) I probably recalled quite foggily (don’t know a convenient way to find the BAS article), and (b) the anecdote-teller might himself have confused some of the details.

Having "looked it up online," the story is actually about a 1931 pamphlet titled "Hundert Autoren gegen Einstein" (A Hundred Authors against Einstein). According to Wikipedia, the pamphlet contains no anti-Jewish verbiage, and only 6 of the listed authors either publicly held anti-Jewish views or were connected with Nazism. There was only one physicist on the list, along with three mathematicians.

I can't authenticate the quotation itself, or who recorded it. According to Remigio Russo's "Mathematical Problems in Elasticity", Einstein told "the press" (in response to a question about the pamphlet, I imagine) that had he been wrong, one author would have been enough.

In one version, Einstein’s reply to the press was, “Why 100 authors? If I were wrong, then one would have been enough!”


The other example of Einstein’s humor I recall from the old Bulletin article was, “To punish me for my contempt of authority, Fate has made me an authority myself.”

This remark from Einstein’s 1919 “Was ist Relativitäts-Theorie?” shows his deep sense of irony:

Noch eine Art Anwendung des Relativitätsprinzips zum Ergötzen des Lesers: Heute werde ich in Deutschland als „Deutscher Gelehrter“, in England als „Schweizer Jude“ bezeichnet; sollte ich aber einst in die Lage kommen, als „bête noire“ präsentiert zu werden, dann wäre ich umgekehrt für die Deutschen ein „Schweizer Jude“, für die Engländer eine „Deutscher Gelehrter“.

Translation: Another kind of application of the relativity principle, for the delight of the reader: today I am described in Germany as a "German scholar" and in England as a "Swiss Jew"; but should I one day come to be presented as a bête noire, then conversely I would be a "Swiss Jew" for the Germans and a "German scholar" for the English.

Here, Einstein implicitly refers to his growing celebrity in response to public awareness of his scientific accomplishments.


This book chapter:

http://www.bibliotecapleyades.net/ciencia/nscience/nscience01.htm

in presenting the career of an “Aryan physicist” in the increasingly anti-Jewish Germany of the 1920s and 30s properly illuminates the development of the mirror-image Nazi notion of “Jewish physics”.

MarkH October 31, 2015 8:27 PM

@Nathan

As far as I know, NTRU is the leading candidate for a public key system expected to be resistant to quantum computer attacks.

What I don't know is whether NTRU has faced the kind of analysis and attack, by the tiny community of cryptographers who have the know-how to do so, that would offer the world a degree of confidence that NTRU doesn't have some latent weakness.

If you have found any information about the depth of NTRU’s vetting, please post it here. If NTRU is eventually able to get the “seal of approval” from people like Bruce and his expert colleagues, it would be a great relief for those worried about the possibility of quantum computer attacks.


The longevity of signatures is a rather tricky question.

I agree with you, that the lion’s share of signature verifications take place right away.

But if you can break a digital signature, then you can forge messages and documents. With respect to certain contracts and other legal and financial documents, it COULD be useful to a criminal to produce a document from 50 years ago, saying that Jones (now perhaps conveniently dead) signed this (will, transfer of property, power of attorney, etc.).

For this reason, some cryptographers consider that the longevity of signature security is actually more critical than that of encryption security.

Anura October 31, 2015 9:02 PM

@MarkH

Recently NTRUSign was completely broken, although NTRUEncrypt is still unbroken. A more recent innovation is ECC using what they call "supersingular curves": it uses a Diffie-Hellman-style key exchange, and thus could also provide PFS, and can also be used for signing. I'm not sure whether it can piggyback off of existing ECC cryptanalysis, but it does have the advantage of not being patented.

https://en.m.wikipedia.org/wiki/Supersingular_Isogeny_Key_Exchange

Nick P October 31, 2015 10:16 PM

@ Anura

The 2012 attack still takes a few thousand signatures IIRC. So, one could use a key for 100 or so signatures, then generate, sign, and send a new one. Or is there a more recent attack that doesn't have that window of safety?

Anura October 31, 2015 11:30 PM

Well, maybe "completely broken" is a poor choice of words, as there are still use cases, but another scheme is necessary as a general-purpose post-quantum replacement for signing with RSA/DSA, and supersingular-isogeny-based ECC is one potential alternative. I also seem to remember that Daniel J. Bernstein had a practical hash-based signature scheme that looks promising.

Clive Robinson October 31, 2015 11:51 PM

@ Mark H,

Whilst there are thousands of physicists in the world, none have yet made an authoritative statement…

The problem is that Newton was very definite about there being no time delay in his theory of gravitation [F = G(m1·m2)/r^2], which agrees with observation; Eddington likewise was definite on this. But Einstein's Special Relativity did not agree with observation, so he went away and came back with General Relativity, which does "sort of"…

The problem is where the energy goes… That is, GR says that energy must be lost in the form of radiation.

So we have a theory of gravitational radiation, thus gravity waves, which so far nobody has measured, for various reasons.

Gravity waves are a bit of a problem in that they take almost unimaginable (in human terms) energy to create, but… due to their quadrupole nature they are very, very difficult to detect, as they have only minute effects on physical objects (think of holding a squash ball between finger and thumb of one hand, and likewise but perpendicular with the second hand, while you alternately apply and release pressure out of phase with each hand). Further, they are of very low frequency, down around one Hertz, which has two effects: firstly, trying to make a lens would be infeasible, the wavelength being at least 10^8 metres; likewise any practical-sized detector is going to be so insensitive that we cannot detect any test signals, even if we could make them…

The last time I looked, the physicists were waiting on the construction of three very-long-baseline laser interferometers, sufficiently far apart that time-delay cancellation can be used to remove interference effects such as earthquakes, to which the interferometers are very susceptible.

So if and when the interferometers are built, we wait for some event in the energy range of a supernova to happen sufficiently close to our solar system that a detector we cannot adequately test might pick up the gravity wave, which we can then check against optical or radio telescopes. The problems with that are: if gravity waves are slower than c, then we will see the event before it arrives at the interferometers. If they are very fractionally faster, then the interferometer signals will be gone before we see the event, which means the possibility of false correlations. If, however, they are more than about 10% faster than c, then we might not see the supernova until a lifetime or two after the interferometer signal. Which, with the untestable nature of the interferometers, would require several close-by supernovae to happen before definite conclusions can be reached… I suspect that neither you nor I nor our children will still be alive by that time…

Further, you might find information on gravitational radiation a bit difficult to find. As far as I've seen, there is nothing outside of the very technical, highly theoretical bodies of work, and of those there is only one of note, and it's not exactly current, being from 1987. You will find it in "Three Hundred Years of Gravitation", edited by Hawking and Israel, and published by Cambridge Univ. Press. You might find it slightly pricey to buy (the copy I saw had an RRP of around 180 USD), and it's not easy to find even in university libraries (though illicit PDFs are bound to be up on the Internet somewhere).

Clive Robinson November 1, 2015 12:01 AM

@ Nathan,

Signatures also don’t need so much future-proofing, since in many applications they’re verified shortly after being made.

Err, for the backwards and forwards of electronic messages that might be true.

But in the commercial world of contracts, signatures are often supposed to remain good for twenty to nine hundred and ninety-nine years on land contracts such as mortgages and leasehold properties.

Wael November 1, 2015 12:15 AM

@Clive Robinson,

Another thing I have problems with is the "slower-moving clocks" on objects that are moving. The idea that if someone travels in a rocket at close to the speed of light, then comes back to earth, he would have aged at a slower rate is ridiculous for one reason: if the frame of reference were changed, then the traveler in the rocket would also measure the clocks on earth to be running slower. Then the people on earth would have aged at a slower rate. 🙂

I am aware of applications of time dilation and SR in real life, such as in GPS. But getting older "slower" sounds illogical to me.

Wael November 1, 2015 1:08 AM

@Clive Robinson,

You might find it slightly pricey to buy (the copy I saw had an RRP of around 180 USD), and it's not easy to find even in university libraries (though illicit PDFs are bound to be up on the Internet somewhere).

I’m tempted to buy it, but I’ll wait until someone posts a link to a “cheaper” PDF. Learned my lesson 🙂

I suspect that neither you nor I nor our children will still be alive by that time…

Since science depends on measurements that we can perceive with our five senses, and you brought up the interesting "condition" that some phenomena take place over a period of time spanning generations, we have to come to the conclusion that we will never learn everything 🙂

Harry Johnston November 1, 2015 2:03 AM

@Wael: that’s a very old chestnut, so there are plenty of detailed explanations out there. But the short version is that you’re overlooking the fact that the person in the rocket ship experienced acceleration and the person who stayed home didn’t, an asymmetry between them that can’t be eliminated or reversed by changing reference frames.

As it happens, I wrote a long answer to a related question a few years back. It isn’t very rigorous, but it may be of interest.

Wael November 1, 2015 3:02 AM

@Harry Johnston,

I’ll look at the WordPress link you provided when I’m more awake — thank you.

Wael November 1, 2015 3:10 AM

@Clive Robinson,

For instance we believe that our universe is finite but expanding,

We aren't even sure that the universe is finite.

ianf November 1, 2015 3:10 AM

Thank you, Mark H. As I said, history-of-science-wise I am not an Einstein-ist but a generalist, so I needed to find out if I might have missed something of that kind. (I have read of other "events" specifically directed against Jewish doctors in various places in Europe up to the outbreak of WWII, not to mention the now hardly ever mentioned Zbaszyn deportation of 1938, when the Nazis expelled Polish Jews to a border town, but the none-too-philosemitic Poles, anticipating it, refused to let them in, forcing 17,000 urban dwellers to bivouac under bare October skies. And if remembered at all, it is in the context of its indirect connection to what the Jews call the November Pogrom, but the world prefers the somewhat festive Nazi moniker Kristallnacht.)

As for Einstein's quotes and anecdotes, there are so many of them that some can only be derivative and apocryphal. I had a postcard of old Albert photographed in the 1920s on a beach wearing women's sandals and looking mischievously pleased with himself — definitely not Photoshopped!

Wael November 1, 2015 3:38 AM

See how relative, special is, to generally

Score keeper says you’re getting closer. 20 or 30 more of these cracks, and you’d be just about even! You can’t win with a score keeper who can’t count 😉

ianf November 1, 2015 3:39 AM

Incidentally (but within the topic), speaking of non-relativity, i.e. time-arrow absolutism of time, here’s an inexplicable example of … something: 3 timestamped posts in turn in this thread:

#c6709621 Wael • November 1, 2015 3:10 AM

#c6709622 ianf •
November 1, 2015 3:10 AM

and then, 4 (absolute) minutes after that:

#c6709618 name.withheld.for.obvious.reasons •
November 1, 2015 3:48 AM

Yes, it appears in the same perverted time order also on the Last 100 comments. This one posted BEFORE THAT, @ 3:39 AM

Make what you want of that, I’m going back to bed on non-negotiable terms!

name.withheld.for.obvious.reasons November 1, 2015 3:48 AM

@ Clive

Whilst there are thousands of physicists in the world, none have yet made an authorative statment…

Let me throw into the mix…

1.) Imagine space, as we know it–the universe, in total, without matter (mass-less)

2.) Simultaneously locate two equally massive objects at spatially opposite corners of said space, item 1.

3.) Calculate the traversal time in point space for both objects (movement of movement, acceleration, "potential" spatial displacement, etc.).

3.a) Additionally, quantify using a modeling method (Elucidated n-dimensionally, choose n based on your preferred spatial geometry/algebra/kernel/theory) and calculate/plot/render the complete spatial energy/matter interactions in Riemann space.

See how relative, special is, to generally. (Another physics joke-yuk, yuk, yuk.)

ianf November 1, 2015 4:01 AM

@ Wael… I wouldn’t dream of barging in with anything flippant, but couldn’t you simply borrow that “Three Hundred Years of Gravitation” (Cambridge University Press; Hawking, Israel, editors) from ANY academic library, and read it surreptitiously between the lines so as not to leave traces of your illicitly gained, unauthorized knowledge?

Clive Robinson November 1, 2015 6:42 AM

@ ianf,

Incidentally (but within the topic), speaking of non-relativity, i.e. time-arrow absolutism of time…

Err, you might want to think about when civil clocks go back in time; in the US this year Nov 1 is a "live" twenty-five-hour day.

Depending on where you are, it happens on different days of the year… It was Europe's turn last weekend.

The worst place on earth for the issue used to be Australia: they had no government-mandated civil time across the country; it was up to whatever the town council decided…

One silliness to arise from time legislation is that the UK has a "time lord" whose job is to request of parliament that it deviate from UK legislation on time to that of the EU for changes to clocks, including the likes of "leap seconds", which can go both ways.

Another silliness, but one that could be worth it if you want to save money, is partly to do with the lost days in the Gregorian calendar… One result is that in Russia Christmas day falls on our Twelfth Night, so the January sales have already started, hopefully giving you better purchasing power. Mind you, traditionally it's "cabbage soup", so it might give you an entirely different power 😉

Wael November 1, 2015 8:49 AM

@ianf,

but couldn’t you simply borrow…

One of these days, although it’s easier to watch a video on the subject from a known authority …

@Harry Johnston,

Got a chance to read the WordPress article. Well written. I didn't verify your calculations (the 16% and so on), but I take them to be correct. I have two comments… One: you didn't take into account deceleration when returning to earth. Two: one has to make a distinction between "observing" (which is perception) and "reality" when the traveler returns to earth. Hence I stand by my comment that I don't believe the traveler would have aged any differently once they are back in earth's frame of reference.

It's been a while since I looked at the subject, and I'm aware that several things in SR and QM are counter-intuitive because we (as in I) tend to think in classic Newtonian terms about time/speed/distance problems.

MarkH November 1, 2015 2:36 PM

@Clive:

Seeing that superluminal gravitation is a topic of absorbing interest for you, I’ve looked up a little reading you might enjoy.


THEORY

Here is a brief account in plain language:

http://math.ucr.edu/home/baez/physics/Relativity/GR/grav_speed.html

My take-away from it is that in non-relativistic (Newtonian) gravitation, gravitation must indeed be instantaneous, or orbits would be unstable, essentially because the force vector would be shifted in direction due to the time lag.

However, in both General Relativity and electrodynamics, there is a velocity-dependent term that almost exactly cancels the effect of propagation delay (the cancellation is perfect when gravity is comparatively weak, as generally obtains for orbits within our solar system).

The same obtains in electrodynamics in considering the orbit of a pair of oppositely charged bodies. Note well that the delay-cancellation coming from the velocity-dependent term is NOT a fudge factor added in order to make orbits stable: it falls out directly from the field equations.
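
A rough way to see the cancellation (my gloss, not from the linked page): for a source in uniform motion, the retarded field points not back along the retarded direction but toward the linearly extrapolated present position,

    \vec{r}_{\text{now}} \;\approx\; \vec{r}_{\text{ret}} + \vec{v}\,\frac{r_{\text{ret}}}{c}

so to first order in v/c the propagation lag exerts no net torque on the orbit. In General Relativity, as I understand the standard treatment, the cancellation holds even more exactly, through terms quadratic in v/c.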

The late Tom Van Flandern did not accept the consensus interpretation of General Relativity.


BINARY PULSAR MEASUREMENT

Obviously, measuring the speed of gravitational propagation is harder than measuring the speed of electromagnetic propagation. Gravitation is the weakest of the fundamental forces, and the forces arising from manipulable objects are too small to measure. On the other hand, we don't have the ability (thank God) to make planet-size masses suddenly appear and disappear, or to fling them about violently.

In our solar system, where (as mentioned above) gravitational fields are weak, the motions of astronomical bodies emit (practically) no radiation.

However, astronomy has found two binary pulsars (very-near pairs of neutron stars, at least one of which has a magnetic field that produces a radio signal detectable from Earth). The perfect metronomic regularity of the pulsar enables its mutual orbit with its companion to be inferred with precision.

In these binaries, the accelerations are violent indeed. For the Hulse–Taylor binary, orbital diameters and separation distances are on the order of one million miles, and they complete more than 1100 orbits in one of our years.
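
To put a number on that, using the figures just given:

    T \;\approx\; \frac{365.25 \times 24\ \text{h}}{1100} \;\approx\; 8\ \text{h}

so two stellar-mass objects complete a mutual orbit in under eight hours.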

Under such conditions, gravitational radiation is NOT negligible, and orbits are expected to measurably decay as energy radiates away. The rate of decay is very strongly dependent on the speed of gravitation.

Under the assumptions that (a) orbital decay in Hulse-Taylor is solely due to gravitational radiation, and (b) the field equations of General Relativity are correct, the measured decay corresponds to gravity propagating at the speed of light within 1%. Hulse-Taylor is decaying so rapidly, that in the 40 years since astronomers started measuring its orbital period, it has slowed by more than one part per thousand!

If General Relativity is wrong, then the propagation speed inference may be modified.

Measurements of a second binary pulsar are also accurately consistent with gravity propagating at light speed.


QUASAR TRANSIT MEASUREMENT

By luck, in 2002 Earth, Jupiter, and a bright quasar (powerful radio source) came into line. Timing measurements of the fading and emergence of the radio signal are claimed to confirm luminal propagation of gravity to within 20%.

Like the binary pulsar orbit decay, the inference of speed depends on the correctness of General Relativity. However, this is an even more indirect inference than the binary pulsar observations, and the interpretation has been a matter of some debate among physicists.

For an account of both this and the binary pulsar measurements, see:

https://en.wikipedia.org/wiki/Speed_of_gravity#Possible_experimental_measurements


SOLAR TIDAL FORCE MEASUREMENT

http://www.astrowatch.net/2012/12/chinese-scientists-find-evidence-for.html

In 2012, a group of geophysics researchers within the Chinese Academy of Sciences claimed to have made a direct measurement of gravitational propagation, independent of relativistic theories.

By precision measurement of solar tidal forces, they claim to have found that the orientation of tidal forces lags solar alignment by roughly the eight minutes expected if gravity propagates at the speed of light (in other words, tides are aligned to the visual position of the sun, which lags its true position because of light propagation time).
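
That eight-minute figure is just the light travel time across one astronomical unit:

    t \;=\; \frac{1.496 \times 10^{11}\ \text{m}}{2.998 \times 10^{8}\ \text{m/s}} \;\approx\; 499\ \text{s} \;\approx\; 8.3\ \text{min}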

Here’s their abstract, including a PDF download link to their paper:

http://link.springer.com/article/10.1007/s11434-012-5603-3

If their result holds up, it will be the most direct evidence of luminal propagation for gravity.


If gravitational propagation is indeed luminal, the good news is: when some naughty aliens cause our sun to instantly vanish, we will still have 8+ minutes of both sunshine and our customary orbit before flying off into permanent darkness amidst the far reaches of our galaxy.

MarkH November 1, 2015 3:27 PM

OK, just did a little reading on the Twin Paradox — the ageing-difference thing Wael finds baffling.

Well, it is more baffling than I expected. The Wikipedia article on the Twin Paradox relates that various physicists have approached the resolution of the paradox in various ways. Einstein's own is one of the most demanding, because it appeals to General Relativity.

This divergence of explanations indicates that there is no One Obvious Way to explain it. Also, the mathematical analysis is pretty hairy.

A physicist does his best to boil it down:

http://www.einstein-online.info/spotlights/Twins

Nate November 1, 2015 4:01 PM

@Wael: "Another thing I have problems with is the "slower-moving clocks" on objects that are moving. The idea that if someone travels in a rocket at close to the speed of light, then comes back to earth, he would have aged at a slower rate is ridiculous for one reason: if the frame of reference were changed, then the traveler in the rocket would also measure the clocks on earth to be running slower."

Correct! This is the original and true meaning of the Twin Paradox (i.e., the actually paradoxical part): that the ordinary 'time dilation' of Special Relativity is symmetrical, and implies that two inertial observers moving relative to each other at a sufficiently high fraction of c will each measure time as running slower for the other.

Which is, logically, a paradox (in the hard sense: a self-contradictory assertion, not just 'a surprising result not in accord with common sense') at the level of kinematics, and is what led Herbert Dingle to write "Science at the Crossroads": http://blog.hasslberger.com/Dingle_SCIENCE_at_the_Crossroads.pdf

Dingle’s assertion was utterly rejected by the physics community and he’s now considered a crank, despite being formerly considered an expert in relativity who wrote an influential early textbook.

However, despite over 40 years of refutations, it's hard for the physics layperson to see what logical mistake Dingle made; 'quantity x is both less than and greater than the related quantity y, at the same time and in the same place' seems to be a very difficult logical statement to make sense of using the kind of logic we normally apply.

The usual modern resolution of the twin paradox is to say that: ACTUAL time dilation is a function of ‘distance travelled in spacetime’ which is a function of an observer’s ABSOLUTE acceleration and their relative velocity with respect to another observer. NOT just a function of ‘relative velocity’ which would in fact require a logical contradiction.

Basically there are TWO contradictory 'time dilation' effects implied by special and general relativity; three if you include gravitational time dilation:

  1. 'symmetric time dilation', implied by the gamma factor in the Lorentz transformation, which is observer-dependent: two observers will disagree (see the formula just after this list)
  2. 'asymmetric time dilation', which is a complicated calculation based on 'which observer was least accelerated, and how fast and how long they then travelled before rejoining', and is not observer-dependent; all observers will agree. But the calculation must be done only in a non-accelerated frame and requires knowledge of the entire history of motion of all observers.
  3. 'gravitational time dilation', which is based on 'where in a gravitational field things occurred'. All observers will agree on this too.
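
For reference, the symmetric effect in item 1 is governed by the Lorentz factor:

    \gamma \;=\; \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, \qquad \Delta t' \;=\; \gamma\,\Delta t

Each inertial observer applies the same \gamma to the other's clock, which is exactly the symmetry that generates the apparent contradiction.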

Frustratingly, most textbooks only show a very simplified version of the asymmetrical twin scenario: one rocket leaves for the stars, one stays on Earth, with no gravity. Assuming the rocket that goes to the stars is accelerated and moves in a loop, a calculation can be made showing it will travel less distance in time and therefore age less.

But it’s trivial to set up a symmetrical twin scenario: two identical rockets leave for the stars in opposite directions, accelerate and decelerate at identical speeds and times, and return to meet each other. The modern interpretation of relativity will say that ‘neither rocket will be time dilated compared to the other, because they both took equal length paths through spacetime’ (where ‘spacetime’ is assumed to be the same for any observer moving at a ‘flat’, non-accelerated velocity).

However: that’s NOT what a naive extrapolation of Lorentz contraction of the original 1905 Special Relativity would imply! The naive calculation would argue that BOTH rockets would be ACTUALLY time dilated COMPARED TO EACH OTHER. If we believed that, that WOULD be a logical contradiction.

So modern relativity (post-GR) only really works by ignoring its fundamental basic equation, and also, ironically, by ignoring the relativity principle. You can't calculate asymmetric aging by using ONLY relative velocities and relative accelerations! You also have to incorporate ABSOLUTE accelerations (though not absolute velocities) and look at the WHOLE history of motion of all observers, in order to determine which reference frame is the 'correct' one to make the calculation in, before you can correctly calculate the time dilation effect.

So for time dilation calculations in, e.g., the spinning Earth for GPS, we take the 'Earth Centred Inertial' frame as the locally closest approximation to a flat spacetime, because it has the least acceleration compared to a reference frame attached to the ground or to a satellite. Then we convert all coordinates to ECI and do the relativistic calculations in that. If we do it any other way, it comes out wrong. Of course that's ignoring gravitational time dilation due to altitude completely, but that gets factored in later.

Some people feel that for a physical theory that’s supposed to be based only on relative velocities and local, infinitesimal changes, it’s a little odd that if you actually look only at relative, local changes you get not just a wrong answer for time dilation, but a logical contradiction. It’s a little like quantum field theory: with enough banging around and feeding in external data we can coerce sensible answers out of it, but there are lots of infinities and contradictions that pop up if we try to naively extrapolate complete solutions from only first principles and local information.

This is the essence of Dingle's Question, which to my mind still has no good answer: if you reduce relativity to its simplest case (two unaccelerated observers in the same place and time, moving relative to each other at different velocities), you get what seems to be a physically nonsensical answer: each observer's clock is 'slowed down relative to the other'. Dingle initially thought that this meant the slowing couldn't be a 'real', objective effect. Later he changed his mind, because the physics community seemed to agree against him that symmetrical time dilation was real, not apparent. This led him to argue that, in that case, the foundations of Special Relativity MUST be logically contradictory.

However, the current consensus seems to be that symmetric time dilation (in the case of the two identically accelerated rockets) IS in fact only apparent! But that agreement came too late for Dingle’s reputation.

The history of the sociology of relativity, like the theory itself, is full of many odd paradoxes like this.

Harry Johnston November 1, 2015 4:58 PM

@Wael, I didn’t mention acceleration on departure from Earth or deceleration on arrival back at Earth because neither has any significant effect. As for observation vs. reality, I did attempt to distinguish between the doppler effect (observation only) and time dilation (reality) but perhaps it wasn’t clear enough. Ah, well.

Keep in mind that we have lots of direct experimental evidence that both SR and GR are correct. We see time dilation in the very real sense of “this object is not getting old as quickly as it would if it were at rest” all the time in particle accelerators. While AFAIK no experiment has more directly tested the twin paradox, there really is no conceivable way to construct a universe in which the twin paradox is not real without breaking everything else we know about relativity. 🙂

(Which isn’t to say the experiment shouldn’t be performed, just on general principles. I found one proposal here.)

Dirk Praet November 1, 2015 6:04 PM

@MarkH

Dirk’s own homeland used the protection of its “racial/cultural brethren” from fictional danger and oppression, as a pretext to wrench territory from its peaceful neighbors (including staged “votes” on annexation)

I cannot recall any instance of Belgium wrenching territory from its peaceful neighbours.

I had forgotten the appalling ignorance and brutality of Dirk’s ideas concerning Russia’s aggression against Ukraine

It may come as a surprise to you, but there really are people (and media) out here that are no longer buying the USG’s party line on what’s happening in the world. Ukraine is just another example of US “regime change” policy that blew up in everybody’s face and completely destabilised yet another country. You don’t even have to read Kremlin propaganda to see that.

MarkH November 1, 2015 6:43 PM

With respect, I worry that some of you are getting lost in the brambles attempting to work through the Twin Paradox.

Please, do read the linked page I posted above — it only takes a couple of minutes. The author makes a Very Important Point:

For the dictum “moving clocks go slower” to hold, you must be an inertial observer.

(My emphasis added)

What I tend to forget — and some of you may have forgotten — is that c is not strictly invariant. Its invariance is guaranteed only as measured in an inertial frame.

The Lorentz transformation for time is a simple geometric derivation that should be manageable for seniors at a decent secondary school. Its basis is the invariance of c, and therefore is valid only in comparing two inertial frames of reference.

In practice the stay-at-home twin is not in a strictly inertial frame, though you can replace that with an arbitrarily perfect inertial frame without affecting the basic result.

But the rocket twin very decidedly departs from an inertial reference frame. This fundamentally resolves the paradox: the non-inertial status of the spacefaring twin invalidates any expectation that clocks on Earth appear to run slower.

Lest we forget, this has already been well validated by experiment. When atomic clocks run in low Earth orbit, are returned to Earth, and compared with stationary clocks … they have indeed counted fewer cycles of their oscillation, and by the predicted amount.


Note: to confuse things a little more, atomic clocks in higher orbits (like GPS satellites) actually run faster than Earth clocks, because of a General Relativity effect as one climbs out of Earth's gravity well. At an altitude of roughly 3000 km, the two effects cancel out … higher than that, and clocks run faster.
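
A quick sanity check of those two figures; a minimal sketch, assuming circular orbits and the usual weak-field approximations (the constants and orbital radius are rounded textbook values):

    # Two competing clock effects for a circular Earth orbit:
    #   kinematic (special-relativistic) slowdown   ~ v^2 / (2 c^2)
    #   gravitational (general-relativistic) speedup ~ GM (1/R_E - 1/r) / c^2
    GM = 3.986e14      # Earth's gravitational parameter, m^3/s^2
    R_E = 6.371e6      # Earth radius, m
    C2 = (2.998e8)**2  # speed of light squared

    def net_rate(r):
        """Net fractional clock rate of an orbiting clock vs. the ground."""
        slow = (GM / r) / (2 * C2)        # kinematic dilation (v^2 = GM/r)
        fast = GM * (1/R_E - 1/r) / C2    # gravitational blueshift
        return fast - slow                # positive => orbit clock runs fast

    r_gps = 2.6571e7   # GPS orbital radius, m (~20,200 km altitude)
    print(net_rate(r_gps) * 86400 * 1e6, "microseconds gained per day")

    # The effects cancel when GM/(2r) = GM*(1/R_E - 1/r), i.e. r = 1.5 R_E:
    print((1.5 * R_E - R_E) / 1e3, "km cancellation altitude")

Both numbers land where the comments above say they should: GPS clocks gain about 38 microseconds per day, and the crossover altitude comes out near 3,200 km, i.e. "roughly 3000 km".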

MarkH November 1, 2015 7:33 PM

@Dirk,

I mistook you for a German guy. I made an inexcusable misidentification. I unreservedly apologize, and regret any offense.

Also, I hope you will understand and accept that my criticism of views expressed by you or any other person, however strongly worded, is not criticism of the PERSON. It is criticism of the statements and ideas.

The Ukraine crisis has been a startling educational experience for me. I have heard some very good and decent people — most of them far more virtuous than myself — utter grotesque fictions about the situation, and cling tightly to their falsehoods in the face of contrary evidence. Because the decency and sincerity of these people is beyond doubt, I understand that they have “fallen into holes” of misinformation and misunderstanding.

In a way, the problem of how so many people became hypnotized is a deeper and more important conundrum than the particular crisis. I would like to see some serious study of how it came about, and what lessons the world can learn from it.


For the record, my information about Ukraine does not come from ANY government — neither the government of the US, nor of any other state.

My information comes from the territory of Ukraine; from Ukrainian citizens; and from individuals (both Ukrainian and Russian) who personally participated in Euromaidan (some also participated in the Orange Revolution).

I don't "eat" anyone's propaganda: for example, I know that Ukrainian patriots have their own characteristic myths and blind spots. My discussions with them are opportunities to learn about their personal values and aspirations — not necessarily objective impersonal facts like Ukrainian history. My historical background information comes from academic historians whose impartiality (on their professional topics) is soundly established.

Also for the record, I am passionately committed to the principle that the territorial integrity, independence, and sovereignty of every UN member state must be held inviolable, excepting only certain extreme situations of human rights violations or aggression against neighbors.

To be clear, I consider (as examples) the Chilean coup against Allende, supported by the US, to be a terrible crime against international law and order; and the 2003 invasion of Iraq to be the worst breach of international law in at least a generation, and in its consequences one of the most destructive since WW II. I excuse the crimes of no one.

As I wrote above, I won’t debate the international situation on this thread. If anyone is interested in a discussion on the topic that is anchored in objective facts and critical analysis, perhaps we can find a suitable place for that.

Dirk Praet November 1, 2015 8:24 PM

@ MarkH

I mistook you for German guy.

I figured that much. But no offense taken. I’ve been called worse 😎 .

As to Ukraine, opinions tend to differ very much depending on whom you're talking to and what part of the country they're in (or originate from). Suffice it to say that the Ukrainian bodybuilder shopkeeper of Russian descent from whom I buy my food supplements sees things entirely differently than my middle-class neighbour from Kiev who works in the film industry. The only thing they actually agree on is that both the US and Russia should have kept their noses out of their affairs.

Nick P November 1, 2015 9:04 PM

@ MarkH, Dirk

Damn near the classiest exchange on a topic like this that I’ve seen in a while.

abc123 November 7, 2015 9:09 AM

The paper you reference says:

“This is a very small fraction of the NSA’s budget. If the NSA were close to developing a practical quantum computer — or if they believed that another nation was — then they would be devoting far more money to this project.”

That is absurd logic: when roughly a trillion dollars goes missing from the Pentagon's accounts every year for a couple of decades, you cannot assume that this type of project does not exist based on your knowledge of published budgets, which are nothing but pretty artwork! Most federal agencies cannot even pass an internal audit. This government is far beyond anything that you would consider accountable to the public. You can't even assume that the NSA would be the agency managing such a project!

re: > “The NSA is worried enough about advances in the technology to start transitioning away from algorithms that are vulnerable to a quantum computer. My guess has been that we’ll see a practical quantum computer within 30 to 40 years, but not much sooner than that.”

About 15 years ago, a high ranking officer publicly stated that the military had 1 GHz computers around 1970. Since that time, others have come forward to corroborate this. For what it’s worth, a discussion is here:

http://www.abovetopsecret.com/forum/thread135313/pg1

If past performance is any indication, they might already have a quantum computer of the sort described in the paper. And if they do not, they clearly believe that someone else could. I am not convinced that they have one, but would not be so quick to dismiss the possibility. I also think the claims of Ben Rich are relevant here with regard to how far ahead the secret government actually is in all forms of scientific research.

zyx987 November 7, 2015 10:29 AM

@abc123

About 15 years ago, a high ranking officer publicly stated that the military had 1 GHz computers around 1970. Since that time, others have come forward to corroborate this. For what it’s worth, a discussion is here:

That discussion consists mostly of claims with no evidence. And that link to http://www.wealth4freedom.com/sgt-3.htm in one of the posts actually redirects to http://edovar7.zeekler.com/watch_zeekler_video.asp?/sgt-3.htm which is not a good sign. Looks like a potential malware trap.

Anonymous hero November 9, 2015 11:47 PM

I find it amazing that anyone, in light of the Snowden revelations, trusts anything that the NSA has to say about encryption best practices. It’s pretty clear that the signals intelligence arm has moved to the forefront of the agency’s mission and it appears that all of the tin foil hat wearing skeptics of the NSA were right all along.

It’s hard to tell exactly what the NSA’s intentions are here but I think it’s fair to say that anything they say regarding encryption should be looked upon with harsh skepticism. I think it’s extremely unlikely that they are really concerned about quantum computing being a threat to Suite B. No, I think we have to read between the lines. I personally feel that the answer lies in section 5.2 or 5.3 of the Koblitz paper.

On the other hand, it's possible that they are telling the truth and that there is some sort of weakness in the curves. As the Koblitz paper says, the NSA was extremely optimistic about ECC's strength all the way back in the 1990s. This means they were ahead of the academic community in evaluating its security. I think there's a possibility that the NSA is aware of some sort of classical break, whether that be a weak class of curves or some other weakness, and instead of revealing what they know and how they know it, they just make up a story about quantum computing. It could be that the NSA feels guilty about its past transgressions and is trying to make amends by actually making our security stronger. Either way, I think that if there is a weakness in ECC, it is probably not something that anyone outside the intelligence community would have to worry about being practical.

So regardless of whether they have any nefarious intentions or not, I think we should continue using ECC. Because you have to remember that if the NSA is your adversary then the game is already lost, and I suspect that this break (if there is one) is a NOBUS type of situation that only the NSA can exploit.

And remember the NSA has a suite of algorithms that are classified top secret. Those algorithms exist for a reason. This is why I don’t trust any type of publicly available crypto.

Wael November 13, 2015 11:48 PM

@MarkH, @Harry Johnston,

A little late… Thanks for sharing the research. I wasn't aware there was such a thing as a "twin paradox". I have this marked on my "todo" list, which is quite large now!

cs December 23, 2015 2:58 PM

http://arxiv.org/ftp/arxiv/papers/0912/0912.4694.pdf

a = b * c: given a and c, find b. First-grade arithmetic. All the group stuff was obscuring the reality that anyone can repeat the process of walking the points from c to a and count the steps taken as b, the secret key. It took less than a few hours to realize the backdoor in it.

Also, technically it is a = c^b, since the operations as described amount to ((((c * c) * c) * c) … * c).
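
For what it's worth, here is that "walk and count" idea made concrete; a toy sketch in multiplicative notation, with a deliberately tiny modulus (all values here are made up for illustration), which also shows why the walk stops being "first-grade arithmetic" at real key sizes:

    # Naive discrete-log "walk": repeatedly multiply by the base c and
    # count steps until we hit a. Instant for toy groups; for an n-bit
    # exponent the walk needs ~2^(n-1) steps on average.
    def walk_dlog(a, c, p):
        """Find b with a == c**b (mod p) by brute-force walking."""
        x, b = 1, 0
        while x != a:
            x = (x * c) % p
            b += 1
        return b

    p, c = 101, 2              # toy group: integers mod a small prime
    a = pow(c, 57, p)          # "public key" for secret exponent 57
    print(walk_dlog(a, c, p))  # -> 57, found in a few dozen steps

    # For a ~180-bit exponent (as in the reply below), the same walk
    # would take ~2^179 multiplications: far beyond any computer.

Even the best generic attacks (baby-step/giant-step, Pollard's rho) only square-root that work factor, which is precisely why group sizes are chosen as large as they are.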

anonymous January 8, 2016 7:55 PM

Q = kG

@cs: good luck brute-forcing p (privkey), as Q (pubkey) and k (counter) and G (starting point) are 180+ bit primes as well.

Anon October 13, 2016 5:07 PM

Lisa: Nice idea, you need FTL data for this, something which has already been flagged as a big implication in the future. Just imagine what it would do to the stock exchange!

buy buy buy! sell sell sell!

Curmudgeon March 7, 2017 11:10 PM

Quantum crypto is malarkey. It's a bunch of fantabulous hookah-smoker dialogue.

There's no such thing as a quantum computer. Quantum states are pie-in-the-sky theory that has never come close to being proven by the scientific method.

People can’t tell the difference between science and naturalistic religious ideas being hammered into scientific jargon.

I almost died laughing when I read a serious article proposing an encryption algorithm with equations for qubits. But qubits don't exist. I hope the guy applies for a patent, gets a patent for something that doesn't exist, and then sues someone for using a thing that doesn't exist.

“I will choose your delusions …”

Erdem Memisyazici November 17, 2019 3:07 AM

I totally skipped elliptic curves. Now they are being phased out. Cool. Those looping points sounded scary anyway.
