
As far as I know, there's no concrete evidence that the NSA has compromised the security of the NIST curves. That would be weird for them to do, since they use those curves internally to encrypt data classified at Secret and higher.

Are you thinking of Dual EC?




To the best of my current knowledge, it's at most possible that the NSA backdoored the NIST curves. I'm unaware of anyone in academia positively proving the existence thereof.

If your threat model doesn't include the NSA or other intelligence agency level state actors, ECDSA with NIST P-521 will serve you just fine.

(ECDSA is per se a questionable abuse of elliptic curves born from patent issues now long past, but it's not a real, exploitable security problem, either, if implemented correctly.)


The NIST ECDSA curves are the only ones used by CAs, are manipulable, and have no explanation for their origin. Pretty much the entire HTTPS web is likely an open book to the NSA.

The NSA even pushed NIST into supporting a backdoored curve and bribed RSA to make it default in their software. Look up Dual_EC_DRBG.

No, but you can with the curves that the NSA proposed to NIST.

I still doubt that there's a backdoor in the NIST curves because they're still widely used and recommended for top secret information, among other reasons.

If there were a backdoor and it leaked (or the math behind it was independently rediscovered!) the result could be catastrophic. Snowden showed that the NSA is absolutely vulnerable to leaks.


>(The NIST P-curves aren't backdoored.)

Unfortunately, we have no way to validate that the NSA did not grind the "seed" used to generate their parameters, searching for curves that are strong or weak with respect to some publicly unknown property, one found in only, say, one in a billion candidate curves.

This is a weakness in the methodology used to pick the parameters, somewhat lovingly mocked by the BADA55 curves: https://bada55.cr.yp.to/vr.html
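
To make the grinding concern concrete, here's a minimal sketch of the shape of that attack. This is not the actual ANSI X9.62 seed-to-coefficient derivation (which expands the hash to a full field element and applies extra checks), and candidate_b, looks_weak and grind_seed are made-up names for the sketch; looks_weak in particular is a meaningless placeholder standing in for a hypothetical secret property.

    import hashlib, secrets

    def candidate_b(seed: bytes) -> int:
        """Derive a candidate curve coefficient from a seed (simplified: one SHA-1)."""
        return int.from_bytes(hashlib.sha1(seed).digest(), "big")

    def looks_weak(b: int) -> bool:
        """Hypothetical secret predicate known only to whoever generates the curve.

        Stand-in for 'this curve has a weakness only we can detect'. Here it
        matches roughly 1 in 2^16 candidates so the grinding loop below
        finishes instantly.
        """
        return b % (2**16) == 0

    def grind_seed():
        """Try random seeds until the derived coefficient satisfies the predicate."""
        tries = 0
        while True:
            seed = secrets.token_bytes(20)
            tries += 1
            b = candidate_b(seed)
            if looks_weak(b):
                # the winning seed is then published as "verifiably random"
                return seed, b, tries

    seed, b, tries = grind_seed()
    print(f"found a 'weak' curve after {tries} seeds: seed={seed.hex()}")

Nothing about the published seed reveals how many candidates were discarded before it, which is exactly the point BADA55 drives home.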

It's not unreasonable for people to be concerned about this. Injecting intentional weaknesses into cryptosystems used by others was a fundamental objective for the NSA, which drove programs as significant as the CIA literally purchasing what was at the time the world's largest manufacturer of cipher machines in order to ensure that it continued to ship NSA-designed, intentionally weakened systems for decades ( https://www.washingtonpost.com/graphics/2020/world/national-... ). That was, of course, somewhat before the establishment of the relevant ECC standards -- but the cloud of operational secrecy prevents us from knowing much about what the NSA has been up to more recently.

The curve used in Bitcoin, though, isn't a NIST curve, and its generation procedure is about as close as you can get to rigid parameters without having rigidity as an explicit design goal.

Take a=0 (required for the endomorphism); require the field to be 3 mod 4 for fast square roots; increment the field from 2^256-2^32-1024 (chosen for fast limb structure) until you find a prime field with a non-trivial cube root of unity (also required for the endomorphism) on which you can obtain a curve of prime order; then set B to the lowest value that gives such an order. The result of that procedure is secp256k1. (You can actually drop some of the requirements above and still get the same parameters, but I am pretty confident this was their search criteria.) So: no high-entropy "random" inputs.
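
For what it's worth, the structural claims in that search procedure are easy to check against the published secp256k1 constants. A quick sketch (using sympy for primality testing; it only verifies the stated properties of the final parameters and does not redo the prime-order search, which would need point counting in something like Sage or Pari):

    from sympy import isprime

    p = 2**256 - 2**32 - 977   # the secp256k1 field prime
    n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # group order
    a, b = 0, 7                # curve equation: y^2 = x^3 + 7

    assert isprime(p)                   # prime field
    assert p % 4 == 3                   # fast square roots: sqrt(x) = x^((p+1)/4)
    assert p % 3 == 1                   # non-trivial cube root of unity exists (endomorphism)
    assert p >= 2**256 - 2**32 - 1024   # inside the claimed search window...
    print("offset from search start:", p - (2**256 - 2**32 - 1024))  # ...47 steps in
    assert isprime(n)                   # prime group order
    print("all stated properties hold")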


What about the theory that the NIST encryption curves may be backdoored?

If that were the case, and I were the NSA, I would strongly push for free cryptography, to make sure that only the US could decrypt the communications and have a strategic advantage.


The same way Dual_EC_DRBG became a NIST standard, the NSA pulls the strings.

You can't expect a government department to provide robust security to the masses when the rest of the government is trying to prevent that exact situation.

At this point, anything cryptography-related coming from NIST should be considered compromised.


> but if you want to tinfoil hat it, just do what every modern system does and use Curve25519.

What is your take on the NIST curves being "officially" blessed for government data via Suite B (or whatever they're calling it)?

If it's good enough for government work, would it be good enough for us in the private sector? What are the chances that the NSA knows weaknesses in Curve25519 or ChaCha like they knew about differential cryptanalysis attacks on DES ahead of everyone else?


<tinfoilhat>

What if the NIST curves were backdoored not by the NSA as a whole, but a rogue individual within the NSA, with the goal of making that backdoor available to a foreign power and/or the highest bidder?

It seems unlikely to me, because the NSA has so many brilliant people working for them, but it also seems like (at least superficially) it would explain just about every element of their reaction to this situation. They're used to being years or decades ahead of everyone else (e.g. differential cryptanalysis), and being caught off-guard would be an uncomfortable position for them.

</tinfoilhat>


NIST P-256 is widely suspected to have been subverted by NSA, as outlined in the first link in octoberfranklin's response, page 16 of [1], and [2].

[1] https://www.hyperelliptic.org/tanja/vortraege/20130531.pdf

[2] https://blog.cr.yp.to/20140323-ecdsa.html


This kind of logic is attractive on message boards but makes little sense in the real world.

What NSA needs are NOBUS ("nobody but us") backdoors. Dual_EC is a NOBUS backdoor because it relies on public key encryption, using a key that presumably only NSA possesses. Any of NSA's adversaries, in Russia or Israel or China or France, would have to fundamentally break ECDLP crypto to exploit the Dual_EC backdoor themselves.
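
For anyone unfamiliar with how that works, here is a minimal toy sketch of the Dual_EC trapdoor on a textbook-sized curve. This is not the real Dual_EC_DRBG, which runs on P-256 and truncates 16 bits of each output block; the curve, points, state and trapdoor value d below are purely illustrative.

    p = 17                      # tiny prime field, illustration only
    a, b = 2, 2                 # toy curve: y^2 = x^3 + 2x + 2 over F_17, group order 19
    G = (5, 1)                  # generator

    def add(P1, P2):
        """Affine point addition (None = point at infinity)."""
        if P1 is None: return P2
        if P2 is None: return P1
        (x1, y1), (x2, y2) = P1, P2
        if x1 == x2 and (y1 + y2) % p == 0:
            return None
        if P1 == P2:
            lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
        x3 = (lam * lam - x1 - x2) % p
        return (x3, (lam * (x1 - x3) - y1) % p)

    def mul(k, P1):
        """Double-and-add scalar multiplication."""
        R = None
        while k:
            if k & 1:
                R = add(R, P1)
            P1 = add(P1, P1)
            k >>= 1
        return R

    def lift_x(x):
        """Find some curve point with the given x-coordinate (brute force, toy field)."""
        rhs = (x**3 + a*x + b) % p
        for y in range(p):
            if y*y % p == rhs:
                return (x, y)
        raise ValueError("no point with that x")

    # The standard publishes two points P and Q. The suspicion: whoever chose
    # them also knows d with P = d*Q, i.e. the discrete log of P to the base Q.
    d = 7                        # the secret trapdoor (hypothetical)
    Q = G
    P = mul(d, Q)

    # One round of a Dual_EC-style generator:
    s0 = 11                      # secret internal state
    s1 = mul(s0, P)[0]           # state update: s1 = x(s0 * P)
    out = mul(s1, Q)[0]          # emitted block: x(s1 * Q)  (the real DRBG truncates this)
    s2 = mul(s1, P)[0]           # next state, normally secret

    # The trapdoor holder sees only `out`, lifts it back to a point A = +/-(s1*Q),
    # then d*A = s1*(d*Q) = s1*P, whose x-coordinate is exactly the next state.
    A = lift_x(out)
    print(mul(d, A)[0] == s2)    # True: one output block reveals the next state

Recovering the state this way without knowing d means solving the discrete log of P to the base Q on the real curve, which is what makes it a NOBUS backdoor.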

Weak curves are not NOBUS backdoors. The "secret" is a scientific discovery, and every industrialized country has the resources needed to fund new cryptographic discoveries (and, of course, the more widely used a piece of weak cryptography is, the more likely it is that people will discover its weaknesses). This is why Menezes and Koblitz ruled out secret weaknesses in the NIST P-curves, despite the fact that their generation relies on a random number that we have to trust NSA about being truly random: if there was a vulnerability in specific curves NSA could roll the dice to generate, it would be prevalent enough to have been discovered by now.

Clearly, no implementation flaw in Windows could qualify as a NOBUS backdoor; many thousands of people can read the underlying code in Ghidra or IDA and find the bug, once they're motivated to look for it.


This is pure paranoia. I'll be the first to criticize NIST and the NSA for the damage they've done to standards and to their own credibility.

But don't over-correct. You can't just call anyone who submits to NIST an NSA puppet. There is zero parallel with the Dual EC backdoor.


> The primitive selection looks rather informed with the hindsight of the NSA compromising the security of the NIST curves

Only if it doesn't make their own work any harder. The NSA prefers weak security for everyone if securing US assets would risk making others more secure too:

>"NIST failed to exercise independent judgment but instead deferred extensively to NSA. After DUAL_EC was proposed, two major red flags emerged. Either one should have caused NIST to remove DUAL_EC from the standard, but in both cases NIST deferred to NSA requests to keep DUAL_EC."

https://www.nist.gov/system/files/documents/2017/05/09/VCAT-... [PDF warning]


For these ciphers, it seems less likely that the NSA has a backdoor that no one else could find. Notably, in the case of Dual_EC there was a recommended curve and set of constants chosen by the NSA, which was easy to backdoor for anyone who knew how those constants were generated.

>"you can with some effort devise a scenario in which it would have been possible for NSA to influence the curve selection"

Let's try to be clear: The NSA (naming names, among them Jerry Solinas) worked directly with the Certicom engineers to choose the properties of the SECG curves (some of which became the NIST/X9.82/Suite B curves, there's a lot of parallel working/overlap). They helped to perform the computational search to find the seeds, which were manipulated so that the results had certain properties (some of which were concealed and then were misrepresented as "provably random"). That is not a scenario: that is actually what happened.

Given that, and how incredibly opaque the process was, why am I not screaming in horror running away from secp256r1 et al? Because the information I've found (except for the SIGINT Enabling Project's budget) all points to that manipulation and deception taking place to obscure properties of the curves which allowed patents for efficient implementation of the curves - patents the NSA had in return agreed to licence.

Also, they use it themselves, in their most critical systems (they are particularly careful about randomness and side-channels). Is that just a cover? (Are they really willing to sacrifice the IA mission for the SIGINT?) I have no evidence that it is. It probably isn't. It could be. (They also use Dual-EC_DRBG themselves, likely on the basis that since only they have the private key, "nobody but us" can break it - shame it's got a bias anyway!)

On balance, I think the NIST curves are probably OK to use for now. If secp256r1 was somehow "backdoored", I have absolutely no idea how, and on balance, I do not think that it is. I think it's more likely that any negative manipulation was to deliberately standardise something that's fragile: not weak in itself, but easy for a naïve implementation to make weak - hard to do in constant time, easy to convince some people to use insecure blinding techniques, etc. (And yes, I agree RSA w/PKCSv1.5 is definitely not better - one may wonder if it's another example.)

So I'm not running away from P256. But I'd like to walk away from it and switch to something more rigid, safer and a bit faster soon - and hopefully we get that out of the CFRG process (to the grandparent comment: Curve25519 is one of the top contenders).

The slides are a bit out of context without the talk, by the way!


This comment is gross.

I've repeatedly acknowledged the existence of USG surveillance programs, including Dual-EC. I believe the "hours" I've spent discussing Dual-EC and PKRNGs in general are also straightforwardly countable; just use the search box:

https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

What you're doing here is deceptively reframing a subtle nerd technical argument as a normative argument about USG surveillance. In particular: at no point have you or anyone else ever witnessed me arguing that Dual-EC was okay; rather, my argument was that Dual-EC was so batshit, and such poor tradecraft, that it was unlikely to be a serious NSA program. My argument at the time mirrored that of Bruce Schneier: "never use this, no competent engineer would". My twin mistakes: I overestimated both NSA and the industry, in particular vendors like Juniper, which did in fact implement this standard.

What's even dumber about this comment is that you're making it in support of an argument virtually nobody in the field supports, which is that the NIST P-curves are somehow "backdoored". As you know, or at least should know before talking about this stuff, there is virtually no relationship between the P-curves and Dual-EC. But you're happy to exploit the general lack of understanding about the distinction to score points on HN, even if it means dumbing the whole thread down.

To head off some other comment you'll write 6 months from now without me noticing: you can easily peruse the site history and see me consistently arguing against use of the P-curves (in fact, I cite the P-curves as a damning failure of DNSSEC, which you'll no doubt find some fig leaf some years from now to claim I also support). But not because the curves are "backdoored"; rather, because modern curves are misuse-resistant.

https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

Since I'm assuming you have at least the ability to Google, I'm expecting you to cite Bernstein's analyses of the flexibility NSA's generation process provided for hunting for vulnerable curves. To see that outré claim rebutted, here's another Google search for you: [koblitz menezes enigma curve]. Happy to help.

Meanwhile, your message betrays no evidence of understanding what misuse-resistance in curves means, though you've been happy to preen about cofactors and Schnorr signatures on other threads; you depict the choice between 25519 and P-384 as one between a "suspicious generation mechanism" and an "immature" alternative. Of course, that's not why serious designs use 25519; rather, it's because 25519 is less susceptible to curve point validation vulnerabilities and is designed to be straightforwardly implemented without side channels.

Your comment is one of the most gratuitously uncivil and mean-spirited things I've had directed at me on this site, and I have been the target of a lot of mean-spirited bullshit here.


I doubt NIST will ever be trusted again, as any standards or specs they are in favor of will be immediately suspected of having some vulnerability favorable to the NSA.

Let's say they hold a contest for people to submit next-generation cryptosystems, and that algorithms A, B, and C make it to the final round. If NIST publishes critical remarks on A and C and seems to favor B, immediate skepticism and red flags will be raised. Does B have a hidden weakness the NSA knows about?

A standards organization can only run on its transparency and integrity.

