
This comment is gross.

I've repeatedly acknowledged the existence of USG surveillance programs, including Dual-EC. I believe the "hours" I've spent discussing Dual-EC and PKRNGs in general are also straightforwardly countable; just use the search box:

https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

What you're doing here is deceptively reframing a subtle nerd technical argument as a normative argument about USG surveillance. In particular: at no point have you or anyone else ever witnessed me arguing that Dual-EC was okay; rather, my argument was that Dual-EC was so batshit, and such poor tradecraft, that it was unlikely to be a serious NSA program. My argument at the time mirrored that of Bruce Schneier: "never use this, no competent engineer would". My twin mistakes: I overestimated both NSA and the industry, in particular vendors like Juniper, which did in fact implement this standard.

What's even dumber about this comment is that you're making it in support of an argument virtually nobody in the field supports, which is that the NIST P-curves are somehow "backdoored". As you know, or at least should know before talking about this stuff, there is virtually no relationship between the P-curves and Dual-EC. But you're happy to exploit the general lack of understanding about the distinction to score points on HN, even if it means dumbing the whole thread down.

To head off some other comment you'll write 6 months from now without me noticing: you can easily peruse the site history and see me consistently arguing against use of the P-curves (in fact, I cite the P-curves as a damning failure of DNSSEC, which you'll no doubt find some fig leaf some years from now to claim I also support). But not because the curves are "backdoored"; rather, because modern curves are misuse-resistant.

https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

Since I'm assuming you have at least the ability to Google, I'm expecting you to cite Bernstein's analyses of the flexibility NSA's generation process provided for hunting for vulnerable curves. To see that outré claim rebutted, here's another Google search for you: [koblitz menezes enigma curve]. Happy to help.

Meanwhile, your message betrays no evidence of understanding what misuse-resistance in curves means, though you've been happy to preen about cofactors and Schnorr signatures on other threads; you depict the choice between 25519 and P-384 as one between a "suspicious generation mechanism" and an "immature" alternative. Of course, that's not why serious designs use 25519; rather, it's because 25519 is less susceptible to curve point validation vulnerabilities and is designed to be straightforwardly implemented without side channels.
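
Since that apparently needs spelling out, here's a minimal sketch of what misuse-resistance buys you in practice, using Python's `cryptography` package (assumed installed; the names and the arbitrary byte string are mine, purely illustrative):

    # X25519 key agreement: every 32-byte string is a usable public key,
    # so there is no on-curve validation step for an implementer to forget
    # (contrast ECDH over Weierstrass curves like P-256, where skipping the
    # point check enables invalid-curve attacks).
    from cryptography.hazmat.primitives.asymmetric.x25519 import (
        X25519PrivateKey, X25519PublicKey,
    )

    alice = X25519PrivateKey.generate()
    bob = X25519PrivateKey.generate()
    assert alice.exchange(bob.public_key()) == bob.exchange(alice.public_key())

    # An arbitrary byte string still parses and exchanges without error;
    # the function is total, not dependent on caller-side validation.
    weird = X25519PublicKey.from_public_bytes(b"\x01" * 32)
    _ = alice.exchange(weird)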

Your comment is one of the most gratuitously uncivil and mean-spirited things I've had directed at me on this site, and I have been the target of a lot of mean-spirited bullshit here.




As far as I know, there's no concrete evidence that the NSA has compromised the security of the NIST curves. That would be weird for them to do, since they use those curves internally to encrypt data classified at Secret and higher.

Are you thinking of Dual EC?


>(The NIST P-curves aren't backdoored.)

Unfortunately, we have no way to validate that the NSA did not grind the "seed" used to generate their parameters, searching for curves that were strong or weak against some publicly unknown property found in candidate curves at, say, a one-in-a-billion rate.

This is a weakness in the methodology used to pick the parameters, somewhat lovingly mocked by the BADA55 curves: https://bada55.cr.yp.to/vr.html
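
To make the grinding concern concrete, here's a toy simulation (Python; derive_param() stands in for the X9.62-style hash-the-seed step, and weak() is a placeholder for whatever secret criterion a designer might test, scaled down to one in 2^20 so it finishes in seconds):

    import hashlib, itertools

    def derive_param(seed: bytes) -> int:
        # "Verifiably random": the parameter is just a hash of the seed.
        return int.from_bytes(hashlib.sha256(seed).digest(), "big")

    def weak(param: int) -> bool:
        # Hypothetical secret property; the real concern posits rarer ones,
        # but even one-in-2^40 is trivial for an agency with real compute.
        return param % (2**20) == 0

    for i in itertools.count():
        if weak(derive_param(i.to_bytes(8, "big"))):
            print(f"found 'provably random' weak parameters after {i} seeds")
            break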

It's not unreasonable for people to be concerned about this. Injecting intentional weaknesses into cryptosystems used by others was a fundamental objective for the NSA, one that drove programs as significant as the CIA literally purchasing what was then the world's largest manufacturer of cipher machines to ensure that it continued to ship NSA-designed, intentionally weakened systems for decades ( https://www.washingtonpost.com/graphics/2020/world/national-... ). That was, of course, somewhat before the establishment of the relevant ECC standards-- but the cloud of operational secrecy prevents us from knowing much about what NSA has been up to more recently.

The curve used in Bitcoin though isn't a NIST curve, and its generation procedure is about as close as you can get to rigid parameters without having rigid parameters as an explicit design goal.

a = 0 (required for the endomorphism); the field is 3 mod 4 for fast sqrt; increment the field from 2^256 - 2^32 - 1024 (a fast limb structure) until you find a prime field with a non-trivial cube root of unity (also required for the endomorphism) over which a prime-order curve can be obtained, and set B to the lowest value that yields prime order. The result of that procedure is secp256k1. (You can actually drop some of the requirements above and still get the same parameters, but I am pretty confident this was their search criterion.) So: no high-entropy "random" inputs.
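
The field-level criteria are easy to check for yourself. A sketch (Python with sympy for primality; the group order n below is the published secp256k1 order, since recomputing it from scratch would need Schoof's algorithm):

    from sympy import isprime  # pip install sympy

    p = 2**256 - 2**32 - 977   # secp256k1 field prime: 2^256 - 2^32 - 1024 + 47
    n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

    assert isprime(p)                 # prime field
    assert p % 4 == 3                 # fast sqrt via x^((p+1)/4)
    assert p % 3 == 1                 # non-trivial cube root of unity exists
                                      # (needed for the GLV endomorphism)
    assert p > 2**256 - 2**32 - 1024  # inside the fast-limb search window
    assert isprime(n)                 # the curve's order is prime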


This kind of logic is attractive on message boards but makes little sense in the real world.

What NSA needs are NOBUS ("nobody but us") backdoors. Dual_EC is a NOBUS backdoor because it relies on public-key cryptography, using a trapdoor key that presumably only NSA possesses. Any of NSA's adversaries, in Russia or Israel or China or France, would have to fundamentally break ECDLP crypto to exploit the Dual_EC backdoor themselves.

Weak curves are not NOBUS backdoors. The "secret" is a scientific discovery, and every industrialized country has the resources needed to fund new cryptographic discoveries (and, of course, the more widely used a piece of weak cryptography is, the more likely it is that people will discover its weaknesses). This is why Menezes and Koblitz ruled out secret weaknesses in the NIST P-curves, despite the fact that their generation relies on a random number we have to trust NSA about being truly random: if there were a vulnerability common enough that NSA could find it just by rolling the dice during generation, it would be prevalent enough to have been discovered independently by now.

Clearly, no implementation flaw in Windows could qualify as a NOBUS backdoor; many thousands of people can read the underlying code in Ghidra or IDA and find the bug, once they're motivated to look for it.
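
To make the NOBUS mechanics concrete, here's a toy sketch of the Dual_EC trapdoor (Python; a tiny illustrative curve, no 16-bit output truncation, and P derived from Q directly rather than the other way around, all purely for brevity):

    # Whoever knows d with P = d*Q can turn one published output into the
    # generator's next internal state. Curve: y^2 = x^3 + x + 1 over F_23.
    p, a, b = 23, 1, 1

    def add(P1, P2):  # affine point addition; None = point at infinity
        if P1 is None: return P2
        if P2 is None: return P1
        (x1, y1), (x2, y2) = P1, P2
        if x1 == x2 and (y1 + y2) % p == 0:
            return None
        if P1 == P2:
            lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
        x3 = (lam * lam - x1 - x2) % p
        return (x3, (lam * (x1 - x3) - y1) % p)

    def mul(k, pt):  # double-and-add scalar multiplication
        acc = None
        while k:
            if k & 1:
                acc = add(acc, pt)
            pt = add(pt, pt)
            k >>= 1
        return acc

    Q = (0, 1)     # public point
    d = 13         # designer's secret trapdoor scalar
    P = mul(d, Q)  # second public point, rigged so that P = d*Q

    s = mul(17, P)[0]          # honest party's state update: s = x(s0*P)
    r = mul(s, Q)[0]           # published DRBG output: r = x(s*Q)
    next_state = mul(s, P)[0]  # what the honest party will use next

    # Attacker sees only r but knows d: lift r back to a point R, then
    # x(d*R) = x(d*s*Q) = x(s*P), i.e. the next state. (With truncation,
    # you'd brute-force the missing 16 bits; cheap for the key holder.)
    y = next(y for y in range(p) if (y * y - (r**3 + a * r + b)) % p == 0)
    assert mul(d, (r, y))[0] == next_state
    print("recovered next state from a single output:", next_state)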


This is pure paranoia. I'll be the first to criticize NIST and NSA for the damage they've done to standards and to their own credibility.

But don't over-correct. You can't just call anyone who submits to NIST an NSA puppet. There is zero parallel with the Dual EC backdoor.


The NSA is dual-purpose: both a signals spy agency and a signals counterintelligence agency.

And ultimately, what's the difference between a publicly vetted algorithm proposed by the NSA and a publicly vetted algorithm proposed by someone else?

Everyone points at the Dual EC fiasco, but if vulnerabilities are possible either way it seems like throwing the baby out with the bathwater.


> Back in 2015 Juniper pretended to be shocked that the Dual_EC algorithm

To complete your point:

Security researchers solved the mystery around a sophisticated backdoor embedded in Juniper firewalls. Juniper Networks announced on [Dec. 17, 2015] that it had discovered two unauthorized backdoors in its firewalls, including one that allows the attackers to decrypt protected traffic passing through Juniper's devices. The NSA may be responsible for that backdoor. Even if the NSA did not plant the backdoor in the company's source code, the spy agency may in fact be indirectly responsible [due to] weaknesses the NSA allegedly placed in a government-approved encryption algorithm known as Dual_EC. The Juniper backdoor is a textbook example of how someone can exploit the existing weaknesses in the Dual_EC algorithm the security community warned about back in 2007.[1]

[1] Paraphrased from https://www.wired.com/2015/12/researchers-solve-the-juniper-...


The NSA even pushed NIST into supporting a backdoored curve and bribed RSA to make it default in their software. Look up Dual_EC_DRBG.

Prudence sometimes looks like paranoia.

The parallel that matters is that NSA is documented as being inclined to design, allow, and promote backdoors like Dual EC to be standardized by NIST. They have no obligation to break and publish the break as Ward has done here. It's not that the authors of Rainbow are suspect; it's that the result of the process isn't telling us what we hope it tells us. NIST must consult with NSA by law, but the law does not require NSA to help. It certainly does not require NSA to save NIST from standardizing a broken system. The Dual EC backdoor is a documented act of strategic sabotage by NSA and NIST, though we extend the benefit of the doubt to NIST.

What stands between us and additional standardized backdoors is effectively a very small number of smart academics who, relative to NSA, are underfunded and under-resourced. Thankfully we have people like Ward breaking systems and publishing the breaks openly. Unfortunately, we also have NSA-funded people pushing things; Dragonfly at the IETF comes to mind.

NIST standardization does not mean NSA can’t break it. Historically, we know from the Dual EC standardization that they can break it by their own design, and they let the world deploy and use Dual EC to NSA’s own advantage.


>"you can with some effort devise a scenario in which it would have been possible for NSA to influence the curve selection"

Let's try to be clear: The NSA (naming names, among them Jerry Solinas) worked directly with the Certicom engineers to choose the properties of the SECG curves (some of which became the NIST/X9.82/Suite B curves, there's a lot of parallel working/overlap). They helped to perform the computational search to find the seeds, which were manipulated so that the results had certain properties (some of which were concealed and then were misrepresented as "provably random"). That is not a scenario: that is actually what happened.

Given that, and how incredibly opaque the process was, why am I not screaming in horror running away from secp256r1 et al? Because the information I've found (except for the SIGINT Enabling Project's budget) all points to that manipulation and deception taking place to obscure properties of the curves which allowed patents for efficient implementation of the curves - patents the NSA had in return agreed to licence.

Also, they use it themselves, in their most critical systems (they are particularly careful about randomness and side-channels). Is that just a cover? (Are they really willing to sacrifice the IA mission for the SIGINT?) I have no evidence that it is. It probably isn't. It could be. (They also use Dual-EC_DRBG themselves, likely on the basis that since only they have the private key, "nobody but us" can break it - shame it's got a bias anyway!)
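
(On that bias, for the curious: roughly speaking, the root cause is that Dual-EC outputs are x-coordinates of curve points, and only about half of all field elements are valid x-coordinates, so the outputs can't be uniform. A quick empirical check on a toy curve, with illustrative parameters:)

    # Fraction of field elements that are x-coordinates of points on
    # y^2 = x^3 + x + 1 over F_1009 (toy parameters, not Dual-EC's curve).
    p, a, b = 1009, 1, 1

    def is_x_coord(x):
        rhs = (x**3 + a * x + b) % p
        return rhs == 0 or pow(rhs, (p - 1) // 2, p) == 1  # Euler's criterion

    valid = sum(is_x_coord(x) for x in range(p))
    print(f"{valid}/{p} field elements are x-coordinates (~{valid / p:.2f})")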

On balance, I think the NIST curves are probably OK to use for now. If secp256r1 was somehow "backdoored", I have absolutely no idea how, and on balance, I do not think that it is. I think it's more likely that any negative manipulation was to deliberately standardise something fragile: not weak in itself, but easy for a naïve implementation to make weak - hard to implement in constant time, some people convinced to use insecure blinding techniques, etc. (And yes, I agree RSA w/PKCSv1.5 is definitely not better - one may wonder if it's another example.)

So I'm not running away from P256. But I'd like to walk away from it and switch to something more rigid, safer and a bit faster soon - and hopefully we get that out of the CFRG process (to the grandparent comment: Curve25519 is one of the top contenders).

The slides are a bit of out context without the talk, by the way!


I wrote this post before the latest of Snowden's revelations and Bruce Schneier's comments on them[1].

"Prefer conventional discrete-log-based systems over elliptic-curve systems; the latter have constants that the NSA influences when they can."

I'd rather have some smart people take a thorough look at the curves we use before we make ECDHE mandatory. If we need to choose between computationally heavy DHE and possibly backdoored ECDHE, I'm afraid many companies will still pick ECDHE.

[1] - http://www.theguardian.com/world/2013/sep/05/nsa-how-to-rema...


Those aren't vulnerabilities NSA created, unlike Dual_EC, which is.

NIST P-256 is widely suspected to have been subverted by NSA, as outlined in the first link in octoberfranklin's response, page 16 of [1], and [2].

[1] https://www.hyperelliptic.org/tanja/vortraege/20130531.pdf

[2] https://blog.cr.yp.to/20140323-ecdsa.html


They were largely dismissed by "the public" --- and dismissive themselves --- because nobody believed anybody would actually use an expensive, janky PKRNG when far simpler, more performant CSPRNGs were already universally available in operating systems and standard C libraries. The revelation in the BULLRUN leaks wasn't that Dual EC was suspicious --- it had always been suspicious --- but rather that companies were actually using it, because NSA suborned RSA Security into making BSAFE use it.

(Prior to BULLRUN, I'd have been equally dismissive of the idea that people were still using BSAFE, but, no, as it turns out, the industry is a whole lot dumber than any of us expected.)

NSA does actually try to help; it's ostensibly half of their mission (nobody seriously believes the IAD mission gets anything close to 50% of the resources).


Information security is all about probabilities and risk estimation and cost-benefit analysis, so I don't think people's concerns surrounding Dual EC DRBG are unfounded. That the constants are suspect (regardless of who could potentially possess the underlying keying material) and that its performance is worse than similar algorithms is enough risk for many to rationally decide to abandon it---no conspiracy theories required. NSA probably (j/k!!) didn't engineer the POODLE attack into SSLv3, but that's not stopping people from abandoning the protocol (and rightly so)---it's just too unsafe. That said, the spectre of the NSA's involvement in this should concern both US nationals and our colleagues overseas, for political reasons as much as---if not more than---technical ones.

>There is no evidence that US push flawed curves.

"Reuters reported in December that the NSA had paid RSA $10 million to make a now-discredited cryptography system the default in software used by a wide range of Internet and computer security programs. The system, called Dual Elliptic Curve, was a random number generator, but it had a deliberate flaw - or “back door” - that allowed the NSA to crack the encryption."

https://www.reuters.com/article/us-usa-security-nsa-rsa/excl...


I believe he's referring to the argument Schneier is making, not to the entire notion of NSA's SIGINT mission. Dual EC was indeed an instance of NSA harming everyone's security (less, at least in the US, than is broadly supposed, but clearly and deliberately). But it's not an instance of NSA acquiring zero-day vulnerabilities and then using them to harm everyone's security.

The same way Dual_EC_DRBG became a NIST standard, the NSA pulls the strings.

You can't expect a government department to provide robust security to the masses when the rest of the government is trying to prevent that exact situation.

At this point, anything cryptography-related coming from NIST should be considered compromised.


There's an important difference between an opaque "trust us" recommendation where it's broadly impossible to verify the claim (e.g., Dual_EC_DRBG), and one such as this which is fairly anodyne and merely intended to put more weight behind getting people to move forward in their choice of implementation languages.

The NSA's split offensive/defensive responsibility is bad but that doesn't affect recommendations such as this.


This is hardly the first time suspicion has fallen on NSA proposals.

As linked elsewhere in this thread, https://en.wikipedia.org/wiki/Dual_EC_DRBG

