The RSA CTO's answers are hilarious. He can't really be that clueless as the CTO of a security firm, can he?
That would be incredibly embarrassing in itself (which it already is), but the alternative is even worse (choosing the one with the backdoor on purpose).
Especially the part about KDFs being deliberately slow, and according to him that somehow implies that RNGs should also be slow. Whut? This guy is really a CTO?
"The length of time that Dual_EC_DRBG takes can be seen as a virtue: it also slows down an attacker trying to guess the seed."
If a system's seed is weak, one attack is to try all likely seeds, run them through the PRNG to generate keys, and see if any of the keys work. A slow PRNG indeed slows down this process.
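That seed-guessing attack fits in a few lines. A hedged sketch: `keygen` and `secret_seed` are made-up names, and Python's `random.Random` stands in for any deterministic PRNG with a weak seed:

```python
import random

# Hypothetical device that derives its "key" from a 16-bit seed.
def keygen(seed: int) -> int:
    rng = random.Random(seed)      # deterministic PRNG seeded with the secret
    return rng.getrandbits(64)     # the derived "key"

secret_seed = 12345
public_key = keygen(secret_seed)

# Attacker: enumerate all 2**16 candidate seeds, re-run the PRNG, compare.
recovered = next(s for s in range(2**16) if keygen(s) == public_key)
assert recovered == secret_seed
```

Each candidate costs one PRNG run, which is why a slower PRNG slows this particular attack down linearly.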
For instance, it would have slowed down the attack on the Taiwan Cryptocards, which exploited patterns in the seed that appear directly in the key, reported here: http://smartfacts.cr.yp.to/smartfacts-20130916.pdf
That's a really poor idea. If you're concerned about guessing attacks on your RNG seed, the right response is to increase the size of your seed -- NOT to slow down the function. To put it another way, adding one bit to the seed length is equivalent to doubling the cost of the attack. In other words, a 1000x slowdown in the generator is the same as about 10 bits of additional seed material. Not worth it!
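The back-of-the-envelope arithmetic, for the skeptical:

```python
import math

# Attack cost ~ (number of candidate seeds) x (time per PRNG run).
# Slowing the PRNG down 1000x buys roughly log2(1000) bits' worth of security:
slowdown = 1000
equivalent_bits = math.log2(slowdown)
print(f"{slowdown}x slowdown ~ {equivalent_bits:.2f} extra seed bits")
# ...whereas adding 10 actual bits to the seed multiplies the attack cost
# by 1024 and costs the legitimate user essentially nothing.
```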
No, it wouldn't have slowed the attack on the Taiwan Cryptocards one bit. That and the related "Mining Your P's and Q's" research involved deriving the private keys from the public keys using bulk factorization.
They did not need to simulate the operation of the poorly-seeded PRNG that generated them.
They got the first 103 keys with batch GCD. After that, they found many more keys by looking at patterns in the keys and doing trial division by keys that were similar to the patterns. A better PRNG would make that harder (to reverse-engineer the patterns in the seed) and a slower PRNG makes it slower.
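The batch-GCD idea is easy to demo naively (toy 7-bit primes; the real attack uses product and remainder trees to get quasilinear time over millions of keys, which this O(n²) sketch does not attempt):

```python
from math import gcd

# Five toy "RSA" moduli; the first and third were generated by a badly
# seeded RNG and share a prime factor.
moduli = [101 * 103, 107 * 109, 101 * 113, 127 * 131, 137 * 139]

leaked = {}
for i in range(len(moduli)):
    for j in range(i + 1, len(moduli)):
        g = gcd(moduli[i], moduli[j])
        if g > 1:                      # shared prime => both keys factor instantly
            leaked[i] = (g, moduli[i] // g)
            leaked[j] = (g, moduli[j] // g)

print(leaked)   # keys 0 and 2 are broken; no PRNG simulation required
```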
A better (i.e., not completely utterly horribly broken) PRNG would have made it impossible to observe and associate patterns in the output, even with poorly seeded entropy. There's no reason such a PRNG needs to be any slower than a fast cipher or hash function such as AES or SHA-2.
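For the record, a hash-based DRBG fits in a dozen lines and runs at SHA-256 speed. This is an illustrative sketch only, not the NIST Hash_DRBG and not reviewed; use a vetted library in practice:

```python
import hashlib

class HashDRBG:
    """Toy hash-counter DRBG sketch: output blocks are SHA-256 of the
    hidden state plus a counter, and the state is ratcheted forward."""
    def __init__(self, seed: bytes):
        self.state = hashlib.sha256(b"init" + seed).digest()
        self.counter = 0

    def random_bytes(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            out += hashlib.sha256(
                self.state + self.counter.to_bytes(8, "big")).digest()
            self.counter += 1
        # ratchet so earlier outputs can't be recomputed from a later state
        self.state = hashlib.sha256(b"next" + self.state).digest()
        return out[:n]
```

Given a good seed, outputs are as hard to predict as SHA-256 is to invert, and there is no structure for an observer to correlate across keys.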
I've been a big advocate of things like PBKDF2 and such to slow things down where appropriate. But except for very specialized circumstances (and an RNG isn't one of them), we want cryptographic operations to be fast. In general, we want RNGs (and most things) to be fast and efficient.
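Password hashing is the place where slow is the point. A minimal stdlib sketch; the iteration count is illustrative, not a recommendation (tune it by benchmarking):

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 600_000):
    """Derive a stored verifier with PBKDF2-HMAC-SHA256 and a random salt."""
    salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, dk

def verify_password(password: str, salt: bytes, iterations: int, dk: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, dk)   # constant-time comparison
```

Here every guess costs the attacker hundreds of thousands of HMAC invocations, which is exactly the property you do not want in a general-purpose RNG.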
And in particular, TW Cryptocards analysis never had to run the RNG. Just had to look at 2 million public keys already out there.
His answers are post-hoc justifications. The real reason they picked it was because they wanted to make money on sweet, sweet government contracts, and the easiest way to do that is to just do everything NIST says to the letter.
You're assuming that the CTO is technically competent. I've found over the years that even CTOs who once were technically competent either get lobotomies, or suffer from hypoxia in the thin air at the summit of major corps.
Or management exists to cover up bad hiring practices. Take your pick. Either way, CTOs saying dumb things seems to be normal.
He's in a tricky position. Imagine somebody at RSA deciding years ago to follow the "strong suggestion" of the NSA, accompanied by millions of dollars, to choose a certain algorithm as the default. Now the current CTO has a choice:
1. say "we deliberately built the backdoor into our software, please never buy our products again if you value your security" and go live the rest of his life in a Buddhist monastery in Tibet,
2. say some embarrassing BS which gives him a veil of plausible deniability while raising doubts of his personal competency, but who cares, he's a C-type, they don't have to know all the details, right?
This was actually going to be the first thing I posted when I read this link.
tptacek repeatedly assured everyone that this was absolutely not a big deal and meant nothing because nobody in their right mind uses the standard.
Except whoops, it's used by one of the largest players in the field, if not the largest. I'm sure he'll have a bunch of really great replies that manage to simultaneously say why this still isn't a big deal and passive-aggressively insult our knowledge of the situation. I look forward to reading what he has to say.
I don't understand his constant line in the sand on this issue (amongst other things). You basically illustrated exactly how he structures his arguments on this topic.
If I was smart enough to understand what this comment meant, I'd respond to it. Maybe you could clarify? I can't find a way to reconcile it with its parent comment in a way that gives me anything to discuss.
You seem to have already addressed it in another comment:
>Thanks for the link. I would have just gone on confirming my own biases without it.
This is incredibly evident in your comments on Dual_EC and sometimes happens on other comments as well. You draw a line in the sand and argue around it constantly, eventually bleeding into passive-aggressive attacks on the knowledge of others.
I find it valuable to hear from others outside of my industry (which is not crypto) who have no expertise in that field. They often have a fresh and agnostic look at something. Do you think all these comments slinging mud at you - however inaccurate they may be - were born from nothing?
> I have a different explanation for why I always seem to be at odds with people on NSA topics, but I'll wait to provide it.
I'm very interested in that actually. I'm often curious what shapes people's perspectives on these issues, particularly if it doesn't align with any obvious incentives. I always thought that you must have family in law enforcement or something, but I'd love to know the actual reason.
* NSA topics are heavy on computer security issues and legal issues.
* I'm professionally involved in computer security, like you, and have an amateur interest in the law (I'm considering law school at some point).
* Message board nerds have a lot of weird, wrong beliefs about computer security and the law.
There is a political difference between me and HN: I'm not an anarcho-capitalist (that silly "world's smallest political test" thingy puts me dead center in "left liberal"). But politics have little to do with where I end up on the NSA threads; it's things like not understanding (or really, having even skimmed) NIST crypto standards, or not taking the time to understand what the 4th Amendment means. The things that get me into "trouble" here have more to do with taking the time to actually read primary sources than anything else.
We probably disagree about NSA a lot less than you think we do.
To be fair, the insanity of anyone using it was never called into question. We agree it shouldn't be used. As we all know, science has yet to establish any correlation between sanity and what companies are actually doing in the real world for security.
Edit: Actually, I take that back. I have no problem believing that RSA Security are perfectly sane. Would we be completely shocked if the reason they chose a questionable default was due to coercion from the spooks? Only NSA has the keys, so it's a pretty safe backdoor.
You know, props to China, had to go through the work of owning RSA's seed server last year just to level the playing field. They get so derided in the media for doing that, but it seems unfair when the other team has a backdoor. Who is the real "Advanced Persistent Threat"?
Sorry, that was worded incorrectly. I meant that we all agree it was insane to use it. The contention was over whether or not it was in use, not whether it is a good idea to use.
I should have said calling into question the insanity, will edit.
But I also expressed the view that no one would have used this. I guess it makes a lot more sense now: It seemed weird to put it in the standard, as no one was going to just use it. I'd been guessing that they hoped that it would be made an option and then they could do some negotiation attack to force it. I was missing a more obvious explanation: Someone was already willing to ship it, but they wanted the plausible deniability of it being a standard, because it looked too suspect otherwise.
I'm surprised by this. We look at a lot of stuff. OpenSSL is far and away the most common crypto library we see. Commercial crypto libraries are way, way down the list, after "reimplemented all of elliptic curve by hand".
Thanks for the link. I would have just gone on confirming my own biases without it.
That, I think, is indicative of this community. I come across it a lot, and there are some insanely high profile, non-US organizations that rely on it. SAP being an example that I'm ok with mentioning.
An interesting data point from the boilerplate about RSA in a press release [1]:
"With approximately a billion RSA BSAFE-enabled applications in use worldwide, more than nine million RSA SecurID authentication users and almost 20 years of industry experience, RSA Security has the proven leadership and innovative technology to address the changing security needs of e-business and bring trust to the new, online economy."
I know this submission has been flagged off the front page, but I'm curious on your opinion if this number is misleading.
Let me just say that when my brain tries to survey the landscape for products that use crypto, it mentally discounts all the SecurID tokens; it doesn't occur to me to think about the RSA in-house products that use BSAFE. So yes, there's another group of deployed products that use commercial crypto libraries, because they're sold by a company that also owns a commercial crypto library.
My reasoning about this mostly comes from the fact that products usually don't use commercial crypto libraries.
Oh man, you DELETED your comment! I was replying and lost it, so it will be reproduced here due to quoting, as it is important to me that your backpedal of the century be committed to the historical record.
--
First off, my comment wasn't out of spite or anything. I would call it a friendly jab. I'm guilty of reaching temporary frustration in some previous discussions with you over civil liberties and privacy, which may have colored a few comments, but I totally respect you and don't generally disagree with you on technical matters. I'm not even going to pretend I have a leg up on you anywhere related to security. I'm sure when the day comes that there is an obscure subtopic of security for which this is the case, it will quickly become known though! (The civil liberties issues are another story, because your opinion is in fact wrong on all issues related to civil liberties, along with seemingly everybody in your "must read" list. Yes, that's a joke. Kind of.)
> Nobody does use it. How important a product do you think this is? Isn't it basically a commercial packaging of rsaref?
> Name a piece of crypto technology that you or, say, Moxie Marlinspike or, I don't know, Jacob Appelbaum relies on that uses Dual_EC.
I don't know who uses it, but that's kind of the point. I don't think anyone can really have a good idea of who uses it or who ships it as a default, we can just know who doesn't. It's in the standard, so my assumption is that it's not unlikely that someone chose it, perhaps in a proprietary implementation in corners of the internet that are not the most obvious. The fact that RSA sell it strongly reinforces my suspicion that it's in more places than you think.
> What's frustrating are the people who insist on taking the wrong message away from what I'm saying. I'm not defending Dual_EC. I imagine that I have the exact same perspective on it that Matthew Green does.
I don't think anyone thinks you're defending it (I certainly don't), just that you think that it's not a concern worthy of paying attention to. On the other hand, Matthew Green seems rather disturbed by the recent revelations. The interest in this topic involves more than a specific backdoor that got into a widespread implementation and who isn't likely affected by that particular instance.
> Also: how was this comment helpful? Yours will obviously be the top comment on this thread, and it's basically about, what are the implications of this blog post on the 'tptacek HN persona?" Is that all you have to talk about?
Yes, that was all I had to talk about. It was just a casual comment, and I didn't intend nor expect it to be the top. But I did have a point:
You are the most well-known commenter around here, and everybody looks to you for crypto and security expertise. Sometimes you get rather adamant about whichever position you adopt, which might mix opinion in with fact from time to time. You know what you're talking about and you're usually right. However, the forces that govern our universe guarantee that if one makes bold and authoritative claims on the internet that turn out to be less than correct, the strength of the claim and the number of times it is repeated are proportional to the promptness with which someone else will present something to challenge the claim. (Edit: I wrote this before noticing all of the top comments in this thread came to say the same thing. Come on, you have to chuckle...) I know you understand this because you do it all the time :P Do you want to hear about my 100% undetectable rootkit?
Anyway, I just disagreed with your dismissive comments on other threads as though it's nothing that we need to discuss. I get why one of your missions is to be the Outrage Police, but I think you're overzealous about it from time to time. I know, I'm sure I'm overzealous with my pitchfork as well...
> How many threads have you and I managed to find something to argue about in information security and privacy? How many times have I top-commented a thread about you? I don't because wow, is that ever boring.
I'm not really sure why you're making it personal. For one thing, I'm not important enough to get a top comment in a thread I haven't yet posted in ;) I don't think I have a habit of dominating those discussions, nor do I think I have enough credibility to do so. It probably has more to do with that than being boring. But why do your questions take for granted that your commenting habits are the benchmark for appropriate behavior? Is that not just a tad arrogant?
> Later edit: BSAFE is apparently used in some way by some popular consumer electronics products that I've never taken a close look at, so I was wrong about that. My point about the pervasiveness (or lack thereof) of Dual_EC stands.
I deleted my comment because I was just helping you make this thread be about me, which is the most boring thing you could possibly make it out of.
I have no idea what the "100% undetectable rootkit" thing is about; if you're referring to the talk Nate Lawson, Peter Ferrie, and I did about Joanna Rutkowska's "Blue Pill", our talk made the exact opposite claim --- that virtualization did not make rootkits undetectable.
Look: you don't offend me. I do not mind if you think I'm wrong about stuff. I'm just frustrated because I like Matthew Green's blog, and your comment hijacked the thread. I'm sure you didn't mean it to, but it was obvious to me that it would.
No, you might want to re-read what I said about the rootkit thing. I'm saying you understand the compulsion to "call out" a bold claim. The joke was supposed to be me claiming that I have an undetectable rootkit, and you will then be sent on a highly publicized mission to debunk that, which was in fact referring to the Blue Pill Drama. That's actually where I first heard of your company, and when I first came to HN and saw your name, I recognized you as "the rootkit debate guy with the company that I sometimes accidentally call Monsanto". </joke explainer>
I'm a bit surprised that you aren't just saying "haha, ok, looks like I called that one wrong." You actually say that you don't care if I think you're wrong. You are not a good sport, sir.
I don't feel like I detracted from Matthew Green's blog by commenting on an unrelated site. Also, I'm going to play the "everybody else came here to say the same thing" card to dodge any guilt.
But who are you kidding? Every crypto thread on HN is about you, whether you like it or not. Most HN readers hit the comments link before seeing the article, just to see what you have to say about it. I know because I'm one of them. That's not a complaint either, I rely on the expertise of others to balance claims put forth in articles being circulated, and you and marshray are awesome commenters for that reason, because you kind of act as a bridge between academic crypto people and non-crypto security people.
Also: that's not why we did the Blue Pill talk. It drives me a little nuts that people (incl. Joanna Rutkowska) thought it was.
We did the talk because it was a fun talk. All the code for that talk was kernel code, much of it coding directly to MSRs, looking for fiddly places where the presence of a hypervisor would queer measurement results. And we came up with a bunch of cool ideas! And, it turns out we're (mostly) right. But it seems like all people wanted to pay attention to was the drama.
Anyways, I'm explaining things because I can't resist explaining them, not because this is an important issue for us to work out.
To be fair to tptacek, he was far from the only crypto person saying that...and for good reason. In the absolute best case the algorithm was known to at least be slow and complex to implement, two things that don't usually lead to widespread adoption. The news that RSA had it as the default is interesting regardless.
Serious question: what's the difference? I would have followed the definition for backdoor since it relies on particular secret information relating P and Q that the NSA might have—but where should one draw the line between backdoors and vulnerabilities? Intent?
An intentionally introduced vulnerability can be considered a backdoor, even if it's not a matter of saying "open sesame" to open the so-called backdoor.
Nerd's law: A commenter C declares that concepts X and Y are completely different. Because parent commenter P failed to make this point, P is intellectually inferior to C. Hence C > P for all X and Y.
It's a PRNG based on a trapdoor function where apparently the NSA has the key. With that key they can recover the RNG state from just a small amount of it.
That's a backdoor by most descriptions. This isn't just a bug.
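That trapdoor is easy to demonstrate on a toy curve. A sketch with entirely made-up parameters and no output truncation; the real Dual_EC uses P-256 and truncates the top output bits, which only costs the attacker a small extra guessing step, but the trapdoor algebra is the same:

```python
# Toy Dual_EC-style DRBG over the tiny curve y^2 = x^3 + 2x + 3 (mod 1019).
p, A, B = 1019, 2, 3

def inv(x):
    return pow(x, p - 2, p)

def add(P1, P2):                       # None plays the point at infinity
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    m = ((3 * x1 * x1 + A) * inv(2 * y1) if P1 == P2
         else (y2 - y1) * inv(x2 - x1)) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def mul(k, pt):                        # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1: acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

# Find any base point; the "standards body" publishes Q and P = d*Q,
# where d is the secret trapdoor only the constants' author knows.
G = next((x, y) for x in range(p) for y in range(1, p)
         if (y * y - (x ** 3 + A * x + B)) % p == 0)
d = 5
Q, P = G, mul(d, G)

def dual_ec_step(s):
    r = mul(s, P)[0]                   # intermediate value
    return mul(r, Q), mul(r, P)[0]     # (output point, next internal state)

def degenerate(s):                     # skip seeds that hit the point at infinity
    sP = mul(s, P)
    return sP is None or mul(sP[0], Q) is None or mul(sP[0], P) is None

s = next(s for s in range(1, 200) if not degenerate(s))
out_point, next_state = dual_ec_step(s)

# The attacker sees only out_point, but knows d:
#   d * (r*Q) = r * (d*Q) = r*P, whose x-coordinate IS the next state.
recovered = mul(d, out_point)[0]
assert recovered == next_state
```

One observed output plus the trapdoor scalar yields the full internal state, and with it every future output of the generator.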
It's a backdoor. The original paper describing the hole is (helpfully) entitled "On the Possibility of a Back Door in the NIST SP800-90 Dual Ec Prng"[0]. It's no longer a "possibility".
Uh, yes it is still a "possibility". Unless you have a source I missed, which is possible but please actually cite it, all we learned about Dual_EC from the NSA leak was that the algorithm was designed at NSA. Lots of crypto is designed at NSA; it isn't all presumed to backdoored.
I'm not sure exactly what you're talking about. When the initial constants for Dual_EC_DRBG were generated, the NSA had the opportunity to generate them in such a way that they would be able to predict all future outputs after observing just a few bytes of output. I don't see how this (1) doesn't qualify as a potential back door and (2) means that the NSA 'could not' insert a back door. The back door here is the knowledge of the 'secret key' that they could know (by virtue of the constant generation). Of course, it's just an alleged back door here; we don't know the truth.
By the way, the 'less-than-technically-competent journalist' here is Matthew Green, a cryptographer. So I think he's plenty technically-competent. Also, they're called elliptic curves, not 'elliptical' curves.
He wrote this in 1999; ECC was at the time sort of a novelty, its headline being "a way to do cryptography with smaller key sizes", as if ECC were mostly a way to fit crypto onto memory-constrained devices.
Schneier and Ferguson wrote _Practical Cryptography_ in 2003. It has virtually no mention whatsoever of ECC cryptography, despite ECC's already significantly increased importance by then.
In 2013, I think it's fair to say that the balance of expert opinion on ECC vs. RSA is sharply against Schneier's take on ECC. RSA is on the way out, and ECC is what's going to replace it. Equally significantly, crypto constructions that use signing and key agreement but not public-key encryption/decryption operations are modern designs; designs where a long-lived RSA key is used to wrap and unwrap secrets are dated. ECC is useful for the modern constructions and less so for the dated ones.
I don't think Schneier's writing has tracked these shifts well. I think his books are very light on, for instance, forward secrecy or deterministic signing. His research results never focused on public key encryption and never touched ECC (it's worth noting that much of his best known research was also done with John Kelsey, who is at NIST --- coincidentally, one of Schneier's best-known designs, with Kelsey, was the Yarrow CSPRNG).
Also: Schneier has a close working relationship with Niels Ferguson, who was one of the researchers that spotted the weirdness in Dual_EC. So it's possible that Schneier's concerns over the "constants" in ECC all stem from that. (It's hard to say; he could also be referring to the random seed in the P-224 and P-256 curves).
I don't think readers have any particular reason to believe that Bruce Schneier is an authority on ECC.
Cryptography is not one topic. It's actually a little silly to suggest anyone like Schneier could be authoritative on all crypto concepts. Look at the research output of famous cryptographic researchers (Schneier has some research, but is not really a top-tier researcher) --- it tends to specialize!
You're missing context because it was deleted by the comment's author. It was claimed that Linus Torvalds was "the authority" and that articles were written by non-technical reporters. Schneier is clearly not a non-technical reporter.
No claim was made by anyone that Schneier was the one-and-only security god either. Some people worship the ground person P walks on, but it's still just ground. Celebrity doesn't make 2+2 = 5, only for very large values of 2. I do think there is a common antipattern of wanting to rely upon one technology or one authority as a "security oracle," but this goes against holistic, defense-in-depth thinking.
The other points are fine, just non-apropos.
But since you went there: Does the lesser elucidatbility (yep, that's not a word, but a self-referencing pun) of ECC for the average joe programmer like myself make it any more secure than picking longer RSA keys? ECC seems so much easier to screw up in subtle ways that only leet mathematicians can grasp. Also, the change seems like change for churn's sake to sell more consulting. I used to work with big 4 folks, and there's always a joke about the latest fad that needs selling, so I'm biased against popular, unjustified change.
(1) The reality is you probably don't understand all the implications of the RSA problem as well as you think you do either.
(2) "You" are just as likely to screw up RSA; the point is, "you" shouldn't be implementing cryptography directly at all, but rather using a well-vetted library. Both RSA and ECC have many viable free options for that.
(3) The reason RSA is going away and ECC is replacing it is that RSA at acceptable levels of performance is getting too weak, and RSA at acceptable levels of security is too slow for mass deployment.
(4) It's hard to take "churn" seriously when change happens once over the course of 15 years.
(5) There is no cabal of consultants who get rich selling cryptosystem designs; for instance, for the most part, browser cryptography, email cryptography, and web site cryptography aren't implemented by consultants at all. (If you're wondering: we're not that kind of consultancy.)
Interestingly, Schneier sounds more reasonable in 1999 than now (with relation to ECC; don't hate me). That essay was pretty good: elliptic curves were relatively new on the block (but really, not even 10 years newer than RSA), and not particularly well-studied in the cryptographic setting (mathematicians were more interested in other elliptic curves, e.g., Fermat's Last Theorem).
It's been almost 15 years since then. No new classes of weak curves have been discovered since then (the last one was anomalous curves, around 1998), nor have there been advances towards a general notion of smoothness in the elliptic-curve setting §, despite intense study. It seems unlikely that Solinas would have been able to, in 1999, slip a weak curve past everyone for so long, given the attention the problem has gotten in the last decade. All in all, I think Schneier may be doing more harm than good with those recent ECC comments.
§ Summation polynomial-based approaches are the exception, but they are not general. They only work over extension fields, including binary fields, and are mostly impractical attacks.
Would you trust a computer security company that didn't hash its users' passwords on its web site, and instead stored the plaintext passwords encrypted in its database, with the keys to decrypt them on its server, because it claims that "Your data are encrypted on our server, if you request the password to be sent to you by email the system knows how to decrypt the information and it will send you the Email. This is for customer convenience as many customer do not wish their password to be reset each time they have a problem."
Would you trust a computer security company that when you reset your password on their web site, sent you a new password that was literally the same as your email address that you signed in with?
If this company sold closed source encryption software, would you trust that the software was competently written and did not have back doors, if the president of the company defended their actions of not hashing passwords, and of resetting passwords to their user's email addresses?
What if the president of that company had been prosecuted for computer crimes in the past, and had spent time in jail for it, because after he was first caught, he went right back to phone phreaking and got caught again?
Would you trust the president of the company, who is a convicted felon, who fraudulently made a lot of money by computer crime and got caught, but had most of the charges dropped and his sentence reduced, not to have made a deal with the government and promise to return their favor of giving him a more lenient sentence in exchange for certain favors in the future?
His company came out with a "secure" voice encryption product, and then a previously unknown anonymous hacker reviewed the product and its competitors, and wrote a suspiciously positive review of it, claiming it was the only one he couldn't break. His company then published a press release trumpeting the favorable review, right before a big mobile security conference.
A suspicious security researcher baited the anonymous hacker to post on his blog, and it turned out he was using an ip addressed registered to the security company whose product he'd written a favorable review about.
When confronted with proof, the founder of the security company denied astroturfing, denied knowing the hacker, and implausibly claimed the anonymous hacker must have been using his company's anonymous browsing service.
The same security company founder who spent three years in jail for phone phreaking, because he was convicted of hacking and fraud for profit. The same security company that stores its users' passwords in unhashed, unsalted plain text encrypted with a key on its server. The same security company that resets its users' passwords to their email addresses. The same security company whose founder claims that "many customer do not wish their password to be reset each time they have a problem" justifies not hashing passwords and resetting passwords to "convenient" email addresses. The same security company whose founder refuses to change his "unconventional" security policies after being confronted with these facts, and instead makes ridiculous excuses for his incompetence, and continues to betray the trust of his customers even after he's been confronted with it.