Your HN profile describes you as a Cryptography Engineer, and I'm sure that's true.
You're just not presenting a convincing argument for why it's OK to trust the code but not the data when they're both served out of the same place. Do you understand that they're both served from the same place? If somebody can modify the data, they can just as easily modify the code and make it do whatever they want. Do you understand that?
You're just linking me to things other people have written instead of presenting a persuasive argument. So I'm not going to bother doing what you say. Sorry.
When it comes to cryptography, I don't think the burden of proof is on the critics to prove it's insecure. Everything is best assumed to be insecure unless there's convincing evidence otherwise.
I gather that enough experts in this sort of thing remain unconvinced that it seems fair to call it insecure.
Ex: If someone built a bridge, but wasn't an actual engineer, I would assume the bridge was unsafe. I don't need an engineer to actually inspect the bridge before I make that assumption, and I would probably tell everyone I knew not to use that bridge.
If there is one place where an appeal to authority can hold weight without being seen as a fallacy, it's security. (It's not ad hominem, by the way.)
When you're rolling out a secure system + implementation you want to use algorithms that have been proven, tested and certified as correct. It's a delicate field that can't be approached like your standard software development field. Yes, being a "random" and developing your own encryption algorithm out of the box is bad. Never trust anything like that. Period.
Can't you say the same thing about software that uses encryption in general? For example your browser, yet you still trust it. What you said also applies to critical software in airplanes and in cars like Teslas, yet you somehow trust it without reading the code.
Maybe you should replace the word "Ethereum" with "software": "Software in general makes sense if all the following are true."
I'd argue for not trusting a cryptosystem that requires you to use a particular vendor's CPUs. Open standards and independent implementations at every level should be table stakes.
Lest anyone seriously consider this, keep in mind that the success of cryptography is that you don't need any "blind faith". Properly designed crypto systems are those which don't force you to lean on any pillar that might give way unexpectedly.
"Just trust the server to avert its eyes" is untenable. It's untenable because it's unnecessarily risky. We as a community can do better than to use someone else's design full of holes merely because it seemed to work.
Crypto systems seem to work until they don't. And when they stop working, it's likely you'll never realize. But your adversary will.
Are you just attempting to argue the pedantic point that some theoretical subset of homebrew crypto applications may actually be secure? Because taken as practical advice your position requires a lot of awfully strong assumptions.
I'm confused about why you're even trying to make this argument, Colin.
Because I'm an academic who believes in education and getting details right. And also because I don't want to have people panic unnecessarily.
Name a real system that uses H(m||k) that you trust. H(x||y) MAC schemes are a hallmark of DIY crypto, and DIY crypto systems are the ones that get broken first.
I wouldn't say that it's a hallmark of DIY crypto so much as it is a hallmark of not understanding crypto -- like advertising the use of AES-256 without mentioning the block cipher mode, or treating SSL as a silver bullet.
You're right that I wouldn't trust a system which used SHA256(m || k) as a MAC; but if I was hired to fix such a system, that would be the last thing I'd fix. I'd start by looking for the security flaws.
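For readers unfamiliar with the construction under discussion, here is a minimal Python sketch contrasting an ad-hoc SHA256(m || k) tag with HMAC, the standard MAC construction. The key and message here are illustrative placeholders, not from the thread:

```python
import hashlib
import hmac

key = b"example-key"            # illustrative key, not a real secret
message = b"amount=100&to=alice"

# Ad-hoc construction like the one discussed: hash the message with the
# key appended. Its security leans directly on ad-hoc properties of the
# bare hash function rather than on an analyzed MAC design.
adhoc_tag = hashlib.sha256(message + key).hexdigest()

# The standard alternative: HMAC-SHA256 (RFC 2104), designed and
# analyzed specifically as a MAC.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Verify with a constant-time comparison to avoid timing side channels.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)
```

Note that the two constructions produce unrelated tags; switching a deployed system from one to the other is a breaking change, which is part of why it would not be the first thing to touch in an audit.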
You probably wouldn't! Aumasson isn't arguing in favor of closed-source cryptography. He's just dispelling a particular myth about the security differences between the two types.
"Real-world cryptography often doesn't have security definitions, e.g. AES"
Block ciphers do have security definitions; what AES lacks is a rigorous proof that it satisfies the definition of security for a block cipher. There are different definitions for different notions of security, but that does not mean there is no security definition. It is also untrue to suggest that security parameters are fixed in practice; this is certainly false for public-key cryptography, but Rijndael was designed to support arbitrary parameters, as are many other practical block ciphers and hash functions.
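For concreteness, the standard notion being referred to is pseudorandom-permutation (PRP) security: no efficient adversary should be able to distinguish the cipher under a random key from a uniformly random permutation. In textbook notation (not from the comment above), the adversary's advantage is

```latex
\mathbf{Adv}^{\mathrm{prp}}_{E}(A) =
\left|\, \Pr_{K \leftarrow \{0,1\}^{k}}\!\left[ A^{E_{K}(\cdot)} = 1 \right]
   \;-\; \Pr_{\pi \leftarrow \mathrm{Perm}(n)}\!\left[ A^{\pi(\cdot)} = 1 \right] \right|
```

and the cipher is considered secure if this advantage is negligible for every efficient adversary $A$. What AES lacks is a proof that it meets this definition, not the definition itself.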
"Coming up with a good security definition is hard, the 2013 Turing award was given for one."
Not one definition, but several definitions and an entire paradigm for definitions. The work also set the groundwork for proving that cryptosystems and cryptographic constructions meet such definitions.
Really, the importance of having a security definition cannot be understated. Without a security definition, you cannot have any falsifiable claims about security. If I claim a system without a definition is insecure, you can always refute me by claiming that the system was never designed to defend against my attack -- which is technically correct, because without a definition the system cannot be said to be designed to defend against any attacks.
Also, note that I did not say that Satoshi failed to give a good security definition for Bitcoin. What I said is that Satoshi failed to give any security definition. If Satoshi had given an unrealistic or otherwise bad security definition, then we could have a productive conversation about the definition and about whether or not Bitcoin satisfies it.
"I think it will take a long time before we get a realistic security definition for Bitcoin."
The thing is that we do have realistic security definitions for digital cash -- the definitions just happen to rely on the existence of a central authority that issues the currency, which is a deal-breaker for the Bitcoin community.
You could apply the same logic to cryptographic systems themselves; we have very few absolute proofs of security properties for the cryptography we use for TLS and the like, so they are “imperfect” and may be vulnerable and therefore are snake oil. However I doubt you’d feel indifferent about whether a website you are sending your credit card number to uses HTTPS and stores your payment information encrypted at rest.
A large number of people who know what they are talking about have stated to you in the clearest terms possible that when it comes to cryptography and security systems, it is appropriate to place the burden of proof on the creators. It isn't an opinion, it's a fact agreed on by every competent security practitioner on the planet. If you are going to continue to ignore this, no one is going to take anything you say about cryptography seriously.
It's not about dogma, it's about safety. The fact that you fail to understand that is a testament to your inability to contribute to a meaningful conversation about security.
Which is all fine, and it's all very valuable work of course.
The problem is that when cryptographers say something is "proven secure", that turns out not to mean that nobody can break it, especially in an actual deployed system.
The problem is that your argument is based on false equivalence.
The counterpoint is the classic trade-off between usability and security. I would (as would many) argue that charges against the Cryptocat team of poor cryptographic implementation or indeed knowledge are rational, appropriate and correct.
Charges of unbounded incompetence are irrational, inappropriate and incorrect. The Cryptocat team understand usability, UX, general architecture and programming. That does not chime with sweeping statements of incompetence. Where they fail (you may see this as a fatal flaw; others may not, though it appears many do) is at crypto.
When it comes to real-world attacks so far, it seems that Cryptocat is dangerous, but only as dangerous as using any other TLS website. I fully accept that there is substantial evidence pointing to a lack of understanding of cryptography, but unbounded claims of incompetence only undermine a cryptographer's position.