That kinda doesn't matter when the amount of compute is trivially reproduced by '90s CPUs
You can just have a bunch of nodes and vote out bad results. You could ask, "you know that the 20 different machines you run don't have a hardware backdoor, but how do you know the software running Ethereum isn't backdoored?" There is no protection against the compute engine being backdoored, just the fact that you'd need to get that backdoored software onto most nodes.
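The voting idea can be sketched in a few lines. This is a hypothetical illustration, not how Ethereum actually reaches consensus: each node reports its result, and a minority of backdoored machines can't force a wrong answer as long as most nodes compute honestly.

```python
from collections import Counter

def majority_result(results):
    """Return the value reported by a strict majority of nodes,
    or None if no majority exists (toy model, not a real consensus protocol)."""
    counts = Counter(results)
    value, votes = counts.most_common(1)[0]
    return value if votes > len(results) // 2 else None

# 20 nodes run the same computation; two are compromised and lie.
reports = [42] * 18 + [13, 13]
print(majority_result(reports))  # → 42
```

Real systems need to handle equivocation and Sybil attacks too, but the core point stands: the attacker has to compromise a majority, not a single machine.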
Yes, that's true, and that is a legitimate concern. I personally would not trust Intel's RNG because it is implemented in a way that deliberately hides some of the underlying implementation in a way that would make it impossible to tell if it's backdoored. That is my original point: it is much harder to tell if hardware has a back door than software.
At this point, hardware crypto engines are a "must have" feature for many purchasers. Their desire for performance outweighs concerns about malicious actors backdooring the crypto engine.
And if the crypto engine is compromised, how small a leap is it to believe there is microcode to backdoor a general OS or crypto library?
Still, it would be very hard to make sure that the provided code is indeed the one running on the suspect machines. The only way I see to make sure of that would be to provide tools to compile and flash the hardware, which doesn't make much business sense. This also gives no protection against silicon-based backdoors that have nothing to do with OS code.
In theory, yes. In practice it is not realistic to implement a plausibly deniable hardware backdoor targeting all CPUs being manufactured while keeping the schematics and tapeout open.
While the same CPUs are even fabbed in different locations around the world.
While also going undetected for years and while none of the engineers involved blows the whistle.
In short: no. You can get away with a targeted attack, but nothing so massive.
The encryption works but the endpoints are compromised. The Intel Management Engine and the AMD equivalent are good examples of how modern hardware is complex enough to hide backdoor hardware in a system.
Even scarier, researchers have shown that there are ways of backdooring CPUs via transistor doping so that even if the manufacturer suspects a backdoor, it may still be very difficult for them to find it: http://www.techrepublic.com/blog/it-security/researchers-cre...
There will always be trust-related issues, but that doesn't mean we shouldn't improve the overall situation. Currently it's possible there are all kinds of backdoors in: the hardware itself, firmware, drivers, and some closed-source software. If we could limit it to only the hardware itself, that would be a huge win.
Or you can get your Raspberry Pi, some desktop from the '90s, and a random used laptop and get 1000x the "secure computing power" that couldn't all reasonably be hardware-backdoored either.
It's also possible to build systems that are correct, such that no adversary with any amount of resources could find a security hole. Many CPUs in the past have been correct. Probably most major commercial ones before 2000 were. So it's not mathematically impossible.
Well, somewhat a solved problem if your hardware is a uniform combinatorial logic and routing mesh (E.g. a FPGA), not exactly energy efficient.
But I think this is a weird diversion: That I can't add (or pay to add) advanced security features in my CPUs even at substantial (but sane) costs is a clear reason the current closed ecosystem is inferior to an open source one.
This remains true even if an open CPU design were not cost-effectively auditable at the hardware level; it's an orthogonal issue (and even more so: closed CPU designs are inherently less auditable if hardware backdoors are your concern). An open design doesn't have to be better in every possible way to be better in some.
Good call on this. If you can't trust the CPU you're running on, you can't trust anything at all. The proper solution is to find a supplier you trust and go with them.
Pulling useful entropy away from the OS's RNG functions is the best example of an ignorant knee-jerk reaction to this security problem.
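The sane defense is to mix a hardware RNG into the entropy pool rather than trust it alone. Here is a toy sketch (not a real DRBG, and `hw_rng` is just a hypothetical stand-in for something like RdRand): hashing all sources together means a backdoored source can't make the output weaker than the strongest honest source.

```python
import hashlib
import os

def mixed_random_bytes(n, hw_rng=os.urandom):
    """Combine multiple entropy sources with a hash (illustrative sketch).

    `hw_rng` stands in for a hardware RNG such as RdRand. Its output is
    mixed in, never used on its own, so a backdoored HWRNG can bias at
    most its own contribution, not the final output.
    """
    pool = hashlib.sha256()
    pool.update(os.urandom(32))   # OS-collected entropy
    pool.update(hw_rng(32))       # hardware RNG output, mixed in, not trusted alone
    digest = pool.digest()
    # Expand to n bytes by counter-mode hashing (toy construction)
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(digest + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]
```

This is roughly the posture the Linux kernel took: keep RdRand as one input among many instead of replacing the pool with it.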
Unless they've actually found a real smoking gun, probably not even close. Besides, even if there isn't a backdoor in Intel CPUs, they've definitely tried.
When I wrote that, I was under the assumption that people would use open-source CPU designs from OpenCores for convenience. With a little help from Xilinx and Altera, it wouldn't be too hard for a government to have the synthesizer detect when an OpenCores design is being used and surreptitiously put a backdoor in. I admit that it would be hard to write software that could simultaneously detect that a completely unique CPU design is being synthesized, figure out its instruction set and weaknesses, and finally create a hardware backdoor that could circumvent any software written for that device.
As always, there's a tradeoff between cost and security. How many hardware hackers are good enough (or motivated enough) to design their own brand new ISA and CPU design, then bootstrap a compiler and OS for their homemade CPU? Maybe 0.001% of the population, if that.