Your second paragraph pretty much nails my thinking exactly. Memory safety is a very good thing, but having to manage memory/the stack/etc is kind of an IQ test that filters out contributions of generally low quality.
At least that's the theory. Thinking about it, there are probably people out there who can nail things like memory management but are terrible at, e.g., higher-level application architecture, to such an extent that they're ultimately not competent engineers anyway. It's possible, at least.
Oh jeez, anecdotes from people in academia, please.
>Some people know how to manage memory and others don't. If you know how to manage memory, even C is a safe language. If you don't, not even the most formally-verified garbage-collected language can save you.
Who knows how to manage memory then?
Windows engineers? Chromium engineers? Linux engineers?
All of them have a history of failing at this. Stop with this mindset.
I think the strongest point here is that memory safety is not the cause of the worst bugs. The reason this is worth pointing out is that people in the Church of Rust often make it sound as though, if only you had memory safety, bugs would be impossible.
I'm not sure what "memory safety" has to do with skill regarding "application architecture." In fact, I'm not quite sure what you mean by that whole bit.
Memory safety isn't a bad thing. But maybe pushing it as the thing will end up making it harder for newer systems programmers to be comfortable dropping into "unsafe-Rust" when it's necessary out of fear, or harder for them to learn the lessons that C and C++ folks have had to when they do have to use "unsafe-Rust". We just don't know what the impact will be in that regard. It's part of being a young language--it just needs time to develop.
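For readers who haven't seen what "dropping into unsafe-Rust" actually looks like, here's a minimal sketch (the function name and use case are illustrative, not from any particular codebase). The idiom is to confine the `unsafe` block to the one operation the compiler can't verify, and keep the caller-facing API safe:

```rust
// A safe wrapper around an unsafe operation: reading a value through a
// raw pointer offset. The bounds check is done by us, not the compiler,
// which is exactly the discipline C and C++ programmers already practice.
fn get_at(slice: &[u32], idx: usize) -> Option<u32> {
    if idx < slice.len() {
        // SAFETY: idx is in bounds, so the pointer arithmetic and the
        // dereference are valid for the lifetime of `slice`.
        Some(unsafe { *slice.as_ptr().add(idx) })
    } else {
        None
    }
}

fn main() {
    let data = [10u32, 20, 30];
    assert_eq!(get_at(&data, 1), Some(20));
    assert_eq!(get_at(&data, 5), None);
}
```

The point of the fear argument above is that writing the `// SAFETY:` justification correctly requires exactly the manual reasoning that safe Rust lets you skip.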
Are you saying that memory safety is valuable or not? If you do think memory safety is valuable, then I don't see how you're disagreeing with the author. Nobody is claiming that memory safety eliminates all bugs.
I suspect a lot of the toxic discourse originally comes from folks not understanding each other's use cases and priorities.
For some, memory safety is a means to an end: delivering value. In this perspective, one must weigh the benefits of improving memory safety against the complexity costs of proving memory safety. In some situations, the improvement is too small and the added complexity is too much. Some apps and some embedded situations come to mind. Languages like Odin and Zig can be stellar in these situations.
For others, memory safety is a responsibility, and upholding it is a basic requirement of modern software no matter what costs we need to pay for it, and if the world would just accept that, then we as a society could move past the days of rampant vulnerabilities and odd memory bugs.
Both sides are equally compelling, to me at least. What I hope people can learn is that it really depends on the situation. There is a place and time for both approaches. Once we can accept that, I think the toxicity will dissipate.
I don't think you're wrong but I do think putting all the burden and emphasis on the last mile (high-level software libraries and application software) will not be enough. Memory safety needs better support from operating systems and hardware, both of which have improved but have been much slower to change than application programming. Systems programming, sitting in the middle of the two, is caught between a rock and a hard place.
I think it's a bit funny that, in an industry that (supposedly) prides itself on "meritocracy", there are many people who refuse to use (or learn) performant memory-safe languages, even though memory-safe code is always better than memory-unsafe code (in terms of resource usage, reduction of bugs, etc., etc.).
Memory safety is also a helpful productivity tool, because it eliminates bugs, even if those bugs don't turn into security vulnerabilities. Whether that is worth the mental overhead of the lifetime/borrow system is something you have to decide. Personally, I find it's an easy choice.
I couldn't quickly figure out what sort of applications this person writes for a living. But for 95% of applications I can think of: whatever, you do you.
Lots of people have pointed out flaws in his arguments here. There are always at least two collaborators on a codebase: you, and you in two years' time. You can enforce your own documentation best practices, but it's always possible to make mistakes unless you have a provably correct way of enforcing those best practices. I'm sure there are other very valid criticisms.
But in the end, there's a good chance that 100% guaranteed memory safety actually doesn't matter for the software they write. If you told me this person writes software for pacemakers or ABS systems or something similarly safety-critical, I'd take issue. But beyond that, it's a sliding scale from merely irresponsible down to a mere inconvenience.
You don't have to care about memory safety, but if you don't you probably shouldn't speak for "systems programmers" as a whole. Systems need to be safe.
I'd probably agree with that characterization. It's not that people are out there specifically claiming memory safety solves everything, but rather that the 'public consciousness' seems to have utterly forgotten that any other kind of safety exists.
As a result, you get languages like (and mainly) Rust being held up as the paragons of program correctness and safety in general. Any discussions of program safety end up being entirely about memory safety.
It's just an endless horde of comments/blogs/posts/etc. of people conflating memory safety with correctness and not even acknowledging - or possibly even knowing - that there are other kinds of safety out there.
So the rhetoric being referenced there is largely implicit. Anybody who actually mentions memory safety in specific terms is already ahead of the curve.