I agree. UNIX has some brilliant ideas, like isolated functions (executables) connected by streams, but beyond that it made a lot of mistakes that we are still dealing with today.
The biggest one being a C-centric view of programming that has cost the world untold billions of dollars when dealing with untrusted data. It could have used a statically analyzable functional middleware of some kind, falling back to micro-optimization only when needed - the way that Clojure works with, say, JavaScript.
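To make the "untrusted data" point concrete, here is a minimal sketch of the class of bug being blamed on C: a fixed-size buffer filled by an unchecked copy, next to the bounds-checked discipline a C programmer must remember to apply by hand. `copy_checked` is a hypothetical helper written for this illustration (its contract happens to match BSD's strlcpy), not code from any system discussed in the thread.

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* The classic unsafe pattern: if untrusted input is longer than the
   buffer, strcpy silently writes past the end of buf - undefined
   behavior, and historically the root of countless exploits. */
void parse_unsafe(const char *untrusted) {
    char buf[16];
    strcpy(buf, untrusted);   /* no bounds check at all */
    printf("%s\n", buf);
}

/* The manual fix: copy at most n-1 bytes, always NUL-terminate, and
   return the full source length so the caller can detect truncation
   (the same contract as BSD strlcpy). Nothing in the language forces
   anyone to call this instead of strcpy - that is the complaint. */
size_t copy_checked(char *dst, const char *src, size_t n) {
    size_t len = strlen(src);
    if (n > 0) {
        size_t c = len < n - 1 ? len : n - 1;
        memcpy(dst, src, c);
        dst[c] = '\0';
    }
    return len;
}
```

A language with checked array bounds turns the first function into a runtime error (or a compile-time one) instead of silent memory corruption; in C the safe variant is purely a matter of programmer discipline.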
The other major failing that I see is overlooking ideas from ZFS, that the filesystem can act as a virtual tree over any storage medium, so UNIX wastes a lot of time on things like dependency hell, permissions, and distinguishing between file and socket streams or local and remote processors. It could have jailed each process in its own sandbox where copies of libraries are reference counted by the filesystem, running in a virtual storage and thread space. We're just now seeing the power of that with Vagrant and Docker (technically it took so long to get here due to virtualization resistance by Microsoft and Intel).
My other main gripe is more about approach than technology. UNIX (and Linux especially) stagnated decades ago due to the RTFM philosophy. The idea being that to be proficient in UNIX, one had to learn the entirety of the operating system. This goes against one of the main tenets of computer science, that we are standing on the shoulders of giants. So I really appreciate how passionately the Alto tried to make computing and programming approachable to the masses.
I keep hoping someone will release a portable lisp machine that can run other OSs under virtualization and release us from these antiquated methodologies.
> The biggest one being a C-centric view of programming that has cost the world untold billions of dollars when dealing with untrusted data.
That's pretty simple to explain: all those other options were just way too slow to get the kind of performance required out of the hardware available at the time. The difference was simply too large to be ignored.
It's all nice and good to theorize about how the past should have been, but without UNIX you probably wouldn't be writing any of this on the medium you're currently using.
It has its flaws and it is far from perfect but at the time it fit the bill nicely.
The real problem is that we are categorically unable to move on when better options are around. There is a large amount of silliness involved when it comes to making responsible choices in computing, lots of ego, lots of NIH. Those are the real problems, not that UNIX was written in C.
> all those other options were just way too slow to get the kind of performance required out of the hardware available at the time. The difference was simply too large to be ignored.
If you compare with Xerox PARC hardware, not really; the major issue was the cost of producing the type of architecture they were building.
As for safe systems programming, Burroughs was already doing it a decade earlier, on computer hardware much weaker than a PDP-11.
> The real problem is that we are categorically unable to move on when better options are around
Every 10 years or so the industry just restarts the same loop it's been stuck in since the Amiga (actually the first loop probably started with the Alto), just with different syntax and faster hardware. Software is stagnant; ALL progress is in hardware. And with the end of Moore's Law that is grinding to a halt too.
I have some hope that with the end of Moore's law in sight we will finally be able to concentrate on the software for some progress. All that we've achieved to date seems to be prettier ways to squander cycles.
And in a way that's a real pity. If Moore's law had been a doubling every 30 years rather than every 18 months, we'd have had a lot more appreciation for writing good software. As it was, the crap won out over the good stuff simply by being bailed out by Moore's law just in time for the next cycle.
But in some alternate universe hardware progress was so slow that any gains had to come from better software.
> All that we've achieved to date seems to be prettier way to squander cycles.
And stack turtles.
Whenever I see a headline about unikernels, I envision Doom running on DOS in a VM on top of Linux on top of some hardware somewhere. How many layers of (potentially leaky) abstractions are we looking at?
> The biggest one being a C-centric view of programming that has cost the world untold billions of dollars when dealing with untrusted data.
I'm not sure it cost anyone anything. I mean, a lot of the OSes were/are written in it, so if you were going to go down that path you'd also have to credit it with a rather large benefit on the other side of the ledger. It's hard to imagine, but programming wasn't always about compensating for not quite understanding how the 17 different frameworks you've downloaded from GitHub and dragged into an IDE work by just getting a faster machine. Once upon a time people had to carefully measure how much to unroll the loop, or how small a lookup table they could get away with before the errors became a problem.
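The loop-unrolling trade-off mentioned above can be sketched in a few lines of C. This is an illustrative toy, not code from any era being discussed: the unroll factor of 4 is arbitrary, and on old machines it was traded by hand against scarce instruction memory.

```c
#include <assert.h>
#include <stddef.h>

/* Straightforward sum: one add plus one loop test/branch per element. */
long sum_simple(const int *a, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* The same sum unrolled 4x: the loop overhead (increment, compare,
   branch) is paid once per four elements, at the cost of a larger
   code body plus a trailing loop for the leftover elements. */
long sum_unrolled(const int *a, size_t n) {
    long s = 0;
    size_t i = 0;
    for (; i + 4 <= n; i += 4)
        s += a[i] + a[i + 1] + a[i + 2] + a[i + 3];
    for (; i < n; i++)
        s += a[i];
    return s;
}
```

Today an optimizing compiler makes this call (and usually vectorizes on top of it); back then the programmer picked the factor by measuring.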
And that was already being done in the 60's with much better languages, but the UNIX revisionists don't like people spending time reading about Burroughs, Algol-68RS, PL/I, Mesa and many other languages provided by mainframe owners.