Meanwhile, on a system from the 80s, you wouldn't have a word processor that could open a modern document, you wouldn't have support for any of the filesystems modern OSes use (well, maybe FAT?), and you wouldn't have USB anyway. Heck, you'd need either a serial port or a floppy drive on a modern computer to even have a chance of transferring data between them.
In those days, you could pretty much never take software to different hardware, and even a different minor OS revision was asking for trouble. What you're asking for isn't something anyone expected back then. Different hardware was _really_ different.
It's crazy to me that a tech forum poster would consider the incredibly innovative years of 1982-1992 as nothing because his preferred OS didn't exist yet.
1982: Commodore 64
1983: Apple //e
1984: Macintosh
1985: Amiga
1986: 386-based PCs
1987: HyperCard and Acorn Archimedes
1988: NeXT cube
1989: 486 released; Deep Thought defeats its first master; Tim Berners-Lee proposes the World Wide Web
1990: IBM POWER processor, the first WorldWideWeb browser, EFF founded
1991: PGP, AM386, Python, Gopher
and that ignores the incredible BBS and shareware scene that blossomed at that time. A lot was going on in the mid to late 80s in the PC world.
I got really into computers in the mid-to-late 90s, by which point we were mostly down to Wintel, with Apple hanging on by a thread.
What I remember from that era is that nothing was compatible with anything else. It took a lot of work to interoperate between two PCs, let alone cross the gap between OSes. So for a long time, I have kind of taken the current world of few OSes that are highly interoperable as being a great thing: you can build your own Linux machine and still do real work with people on Windows and Mac, etc.
But the more I learn about computing in the 80s and early 90s, the more I’m impressed by the variety and diversity of ideas that came out of that era. I see now that today’s standardization has a real cost, which is that we don’t really see new ideas or new paradigms.
For the HN crowd, especially those who are older than me and can remember the earlier era of computing, what do you think about that trade off and where we’ve ended up today?
Are we better off living in a world where all computers can work together with minimal fuss? Or would it be better if we still had a wide range of vendors taking significantly different approaches and innovating at a much faster pace - albeit in incompatible ways?
Not sure I agree completely with that. These were simpler OSs for a simpler time. A lot of what I do today I accomplished in the 80s with an Apple II, a phone line, and a modem. I read emails (we didn’t call them that back then), wrote software, built databases, discussed esoteric topics in BBSs… Even met my first wife online over a Minitel-like system.
The main difference I see between 80s tech and today, besides the obvious increase in specifications, is the emphasis on an extremely simplified user experience. Before Microsoft and later Apple started emphasising ease of use in the mid-nineties, tech was niche and obtuse. Things were possible, but not easy. There were a lot of good ideas coming out, but none of them were ever refined enough that Grandma and Grandpa could just pick them up and use them. 80s tech was often a solution in search of a problem, with the generous assumption that every consumer was patient enough to sit down and learn its intricacies.
An example of this I've seen was the pay-TV service Tele1st. In 1984, this was billed as a way to rent first-run movies without a trip to the video store, by recording them onto tape overnight from a scrambled broadcast. Sounds simple, right? Only, the descrambler was a PITA to set up: you needed to wire this big-ass box into your phone line and VCR, and prior to every Tele1st broadcast you had to adjust a knob on the bottom of the unit so that the contrast matched a test pattern shown before the first movie. If you didn't do this, the decoding would fail. See for yourself: https://youtu.be/jR8YQu1HT8Y
Another example was the generous assumption that we would all become BASIC programmers. BASIC was pushed as the next essential household skill, like operating the TV or running the washer. VTech toys came with reading, spelling, math, and BASIC. But in the end, all Grandma and Grandpa want is something that works when you press the button. They don't have the time or patience to figure out BASIC for a recipe keeper; that's what the modern-day App Store is for.
One of the strongest lessons Steve Jobs taught the tech world is that ease of use makes the difference between the IBM Simon and the culture-changing impact of the iPhone.
There were better justifications back then, though. Stuff ran on completely different hardware, and layers of abstraction weren't really workable. Some BASIC dialects were more or less compatible, but most software was machine code, and BASIC was usually just the machine's terminal and bootloader of sorts.
One could not reasonably expect the same software to work unchanged on a MOS 6502-based computer like the VIC or the Apple ][ and on a Z80-based computer like the TRS-80. And there were many more differences at the hardware and ROM-kernel level that made cross-platform software largely unworkable.
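To make that concrete, here's a hypothetical sketch in C (period software would really have been assembly, and the TARGET_* macros and put_char_at are invented for illustration, though the screen addresses are the real ones). Most 8-bit software wrote straight to video memory, and both the address and the character encoding were machine-specific:

    /* Sketch only, not period code: direct writes to video RAM, where
     * both the base address and the character encoding differ per machine. */
    #if defined(TARGET_C64)
      #define SCREEN_BASE 0x0400   /* C64 default text screen, PETSCII screen codes */
    #elif defined(TARGET_TRS80)
      #define SCREEN_BASE 0x3C00   /* TRS-80 Model I video RAM, its own character set */
    #else
      #error "pick a target machine"
    #endif

    void put_char_at(unsigned offset, unsigned char c) {
        volatile unsigned char *screen = (volatile unsigned char *)SCREEN_BASE;
        screen[offset] = c;   /* 'c' itself would need per-machine re-encoding */
    }

And that only covers the screen; keyboard, disk, sound, and the ROM routines for all of them were just as different.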
As part of my job I frequently have to work on very old PCs that are part of manufacturing tools. I started working in this area in the early 90s, so I have both nostalgia for the old days and excitement about the new stuff we're coming up with.
Things I don't miss:
- Managing IRQs on the ISA bus, or more generally messing with the BIOS
- Dedicated keyboard/mouse connectors
- Physical serial and parallel ports
- Large plug-in cards
- Incompatible video standards and the monitors tied to them (Hercules, CGA, VGA, ...)
Nowadays most of our external equipment (motion controllers, sensors, barcode readers, etc.) is either USB or Ethernet and I think that transition happened maybe 10 or 15 years ago? It's just so much easier now with fast-enough external buses.
I think part of the problem with a 100-year scheme is that you only really get economies of scale with highly integrated memory and other support chipsets, with physical standards that change as manufacturing capabilities get better.
So at the two extremes there's the hobbyist track (RPi/Arduino) and there's all the amazing tech crammed into our phones for a ridiculously low price.
Just like in the early 80s, I cannot install an OS on my game console, watch, or phone. Just like in the early 80s, I can install an OS on my PC. There has always been open and locked hardware, even in the blessed 80s. And no, the point doesn't rest on smartphones not existing yet: there were minicomputers, mainframes, and various embedded systems from calculators to microwave ovens, etc.
Though there were more platforms overall back then, the hardware for any given platform was a lot more uniform than any modern PC's. OSs barely did anything, where they existed at all, which also helped. Finally, codebases were a lot smaller and everything was single-threaded, so there was generally just less to go wrong.
Tinkering with hardware back then was much easier because device frequencies were pretty low (like 1-8 MHz), and the interfaces and system bus were much easier to work with compared to today's advanced protocols (USB, Ethernet, ...).
Back in the late 90s I worked on a system that shipped on 4 different chip architectures and 4 different (Unix-based) operating systems, dealt with different endianness, and was more reliable and easier to understand. And it was more responsive to users on 1990s hardware than stuff is today. :shrug:
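For anyone who hasn't had to fight that particular battle: the endianness part means never reading or writing multi-byte integers as raw memory. A minimal sketch in C of the kind of byte-order-independent serialization such a codebase needs everywhere it touches binary data (the function names are mine, not from that system):

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Write a 32-bit value in a fixed (big-endian) wire order,
     * regardless of the host CPU's native byte order. */
    static void put_u32_be(uint8_t *buf, uint32_t v) {
        buf[0] = (uint8_t)(v >> 24);
        buf[1] = (uint8_t)(v >> 16);
        buf[2] = (uint8_t)(v >> 8);
        buf[3] = (uint8_t)v;
    }

    /* Read it back the same way on any architecture. */
    static uint32_t get_u32_be(const uint8_t *buf) {
        return ((uint32_t)buf[0] << 24) | ((uint32_t)buf[1] << 16) |
               ((uint32_t)buf[2] << 8)  |  (uint32_t)buf[3];
    }

    int main(void) {
        uint8_t wire[4];
        put_u32_be(wire, 0xDEADBEEF);                  /* same four bytes on SPARC and x86 alike */
        printf("%08" PRIX32 "\n", get_u32_be(wire));   /* prints DEADBEEF on any host */
        return 0;
    }

Do that at every boundary (files, sockets, shared buffers) and the four architectures stop mattering; skip it once and you get the classic byte-swapped bug on half your platforms.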