
This. The interfaces kept changing at an accelerating rate. In the 80s people bent over backwards to not change the interface. That all went out the window.



the same thing that happened with computers

They did keep iterating on the product until faster, cheaper components arrived. They had a product called GlobalView, which lasted all through the 80s and early 90s. Of course I know some people who stuck with it.

What they couldn't wrap their heads around was: no one cared anymore. The world had moved on.


It seemed all of the I/O access on 80s BASIC machines was proprietary or at least not standardized.

The main difference I see between 80s tech and today, besides the obvious increase in specifications, is the emphasis on an extremely simplified user experience. Before Microsoft and later Apple started emphasising ease of use in the mid-nineties, tech was niche and obtuse. Things were possible, but not easy to use. There were a lot of good ideas coming out, but none of them were ever refined enough that Grandma and Grandpa could just pick them up and use them. 80s tech was often a solution in search of a problem, with the generous assumption that every consumer was patient enough to sit down and learn its intricacies.

One example of this I've seen is the pay-TV service Tele1st. In 1984, this was billed as a way to rent first-run movies to consumers without having to go to the video store, by recording them onto tape overnight from an encrypted broadcast. Sounds simple, right? Only, the decryption was a PITA to set up: you needed to wire this big-ass box into your phone line and VCR, and prior to every broadcast of Tele1st you had to adjust a knob on the bottom of the unit so that the contrast matched a test pattern shown before the first movie. If you didn't do this, the decoding would fail. See for yourself: https://youtu.be/jR8YQu1HT8Y

Another example was the generous assumption that we would all be BASIC programmers. BASIC was pushed as the next essential household skill, like operating the TV or running the washer. VTech toys came with reading, spelling, math, and BASIC. But in the end, all Grandma and Grandpa want is something that works when you press the button. They don't have the time or patience to figure out BASIC for a recipe keeper; that's what the modern-day App Store is for.

One of the strongest lessons Steve Jobs taught the tech world is that ease of use makes the difference between the IBM Simon and the culture-changing impact of the iPhone.


It talked to real hardware, so it wasn't. There was also an effect of professors doing the same thing for many, many years; after a while they simply stopped following the latest developments in technology.

People tend to forget this was during a period when computers were computers and actual work was done on them. Now they're overpowered terminals.

It also meant far fewer people were inclined to own them, and they were far less accessible.

These days we still do the same thing, those of us who are inclined - but we do it with the internet, funky web services, a thousand programming languages, and so on.

And if we want to tinker... we go buy an Arduino or something.


You already had such computers, but the C64, Spectrum, BBC, Atari, Amiga and many others were following the same architectures we are going back to. It was the PC that led the way into user customization, but ever-thinner margins seem to have killed it.

Honestly, we've gone backwards in a lot of ways.

I remember, in the mid-90s, looking up stuff at a library on a VT100-ish terminal connected via some sort of serial arrangement to a larger computer.

Function keys were clearly labeled, searching was fast and worked well.

None of the heavier replacement machines ever felt as snappy or as useful.


But computers were different in the 80s when X was designed (to talk about one example). We don't use networked terminals to a mainframe anymore, the graphics stack (hardware and software) works completely differently now, user expectations for composition and visual fidelity have increased, and the security threat model has changed (in that it exists now and never used to).

There has been no one paradigm shift, merely 40 years of incremental advance leading to a different landscape with different requirements.


One of the most bizarre design decisions of the home computer era.

It makes no sense now, but there were endless experiments with physical metaphors for interacting with computers. At the time, nobody could say for sure that this wasn't workable and right around the corner.

Plus standardisation (Linux, USB, ARM, etc.), which I understand was one thing lacking in the 80s.

It's not the users that were learning, it was the (amateur) designers. (See also: the 80s DTP revolution).

In the world of hardware:

S-100 was going to be replaced by VMEbus, then the PC came along; then the ISA bus was going to be replaced by the PS/2's Micro Channel bus, but that was too much lock-in, so we got EISA instead.

Laptops had expansion slots using the PCMCIA interface; we called it "People Can't Memorize Computer Industry Acronyms."

Bubble memory was supposed to make rotating disks obsolete. WORM (write once, read many) drives were supposed to revolutionize backups.

Back in the early days of transistors, they weren't as reliable as tubes; magnetic logic used ferrite cores to do logic, and computers were built out of it.

MEMS logic, where a cantilever is etched in free space and a static voltage is used to move it between contacts, could have been amazingly fast, low-power logic, but it didn't work out.

ISDN - Integrated Services Digital Network - was going to be the ultimate connectivity, until the phone networks decided it meant "I Smell Dollars Now", and eventually it became "I Still Don't kNow".

ATM - Asynchronous Transfer Mode - was going to revolutionize the phone networks, but eventually it was replaced by IP.

In the world of software:

P-code was the first cross-platform interpreter, then Oak/Java, then .NET/Mono, now WASM (a toy sketch of the shared idea is at the end of this comment).

In lieu of actual capability-based security, we got VMs, then containers, and now we're getting WASM.

The Semantic Web was going to have all of us manually give context and labels to our web pages.

Fuzzy logic was a thing way before neural nets.

As were "expert systems", which were going to automate away most professional jobs.

Lots of things come and go; this industry is way more fashion-driven than most people realize.
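
Since the p-code/Java/.NET/WASM lineage comes up above, here's a toy sketch (Python, purely illustrative; the opcodes and their encoding are made up, not real UCSD p-code, JVM, or WASM instructions) of why the approach is portable: the program is compiled once to a small fixed instruction set, and only the little dispatch loop below has to be rewritten or JIT-compiled for each new machine.

    # Toy stack-machine interpreter illustrating the p-code/JVM/WASM idea:
    # compile once to a portable instruction set, port only this loop.
    # The opcodes are invented for illustration.
    PUSH, ADD, MUL, PRINT, HALT = range(5)

    def run(program):
        stack, pc = [], 0
        while True:
            op = program[pc]
            pc += 1
            if op == PUSH:                  # next word is an immediate operand
                stack.append(program[pc])
                pc += 1
            elif op == ADD:
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == MUL:
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            elif op == PRINT:
                print(stack.pop())
            elif op == HALT:
                return

    # (2 + 3) * 4 -> prints 20 on any host that can run the loop above
    run([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT])

Everything those systems disagree on (typing, verification, interpretation vs JIT) sits on top of the same basic split: a portable program format plus a small per-host execution engine.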


Your timing is a bit off--electronics went down in the early 90s.

It was microprocessors and then the IBM PC that dented electronics.

Note the difference in Popular Electronics from 1976 to 1982 to 1993:

https://archive.org/details/popularelectroni10unse_3

https://archive.org/details/PopularElectronics/PopularElectr...

https://archive.org/details/PopularElectronics/PopularElectr...

Electronics went from "components only" to "components and software", and that's much more difficult. You can't just "tinker" anymore once software enters the picture. There's an extra abstraction level that takes a lot more work to get over, with no reward along the way, and that unfortunately filters a lot of people out.


Now that the old equipment is at the end of its useful mechanical life, people are switching.

The manufacturing operation I'm involved with recently got rid of its last machine that used 8080-era processors and character-only green CRTs.


As Stephenson put it,

> It’s no longer acceptable for engineers to invent a wholly novel user interface for every new product, as they did in the case of the automobile, partly because it’s too expensive and partly because ordinary people can only learn so much. If the VCR had been invented a hundred years ago, it would have come with a thumbwheel to adjust the tracking and a gearshift to change between forward and reverse and a big cast-iron handle to load or to eject the cassettes.


I think that years ago the technology wasn't ready to do what they wanted it to do.
