Electronics went from "components only" to "components and software", and that's much more difficult. You can't just "tinker" anymore once software enters the picture. There's an extra abstraction level that requires a lot of work with no reward until you get over it, and that unfortunately filters a lot of people out.
Integrated circuits weren't as accessible; you had to find the data sheet, and if you weren't careful you could easily fry expensive chips. Computers, by contrast, were actually hard to damage with software.
You could rip up old electronics for discrete components and make use of them; ICs were much more difficult to salvage for reuse. And yet ICs were rapidly becoming where all the fun was.
It also meant far fewer people were inclined to own them, and they were far less accessible.
These days we still do the same thing, those of us who are inclined - but we do it with the internet, funky web services, a thousand programming languages, and so on.
And if we want to tinker... we go buy an arduino or something.
It's very instructive to look back to the 70's when electronics running a little bit of software had just come into being.
The big deal, at first, was really memory. Your alarm clock could ring at the same time reliably. If you invested in a VCR, it could record at a programmed time. If you had a synthesizer, it could store and recall exact preprogrammed patches. Pinball machines could shed weight and keep truly accurate scores instead of relying on temperamental relays and score reels. And so on, with every category of gadget getting the computerization treatment. Although not everything succeeded, there were lots of straightforward cost and quality improvements, with the main downside being that IC designs are less obviously repairable.
And then, pretty much every year afterward, the push was toward cheaper devices with more software and diminishing margins of real improvement, with the "smart" device representing an endpoint: a product often sold at a discount because its networking capability lets it collect and sell data.
What comes to mind are Rube Goldberg machines and their expression of a past era of exuberant invention, when physical machines were becoming increasingly intricate in ways not entirely practical. Our software is kind of like that.
Ugh. All this is another reminder of how close Commodore Business Machines came in the mid-to-late 80's to revolutionizing the PC industry, and how they then inexplicably dorked it up.
Not seeing a whole lot of "falling" in the current PC industry, but then again a lot of companies in the early 80's were on top of the world.
This. The interfaces kept changing at an accelerating rate. In the 80s people bent over backwards to not change the interface. That all went out the window.
I would be careful not to conflate internet-connected trash, with enough ARM cores and lines of code to make your brain melt, with a well-placed humble microcontroller. The mechanical or analogue components those microcontrollers replaced were usually far more temperamental, bulky, expensive, and bad at their job. There are exceptional environments, like nuclear power, but for most purposes integrated electronics have improved reliability when done well.
The inability of electronics companies to get to grips with software has to be one of the biggest, and saddest, stories of the past thirty years of industrial history.
There were a lot fewer cool sensors and peripherals. Everything was more expensive and more difficult, but people still built a lot of cool things, like hacking the Altair to play music on a radio via its RF interference.
Looking back, maybe the lesson is that there are some occasions when giving up on quality to make a cheaper product is an excellent idea, and 8-bit computers in the early 80s were one of those occasions.
(While, for example, electric road vehicles in the mid 80s was not.)
This happened around the same time mechanical keyboards and floppies were being phased out in favor of cheap membrane keyboards and liquid-impervious CDs. This was a progression throughout the '90s starting with the end of the IBM PCs and PS/2 machines in the mid '90s - replaced by cheap clones with even cheaper keyboards and CD-ROMs - to the introduction of the iMac in '98 with its (terrible) membrane keyboard and no floppy drive. Casual computing was pretty mainstream after that.
It was microprocessors and then the IBM PC that dented electronics.
Note the difference in Popular Electronics from 1976 to 1982 to 1993:
https://archive.org/details/popularelectroni10unse_3
https://archive.org/details/PopularElectronics/PopularElectr...
https://archive.org/details/PopularElectronics/PopularElectr...