It talked to real hardware, so it wasn't. There was also the effect of professors doing the same thing for many, many years: after a while they simply stop following the latest developments in technology.
You are right. Hardware hardly changed. However, one can argue that the software we install on new copies of that hardware has changed quite a bit. I think education and developing critical thinking skills should be the key to fixing those problems. Alas, the current trend is to find a way to get an education with as little effort as possible, which is destined to fail.
This. The interfaces kept changing at an accelerating rate. In the 80s people bent over backwards to not change the interface. That all went out the window.
I think Fred Brooks also talked about that in one of his essays, didn't he?
(I guess I'm being facetious, because I think everyone should read his book of essays. The hardware platforms have changed dramatically since the 1960s, but the wetware hasn't changed a bit.)
Their research probably found people LIKED it more, not that it performed better. Computers went mainstream and functionality became secondary to seeming high tech.
I wouldn't say it's in our heads, like we made it up; it's just an experience you cannot replicate if you've already experienced modern computing. Growing up through the changes from analog to digital, dial-up to always-on gigabit, 8-bit to 64-bit address spaces, 8-bit to 32-bit color, 8-bit to 24-bit audio, is different because we have that frame of reference, so it means more to us.
Also, I didn't read manuals back then; I didn't know where they were. The closest things to manuals I had were Byte magazines.
The joke at the time was that if your machine was more than six months old, it was obsolete. So many aspects of computers were developing at such a breakneck pace that it was almost impossible to keep up.
Yes, it was, but the thing is that the problems to solve got harder.
Think about video streaming, 3D acceleration, sound input and output.
And whereas you previously had only a handful of vendors to coerce into agreement, now you have a multitude of individuals who will complain about every change.
There were a lot fewer cool sensors and peripherals. Everything was more expensive and more difficult, but people still built a lot of cool things, like hacking the Altair to play music on a radio via its RF interference.
The biggest factor working to their advantage was that the tech back then was much simpler and more robust. Getting today's tech to work for a decade without interruption would be a very tall order: layer upon layer of abstraction has made it impossible to know for sure that there are no edge cases that will only trigger once every three years or so.