
It talked to real hardware, so it wasn't. There was also an effect of professors doing the same thing for many, many years, and after a while they simply stop following the latest developments in technology.



the same thing that happened with computers

at the time that was true.

the situation changed, i.e., they came up with better solutions.

could they theoretically make the old hardware work? maybe not, but at one point they believed it.

that’s the way of the world.


You are right. Hardware hardly changed. However, one can argue that the software we install into new copies of the hardware changed quite a bit. I think education and developing critical thinking skills should be the key to fixing those problems. Alas, the current trend is to try to get an education with as little effort as possible, which is destined to fail.

This. The interfaces kept changing at an accelerating rate. In the 80s people bent over backwards to not change the interface. That all went out the window.

Because the technology and hardware being debated was obsolete.

You view things perhaps too much through modern eyes.

For them, that was some amazing hardware to work with. A lot of new ideas were tested out on the 704.


I think Fred Brooks also talked about that in one of his essays, didn't he?

(I guess I'm being facetious because I think everyone should read his book of essays. The hardware platforms have changed dramatically since the 1960s, but the wetware hasn't changed a bit)


Hardware has been around a lot longer than software.

So you're essentially saying it was a stop-gap solution until hardware caught up for many use cases.

yea but it turned into a machine that nerds buy when they wanna automate something stupid.

Their research probably found people LIKED it more, not that it performed better. Computers went mainstream and functionality became secondary to seeming high tech.

It didn't last long, maybe a decade before integrated devices won out.

i wouldn't say in our heads like we made it up, but it's just an experience that you cannot replicate if you've already experienced modern computing. growing up and experiencing the changes from analog to digital, dial-up to always-on gigabit, 8-bit to 64-bit address space, 8-bit color to 32-bit, 8-bit audio to 24-bit, is all different because we have that frame of reference, so it means more to us.

also, i didn't read manuals back then. i didn't know where they were. the closest things to manuals i had were Byte magazines


The joke at the time was that if your machine was more than 6 months old, it was obsolete. So many aspects of computers were developing at a breakneck pace that it was almost impossible to keep up.

Yes it was, but the thing is that the problems to solve got harder.

Think about video streaming, 3d acceleration, sound input and output.

Then, whereas you previously had only a handful of vendors to coerce into agreement, you now have a multitude of individuals who will complain about every change.


There were a lot fewer cool sensors and peripherals. Everything was more expensive and more difficult, but people did build a lot of cool things, like hacking the Altair to play music on a radio with its RF interference.

I think that years ago the technology wasn't ready to do what they wanted it to do.

At my university, I asked about these flashy new machines which I'd never have a chance to use, and I was told they were slow. That was way back then.

The biggest factor working to their advantage was that the tech back then was much simpler and more robust. To get today's tech to work for a decade without interruption would be a very tall order. Layer upon layer of abstraction has made it impossible to know for sure that there are no edge cases that will only trigger once every 3 years or so.
