
Yes... with a lot of asterisks.

Ask anyone who worked in retail for a long time. When retailers replaced their point-of-sale and catalog search systems in the early 2000s, everyone's productivity plummeted.

They replaced old DOS/Norton-style keyboard-only systems, with their legion of shortcuts and archaic codes, with web-based touch-screen or mouse-driven UIs, modern UX, and so on.

The new technology was obviously more modern and allowed faster training, and it's hard to imagine any world where that upgrade doesn't happen. But the old system was faster, and allowed experienced users to do more.

Now, if you're talking about a Blockbuster Video, maybe it's the right choice.

But if you're talking about extremely specialized semiconductor manufacturing, maybe you stick with the highly optimized system that works. Some things have changed a lot since 1987. Others haven't.




Honestly, we've gone backwards in a lot of ways.

I remember, in the mid-90s, looking up stuff at a library on a VT100-ish terminal connected via some sort of serial arrangement to a larger computer.

Function keys were clearly labeled, searching was fast and worked well.

None of the heavier replacement machines ever felt as snappy or as useful.


This is so true. I worked as a supermarket checkout operator at a large Australian supermarket chain for a few years, and the PoS systems were redone while I was there. Before, we had old Fujitsu machines running an old version of OS/2 with mechanical keyboards. After the overhaul, Windows XP Embedded machines with touch screens. The touch screens were a massive step backwards, especially when it came to punching in produce codes (many of which I can still remember today).

That, and the 'express' lanes, being physically smaller, used a different, smaller model of scanner rather than the larger one built into the counter. The smaller scanner was removable, which was helpful when scanning large crates of soft drinks and other heavy items (I would pass the scanner to the customer so they didn't have to lift those out of the trolley), but it was slower and laggier for everything else and had a smaller "field of view".


I wonder if it ever had a positive impact on sales. It seemed like overkill, even back then, and even more so today. The idea was nice, but imagine the additional costs for hardware development and for software. It seems so silly nowadays.

20 years ago, yes.

Today it is a terrible monster that you use because of the lack of alternatives, not because it is fast or comfortable to use.


My first job required the use of an AS/400 for data entry. I hated it at first, but really grew to appreciate it for what it did. That single machine ran an entire manufacturing site. When we were forced to use a Windows front-end to enter data, it was an order of magnitude slower; it was that cumbersome and inefficient. Other front-ends, doing real-time visualisation of the plant operations, were actually useful. But the AS/400 fills an important niche, and it's one I don't think modern systems do a good job of replacing.

Yes, but they're missing the take-the-radio-apart, take-the-computer-apart, or fit-the-whole-instruction-set-in-your-head advantage that earlier eras were granted.

So to speak, yes. There was once a competitive marketplace with all sorts of terminals featuring totally different emulations.

Yep, especially as IT moved downmarket from mainframes run by government, military and large corporations to the average Joe, who can reboot, reinstall or just buy a new one.

No, it was my go-to for years when I worked in systems sales/deployments - affordable (by contemporary standards), predictable, reliable. I went through some Linux struggles with it, but that was true of all hardware at the time. It was a lot less painful than trying to get X-Windows running on higher resolution VGA modes.

I think so. Compaq just proved that it could be done without being bankrupted by an IBM lawsuit. IBM was the 800 lb gorilla, like Microsoft was in the '90s or Apple/Google are today. Being sued into oblivion was the real fear, not that it was technically a difficult job.

Some of the other companies I mentioned were the ones that followed Compaq's lead and did their own clean room versions which they then sold to everyone else who came flooding into the market.

As an aside, this was actually a problem for the first few years: take multiple independent clean-room implementations, bake in some silly things that the manufacturers did with them (gotta customize, right?), and combine it with the bad programming practices of the day (e.g. programs written in assembly jumping into undocumented portions of the BIOS because, hey, it worked on the IBM and saved a few bytes/cycles or did something useful). It was similar to how well web standards worked not too many years ago, with multiple browser implementations doing/interpreting things slightly differently and developers doing things they shouldn't because they could... and that was with a public spec designed to be copied. Eventually things get worked out, but both had a rough start.


Productivity vampires? Are those really the legacy systems you refer to, or just yesteryear's fad starting to show its true colors?

Some time ago I worked on a modern travel reservation system frontend, meant to replace an aging 5250-based one. When I finally spent a day with the customer's end users to validate some things and learn how they worked (those things never shine through the specs), it was like a punch to the stomach. They were happy to see me and happy about the system replacement, but they worked with such lightning speed in the old system (where all 24 F-keys were bound and known by heart) that they could keep up a conversation with the client while constantly pulling out the information they needed.

Just reaching for the mouse in that situation would have incurred noticeable latency for the customer. A snowball's chance in the proverbial hell would have been much better odds than my web-based system had of improving their productivity. I didn't stay on that project until the end (for other reasons), so I don't know how it turned out, but it was my first real encounter with "legacy" systems, and it humbled me like nothing since.

There are still high costs to maintaining these systems, and their batch-orientedness makes a modern always-online mode very difficult. But their end-user productivity should be very hard to match, especially since they were built with kilobit lines and megahertz computers in mind.
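
For what it's worth, some of that F-key speed can be clawed back even in a web frontend. Here's a minimal sketch (TypeScript; the action names are made-up stand-ins, not any real API) that binds function keys globally, roughly the way a 5250 screen binds them:

    // Hypothetical sketch: map function keys to application actions,
    // terminal-style. The actions below are stubs for illustration.
    const fkeyActions: Record<string, () => void> = {
      F1: () => showHelp(),
      F2: () => searchBookings(),
      F3: () => clearForm(),
    };

    document.addEventListener("keydown", (e) => {
      const action = fkeyActions[e.key]; // e.key is "F1".."F24"
      if (action) {
        e.preventDefault(); // suppress the browser default where allowed
        action();
      }
    });

    // Stand-ins for real application code.
    function showHelp() { console.log("help"); }
    function searchBookings() { console.log("search"); }
    function clearForm() { console.log("clear"); }

It doesn't fix round-trip latency, and the browser still reserves a few keys (F11 fullscreen, for instance), but at least it keeps hands off the mouse.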


No, you don't go from '60s-era tech to '90s-era tech in phones without a forklift. Moving from electromechanical switching to stored program control is a sea change.

It's like going from Hollerith card-based tabulators to a System/360.


Yes, but that wasn't design related. They had desirable technology and software for the time and they marketed aggressively.

No, they were already slashed back in the early 90s as systems computerized.

So, sort of, but let's not look at the past too wistfully. I also remember when moderately long text had to be split across files (even multiple floppies if you were writing a book). I remember having to choose which languages my OS installation was going to be able to display. Farther back, I remember not having the spare cash for the expansion card necessary to display lowercase letters, and typing in program listings out of books because that's how some software was distributed (Apple II). Our current generation of PCs is probably within a factor of two of the minimum spec necessary for robust machine translation and natural language interfaces. I enjoy using the old hardware, but it really doesn't do most of the jobs we have now.

Computers jumped the shark in 1998. I remember dual-booting NT4 and BeOS on a PII 300 MHz with 64 MB of RAM. Connected to a 256 kbps SDSL modem, it was the best computing experience I've ever had. You could do everything you can on a modern machine. Indeed, even more, because the modern versions of software like OneNote and Word are neutered compared to the '97 versions.

It feels like all of this effort has been spent making computer interfaces worse. The only improvement I can point to between MFC and a modern web UI is DPI scalability. Besides that, there are tons of regressions, from keyboard accessibility to consistency in look, feel, and operation.


Not exactly.

Prior to the 1980s, most computing was on highly proprietary mainframes. The industry was dominated by IBM. Consumers (companies, not individuals) could not _own_ their software. It was all product licensed from IBM and the big vendors. Many vendors leased, rather than sold, their machines as well. It was very closed, and innovation cost the inventor dearly and made the vendor a fortune.

The 1980s saw the maturation of the mid-size market, the revolution of the new PC world, an opening up of the hardware and software world, and a hugely exploding new user base. The entire paradigm for the computer market changed. Note, though, the tendency of vendors (ahem, Apple) to return to this world.

Anyone who wants the old days has not been studying history.


Agreed. I supported a DOS-based touchscreen point-of-sale (POS) system built on technology from decades ago, and it was fast AF. You only got 256 colors and the fonts didn't look that great, but it was a speed demon.

No :D. Good thing we had many products, not all using that system :D
