
I can remember the early 90s. I think the feeling then was very similar.

We had a number of technologies that were cool, but didn't work right. Think WebTV, PDAs, speech recognition, OCR, virtual reality. We also had continual evolution in the price/performance of PCs. I remember all the IBM clones - seemingly a new vendor was popping up every couple months, and they all had dozens of models available. Even Apple was drowning in continuous, gradual improvements - this is when they numbered all of their products. Remember the Centris 660AV, Powerbook 520/c, or Power Macintosh 9500?

I think we're seeing the same effect now. Existing technologies are being gradually refined. The price of cloud computing is falling through the floor. Those industries where the building-blocks are open (notably web startups & mobile apps) face a deluge of small-time competitors. Those where it's closed (search, mobile OSes, hardware) face a number of small incremental improvements. And on the horizon you have a number of exciting technologies that are far away from commercialization (SpaceX, self-driving cars, Bitcoin, 3D printing, wearables, VR...still).

I bet that the next big thing is being worked on in someone's garage or living room right now, and it's probably nothing we've heard of. The WWW came out of nowhere in 1995. Except it didn't - it built on TCP/IP (1973), DNS (1983), the personal computer (1975), and graphical user interfaces (1984).

If you read Kuhn, he describes the history of science as long periods of gradual refinement of theories ("normal science"), punctuated by overthrows of the established scientific consensus ("paradigm shifts"). Tech is much the same. It's been a long time since we had a paradigm shift, but that doesn't mean that it'll never happen again. Rather, it may mean that there's fertile ground for one to happen now.

The one caution is that very often, paradigm shifts don't look like paradigm shifts to people inside the old order. They look like trivial toys, because they grow out of small experiments within the existing paradigm. Scientifically, it usually takes a whole generation for new paradigms to take root, because the old guard of existing scientists never considers them important - they have to die off before the new paradigm replaces the old. The same thing happened with webapps - old-school mainframe and desktop programmers considered them trivial toys - and it may be happening with mobile. So, something for all the folks on HN who say there's been no technological progress - has there actually been none, or are we just the dinosaurs who're out of touch with what kids these days are using?




It's kind of sad that we are so stuck in the current paradigms. The 90s were definitely much more exciting in terms of innovation.

Sure, but that's not technological innovation, just gradual evolution. Things were moving a lot faster in the nineties and the aughts. I think most people who remember the advances in computing in the nineties feel that we're in a technological slump. We hoped that with more people going into computing things would speed up, but that hasn't happened. That's not necessarily anyone's fault, though. Maybe the low-hanging fruit had already been picked.

Could just be where we are in the technology cycle. Using Carlota Perez terminology, in 2010 we were in the midst of Synergy for web technology and Frenzy for mobile. Now both web & mobile are nearing Maturity, and whatever the next big technology cycle is remains in Irruption.

If you looked at PCs from 1993-2003 you would've had a similar view. PCs from 1983-1993 underwent dramatic progress: you went from 16-color TV outputs, 64K of RAM, 8-bit CPUs, floppies, command-line interfaces, and BASIC to 24-bit color, 3D computer graphics, GUIs, 16MB of RAM, 200+ MB hard disks, 32-bit CPUs, IDEs, desktop publishing, CD-ROMs, modems and Internet access, even speech recognition and text-to-speech on some Apple machines. From 1993-2003, you had incremental progress: Microsoft won, Windows 3.1 became Win95 and then eventually Win2k, CPUs got faster, RAM and disks expanded, broadband happened, but what we used the computer for didn't change much, except for the advent of the Internet. The Internet itself was supposed to revolutionize computing, but the dot-com bust happened in 2001 and in 2003 it was still pretty much a toy. And other much-hyped developments like WebTV, VR, voice recognition, and AI had fallen flat.

There are plenty of toys that are still in Irruption now. Cryptocurrency was supposed to change the world; the bubble burst in 2018, but maybe we'll see it come back in 2020 with DeFi the way the Internet did in 2005 with social media. Drones are literal toys right now. So are VR and AR. There's been a lot of progress in computing for kids with things like Scratch, Roblox, or Minecraft.


I think there are two distinct phenomena happening:

1) Rate of change in fundamental technology.

2) Feature churn.

I don't think #1 is actually changing all that fast, compared to previous decades. Consider the period 1991-2001 and the period 2011-2021. I think technology change in the former was much, much faster than in the latter. A typical PC went from {486, DOS, 1-4MB RAM} to {Pentium 4, WinXP, 256MB-1GB RAM}. Linux had only just launched in 1991. ~Nobody had a cellphone in 1991. ~Nobody was on the internet in 1991.

But look at 2011-2021, and is anything really that different? Computers are faster, but it's nothing like the growth rate of the 90s. iPhones, Bitcoin, GPUs, broadband, cloud, Minecraft ... we had all these in 2011. They're just incrementally better now.

Fundamental tech is still incrementing but revolutions are few and far between.

#2, on the other hand, is in its golden age. And it's all for the wrong reasons, largely articulated by others on this thread. My addition: our ability to create new software has outpaced our ability to think of new ideas that are beneficial for users.


That's a good point, but if you believe that the rate of technological advancement is still increasing as it has been throughout modern history, you would expect each successive paradigm shift to come more quickly than the last (aside from some noise, obviously). Of course, the shift may not come in a field we currently think of as being involved with personal computing. (Much like we wouldn't have thought of cellphones as such in the 90s.)

Are things just not as fascinating now? Is there anything similarly cutting edge today? I know part of it is that I was much younger, but I also remember that most of the people around were way older and seemed to share in the wonder and amazement at those GB+ hard drives and the thought of 32 MB of RAM.

I guess there's just not anything new that's advancing so rapidly? Like my computer is 13 years old and is still overpowered for 99% of tasks. And they are still selling brand new computers with far lower specs. Imagine in 1997 being satisfied as a developer with a computer from 1984!

I also have other interests and it feels like they have plateaued similarly. I guess I'm still getting some dopamine from solar and battery tech - price drops at least - and some neatness around microcontrollers and IoT, but even food has stopped seeming innovative. It used to be worth a day trip into the city to eat at new exotic offerings, and now every small town has mostly the same stuff and there's nothing really new in the city either.


Potentially inflammatory question, but I'm asking with sincere curiosity. It feels to me that the transformative era of tech is over, and we are now in "maintenance mode," for lack of a better term. The '80s and '90s revolutionized computing, and the web matured into a real platform during the '00s and '10s. The big players grew from one- or two-person projects in literal or proverbial garages into firmly established global juggernauts. (In some cases this began in the '70s.) Whole industries—publishing, entertainment, transportation, retail, communication—were fundamentally changed, and whole new ways of connecting people were developed. New ideas and technologies were sprouting up everywhere, and they were exciting and solved obvious problems. It wasn't all roses of course, and there's much discussion to be had on the externalities, but it was undeniably revolutionary.

Now it seems, to me at least, that the party is more or less over. I'm wondering if anyone else feels this way or if I've simply become jaded and cynical. (I am prone to jadedness and cynicism so this is entirely possible.) Things like "the metaverse" and the bizarrely-named "web3" feel like desperate attempts to keep finding the next big thing in a field that has simply run out of big things to find. I can't identify a single small company working on something that feels like it'll be huge—the next Google, Netflix, or what have you. The spirit of the early days seems impossible to find now.

This is just my personal experience. I would love to hear both from people who feel this way and from people who disagree.


> Shifts in computing paradigms are incredibly rare.

They've only happened every decade so far: 1960 (IC), 1970 (DARPA), 1980 (PC), 1990 (GUI), 2000 (Internet), 2010 (smartphone).


In my eyes this kind of feels like another sign that things have stagnated in tech (on the fundamental innovations front, not the 'people are doing things' front). There was a massive explosion of things happening a decade ago, but today not so much, even to the point that you can get away with using years-old tools without really much disadvantage.

We seem to be in an interesting time where everyone is casting around looking for the next "big" idea, regardless of whether it works, and as a result the only way to do useful "small" ideas that work is to fund them yourself or get ordinary, non-important people to help fund them (i.e. crowdfunding or ICOs). All the attention is on flying cars, self-driving cars, killer robots, alternative currencies, artificial intelligence, 600 mph vacuum transportation, and missions to Mars.

The last time I can think of when the tech landscape looked like this was the early 90s, when everybody was hung up on artificial intelligence, pocket computing, handwriting recognition, voice recognition, WebTV, 3D graphics, and virtual reality. We ended up getting many of those, 15 years later, but the real huge story of the decade was the WWW, which was really unimpressive when it first came out (I remember comparing it unfavorably to Gopher in 1993; Gopher at least was semi-organized).

The WWW overshadowed everything else because the problem it was solving - which many people didn't know they had - was more universal than the problems solved by any other technologies that had just entered the market, and its solution was just barely viable enough to solve that problem. Meanwhile, the tech for many of the other much hotter problems of the time was 15-20 years out; they couldn't actually be solved by the processing power available in 1992. I wonder if there's a similar overlooked-but-universal problem that someone in a garage is working on now, that'll spark a new wave similar to the dot-com boom.


Tom Peters (In Search of Excellence, 1982, with co-author R. Waterman) mildly complained in his subsequent late-80s and early-90s books that there seemed to be no next big thing on the horizon. Computers, office networking, and home computers had become embedded in society. What was next? BOOM! We're living in it.

Another love-to-hate techie (Micro$oft founder) said in "Business at the Speed of Thought" that we tend to overplay expectations of innovations for the first couple years, but underplay their impact over the following decade.

Screechy dial-up took a long time to become gigabit transfers of the entirety of human thought. Now augmented by "AI."

What's next? The thousands of patents long sequestered away from the world in the name of national security. Innovations that've been perfected for decades and used _against_ the greater good.


I'm pretty sure those days are gone forever. Even if there's a slowdown of CPU power, there's still quite a lot left over for the sloppy coders of the world. Especially when we consider a lot of software isn't CPU bound, but storage bound, and even the most commodity SSD blows away the spinning disk. Oh, we're also all 64-bit now, and computers shipping today come with 8 GB of RAM.

We recently played the constraint game with the mobile/tablet space. It was fun while it lasted, but now with quad-cores and 1 to 2 GB of RAM standard, it's over as well. Tablets have become the new laptops. Phones have become the new tablets. It's boring.

I think the days you pine for are forever gone, at least in the general computing space. I find that once something new comes around, it's naturally constrained, and if it allows people some level of freedom and creativity, then the weirdo early adopters will rush in, and in that group there will always be a handful of superstars. This minority makes the big crazy strides or the crazily efficient game and then goes into more respectable work.

We also saw this with the early web, which was so much more innovative and experimental than the "social marketing" mode we've all agreed works best. That matured quickly as well. Look how the web browser is pretty much a platform, if not a quasi-OS, in itself. We keep reinventing the big bloaty OS and big bloaty applications for reasons that make sense; because people want big bloaty toys and the bells and whistles they offer.

So where's the next new constrained system young hackers are going to blow our minds with? Maybe 3D printing. Maybe drones. Maybe VR. Maybe automated and electric cars. But even those have a certain level of maturity already. We might not know until it actually happens. Who saw TBL and the www coming? I suspect very few.

I personally hope to wake up one day to a new Jobs/Woz combo offering me something straight out of sci-fi, like a home robot with useful arms and enough AI to make use of it or a lucid dreaming machine that actually works or a nootropic that makes us all near geniuses. Come on guys, stop writing bejeweled clones and financial apps and blow our minds again. I suspect I still have a few mind blowing events left in my lifetime (almost 40 now).


I do have times when I look at my phone and think about the first "computer" I touched. I'm still amazed sometimes at how tech has changed since I was 6-7 years old (talking almost 40 years).

I'm no fan of lab-growing anything, or of banning people from raising animals or having food plots, because those that make the food make the rules. That being said... I wouldn't be surprised to see, in 2-3 decades, something taking off in that regard. All it takes is one stroke of genius and a ton of elbow grease to change it all.


For those who weren't alive at the time it's probably hard to understand what an inflection point Windows 95 was.

Both the number of homes with computers in them and what those computers were able to do skyrocketed.

The software business went from being targeted primarily at businesses, schools, and hobbyists to being targeted at mainstream consumers, in a once-in-a-lifetime opportunity to establish a brand name with people who had literally never purchased anything in this class before.

It was the most dramatic shift certainly that I've seen in my life with the adoption of the internet being second and the adoption of mobile phones being third.

It's been more than a decade since we've seen anything like those shifts. The 2010s feel a lot like the 1980s to me. Lots of progress, lots of it incremental, but no inflection point. I do wonder if AI will be the next big paradigm shift like Windows 95, the internet, and the iPhone.


Yes, but between 1995 and 2005 was a decade when everyone seemed to think that "tomorrow we'll wake up in a different, better world". And they had inventions to back up that feeling. I don't know anyone who feels that way now, and I don't see the inventions to back up such a feeling if it were to be felt. And the belief at the time was that we would see growth in inventiveness: as tech improved, it would enable more tech.

90s were a time of ideas in tech. Not all of them were good, but it does seem like there were a lot more of them. Now it just feels like re-arranging the same furniture in the same room over and over.

Hmm, I don't agree. We're far away from the frantic hardware and software progress of the 80s and 90s. Especially in software development it feels like we've been running in circles (but very fast!) since the early 2000s, and things that took just a few months or at most 2-3 years to mature in the 80s or 90s take a decade or more now.

Leading up to the New Year's, I was watching the CNN history documentary on Netflix. It was the one executive produced by Tom Hanks, and covers the 70s, 80s, 90s, and 2000s. I went backwards and have just finished the 80s.

I know the history of computing technology pretty well... but it was striking to see it laid out like that, in the context of changes with art, culture, domestic and international politics.

Looking back on that, in the heyday when Moore's Law was so visible, before the PC revolution gave birth to the Internet revolution, new computing devices were changing so fast. You could go into a store a month later and see a lot of things that were new. I remember the Columbus, OH Micro Center. There was a tiny used-computer store nearby. Both were packed. That business worked because there was so much turnover.

You don't really see that nowadays. The most exciting thing that comes out these days is the Threadripper, but for most consumers, it isn't necessary anymore.

Most of the rapid changes are taking place where people can't really see it anymore (unless you are an insider) -- in the cloud, with AI.

Even though changes are still happening rapidly, I think the _visible_ changes have petered out for the consumer.

What I am saying is that, even if Fry's were able to compete on price with internet retailers, something more fundamental has shifted.

Now maybe I am just saying this from the perspective of the US. I hear that in China, the development and competitiveness of smartphones is a lot greater than in the US. Apple has a difficult time competing in China because there are many stylish, hot smartphones within that domestic market. If so, then perhaps it isn't that changes are no longer visible ... but that the US is no longer the focal point of consumer product innovation that it once was.


I wonder whether we'll be seeing a return back to basics in terms of the internet/applications soon. I find it amusing how programs from the 90s can have much better usability, more features and better reliability than something released today in many cases.

The optimist in me says that it will happen as almost everything popular becomes enshittified and unusable, so the "old" alternatives come back because they are just that much better, so a new cycle of innovation starts again. The pessimist in me says that it won't happen and in 5 years, we'll be using our computers as slightly larger phones and everything replaced with hyper-monetised, LLM-generated trash.

