
I mean it's not like most consumer hardware has to be on the latest nodes. The current hardware is more than enough for any consumer applications I can think of.

Maybe we just need to reuse more hardware. Standardize and commoditize replacement parts for mobile devices, like Framework and Fairphone do. Reduce the amount of IoT crap in toasters and the like. Use multiseat instead of thin clients. Upcycle old computers.




> I mean it's not like most consumer hardware has to be on the latest nodes. The current hardware is more than enough for any consumer applications I can think of.

Electron says, "Hold my beer!"

I would love to see lower software requirements, more long-lived systems, etc.

Meanwhile, Microsoft is obsoleting... basically the last decade+ of hardware for Windows 11. It's an uphill fight.


Meh, I keep multiple Electron apps, Node processes, and Chrome windows and tabs open at the same time, and they're really not at all CPU intensive (except for Node running tsc builds or tests). It's the RAM usage that's annoying, and even that isn't too terrible.

Edit: of course, I'm running swaywm on Linux, so my environment is generally pretty light compared to Windows as it is.
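
For what it's worth, if you want to see where a Node or Electron process's memory actually goes, Node's built-in process.memoryUsage() breaks it down. A minimal sketch (the MiB formatting helper is just mine):

    // Print a breakdown of this Node/Electron process's own memory use.
    // process.memoryUsage() is a standard Node API; labels/formatting are mine.
    const mib = (bytes: number) => (bytes / 1024 / 1024).toFixed(1) + " MiB";
    const usage = process.memoryUsage();

    console.log("rss:      ", mib(usage.rss));       // total resident memory
    console.log("heapUsed: ", mib(usage.heapUsed));  // live JS objects
    console.log("heapTotal:", mib(usage.heapTotal)); // memory reserved for the V8 heap
    console.log("external: ", mib(usage.external));  // buffers allocated outside V8

In an Electron app you'd run this per process (main and each renderer), since each renderer is its own Chromium process with its own heap.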


I generally try to use ARM single-board computers as much as possible. They really struggle with running a bunch of Electron stuff at the same time, and while you can reasonably use some modern websites on them, it's amazing just how much absolute crap you can get rid of with NoScript/Ghostery/etc. and still have a perfectly functional website.

My massive, powerful, "Do absolutely everything!" computer in college was a dual Pentium III 866 with 768MB RAM. And a hex-core, 2GHz ARM box with 4GB RAM can't do half as much, because of just how heavily bloated software has become. I still write code, browse the web, chat with people... and I need radically more resources, because that's what software development has decided is easy. Meanwhile, something like HexChat for IRC just... uses no resources and works as wonderfully as ever.


I'm guessing RAM is your limiting factor. I've got about 20 tabs open across two Chrome windows (separate profiles) and I'm using less than 3GB for my entire system. CPU usage is basically nil, since I'm not doing much at the moment but typing.

Throwing Slack / Spotify / VS Code / Steam / whatever else on top of that could easily bump it up several more GB. VS Code is pretty good, up until you start adding language servers. The quality of the rest is really hit or miss.

Edit: Out of curiosity, I opened Steam, and it immediately took up a little bit of RAM; then by the time it finished "loading" it had ballooned up to a gig, presumably because of all the fancy animated images and video players and whatnot the storefront needs. I could definitely do without those things, though I imagine they do correlate with more purchases (there's a name for it, I forget; something about moving images capturing attention, going back to hunting / fight-or-flight responses).
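
If you want to actually watch that ballooning as it happens, one rough way on Linux is to poll VmRSS out of /proc/<pid>/status. A sketch for Node (Linux-only; the pid is whatever you pass on the command line, and the script name is made up):

    // Poll a process's resident set size once a second and print it.
    // Linux-only: reads the VmRSS line from /proc/<pid>/status.
    import { readFileSync } from "node:fs";

    const pid = process.argv[2];
    if (!pid) {
      console.error("usage: node watch-rss.js <pid>");
      process.exit(1);
    }

    function rssKiB(target: string): number {
      const status = readFileSync(`/proc/${target}/status`, "utf8");
      const match = status.match(/^VmRSS:\s+(\d+)\s+kB/m);
      return match ? Number(match[1]) : NaN;
    }

    setInterval(() => {
      console.log(new Date().toISOString(), (rssKiB(pid) / 1024).toFixed(1), "MiB");
    }, 1000);

Note this only tracks one pid; Steam, like Chromium, is multi-process, so you'd sum across the whole process tree for the full picture.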


I strongly suspect that we have stopped trying to reduce bloat; the cost in engineering time cannot be sustained. As a dev, I can't even afford the time to optimize my local development machine to reduce bloat.

Multiple Electron apps run just fine on my 2015 MacBook Pro too. Yes, the RAM usage is annoying, but it's not unusable.

OTOH my battery is dying, and may not be worth replacing.


> I mean it's not like most consumer hardware has to be on the latest nodes.

Maybe developers should take some responsibility and stop fucking writing code that's slow as shit and unusable on anything but their company-paid-for MacBook Pros.

In an ideal society, programmers would be sat down and given a seven-year-old mid-range laptop, and threatened that if their program lags in the slightest they'd be fired and blacklisted from the industry.


Nah. "Raspberry Pi Day at Google." One day a week, your dual Xeon workstation is replaced with a Raspberry Pi 4. 8GB. Overclocked if you want.

It is absolutely adequate for a lot of what one wants to do. But it will show when you've done something stupid with the CPU.

I'm still bitter that Google ruined the Blogger editor interface. Fancy, shiny new interface... that lagged horribly if you had a low-power CPU and a bunch of photos in a post. The old interface handled it perfectly; I know because I wrote an awful lot of blog posts on an old Atom netbook with a nice keyboard.

But, yes, any new hardware performance is more than chewed up by new software abstractions.


Well, they’ll just limit table rows to 6 instead of 10.

That’s what React does: instead of showing a table of 200 rows, React’s performance is so poor that it’s become a design trend to make you paginate through 10-row pages. So, soon 6, “for better performance”.
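
To be concrete about what that pagination amounts to: it's usually just a slice over the row array, something like the sketch below (component and prop names are made up for illustration):

    // Illustrative only: the kind of paginated table being described,
    // in React + TypeScript. PAGE_SIZE = 10 mirrors the 10-row pages above.
    import { useState } from "react";

    const PAGE_SIZE = 10;

    function PagedTable({ rows }: { rows: string[] }) {
      const [page, setPage] = useState(0);
      const start = page * PAGE_SIZE;
      const visible = rows.slice(start, start + PAGE_SIZE);

      return (
        <div>
          <table>
            <tbody>
              {visible.map((row, i) => (
                <tr key={start + i}><td>{row}</td></tr>
              ))}
            </tbody>
          </table>
          <button disabled={page === 0} onClick={() => setPage(page - 1)}>
            Prev
          </button>
          <button
            disabled={start + PAGE_SIZE >= rows.length}
            onClick={() => setPage(page + 1)}
          >
            Next
          </button>
        </div>
      );
    }

For what it's worth, 200 plain rows is well within what React can render; the usual fix for genuinely huge tables is windowing (rendering only the rows on screen, e.g. with something like react-window) rather than shrinking the page size.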


Pay for it, my dude. Software developers are perfectly able and willing to develop efficient, snappy software if given enough time to do it.

The reason it is not happening is that market forces don’t favour those solutions. But since you seem to care enough to threaten, fire, and blacklist them, surely you care enough to pay for the craftsmanship required to get what you desire.


This is the problem, and we can't actually do this, because we have a shortage of even basic "blub programmers" who can get business logic correct. Developers who can produce basically correct simple code are expensive; how much more expensive are people who can build low-level, secure, optimized code?
