A New Old Idea: Permacomputing (systemstack.dev)
83 points by pabs3 | 2022-12-27 | 40 comments




I definitely understand what they mean by a fork. At some point I stopped using computers with an explicit purpose, and now it can be doing, or just consuming. I think that was the fork: when computers became entertainment and consumption devices as well as tools for doing.

I have a specific workstation each for music production, work, and games, and I think that was an attempt to make each workstation more purposeful and less consumptive.

When I find myself on my phone too much, I try to switch back to only using it when I need to complete a task. But they are very good at sucking you back in.

To the permanence part of the article, I think that is what attracts me about embedded devices. You build it, and it works until it breaks, rather than until the complex alignment of software dependencies misaligns against your codebase or hardware.


> But they are very good at sucking you back in.

I call it 'needy technology'. It needs your attention all the time or it will start to harass you until you give in.


A strange inversion: these used to be tools, limited but tools, to get something done, and now they're nearly unlimited but you do very few things with them. Very odd.

I do lots with them. But I don't let them control my day.

Do you have some hygiene or routine? Like being offline while working?

I do a "lot", but it's noisy, stretched thin, and confusing, which is why I want to describe it as "few".


I block many sites myself, even HN at times. I don't use any feed-based apps either: Instagram/FB/TikTok etc. I had Instagram, but I ditched it when they switched to infinite interest-based feeds instead of just my friends.
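Site blocking of this kind is often done at the hosts-file level. A minimal sketch in Python; the domain list is illustrative, and actually applying the entries means appending them to /etc/hosts with root privileges:

```python
# Sketch: generate /etc/hosts-style entries that resolve distracting
# domains to localhost. This only builds the text; applying it to the
# real /etc/hosts is deliberately left as a manual, privileged step.
BLOCKLIST = ["news.ycombinator.com", "instagram.com", "tiktok.com"]

def hosts_entries(domains):
    """Return one loopback and one null-route line per domain."""
    lines = []
    for d in domains:
        lines.append(f"127.0.0.1 {d}")
        lines.append(f"0.0.0.0 {d}")
    return "\n".join(lines)

print(hosts_entries(BLOCKLIST))
```

The null-route (`0.0.0.0`) line is belt-and-braces: some resolvers treat it as "fail fast" rather than connecting to a local server that might be listening on 127.0.0.1.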

I use the 'two computers' trick, but have it simplified to 'two screens', one for work and one for other stuff. Just getting up and changing screen is enough of a mental barrier to keep me on point.

I noticed that sometimes, but so far the most effective trick was an ISP blackout. Having no internet changes my brain chemistry by a wide margin.

Hehe, I can see a market for a very unreliable ISP ;)

I foresee a luxury market where you sell absence of product as a feature.

There are a lot of people recycling this particular idea and typically they will end up sticking a Raspberry Pi (assuming they can get it) into an enclosure that serves the intended purpose. You can then use the base OS to hack around on it until it looks and feels like the device you would like to have. Some of these are very impressive.

One example (there are hundreds):

https://www.raspberrypi.com/news/scripto-distraction-free-wr...


Devising a custom word processor for a custom device seems... overkill compared to a cheap laptop with a word processor installed (and nothing else) and set to autostart...

Absolutely. I recognize some of this though: I wanted to learn how to play the piano. Other people might go and find a teacher, I dug around to find some software, used it for a while, found all kinds of shortcomings, designed my own software. Probably not the most efficient way of learning to play the piano, to put it mildly. But very satisfying once it works.

Maybe the goal isn’t to be able to use a word processor, but rather to have an aesthetic experience. In that sense, this is more akin to art than engineering.

I think it's an important aspect of computing, we can often forget the human experience element.

Seasoned creatives I talk to find their environment and the feel of it very important to their productivity, and I think it's the same for computing.

When coding in Windows, the feel is that of a fluorescent-lit room in 2005, sitting next to a filing cabinet with a Cisco phone on my desk that rings every 15 minutes.

Coding in my personalized linux setup makes me feel like a netrunner in cyberpunk and I love that. I know I can achieve the same work in Windows, but the vibes are different.


> Coding in my personalized linux setup makes me feel like a netrunner in cyberpunk and I love that.

Absolutely my aesthetic, too. Sometimes, I'll spin up KDE just to get away from the terminals for a few hours, then back into i3/Sway.


This is really nice, though. I would love to see more variety in computing than what we have now, since a lot of it is the same old same old.

It used to be that we bought books, and they accumulated (and we sometimes even used them again) until our heirs had to get rid of them. Now we read things online, and the disks are wiped when our heirs get rid of them. We've always been consuming more than we're producing.

What would be nice is to be able to ingest content as before, but have it all stored automatically and made searchable. I'm told that this is available[1] for Mac-based machines. I'd like it more generally available.

The point of having books, or notes, is to be able to retrieve information with context. We now have tools that can summarize and categorize things automatically. If we, as a hive mind, can add just a little bit of multidimensional voting/tagging to the things we see, along with their cryptographic checksum and consent mechanisms, we could collectively organize everything we see with almost zero individual effort.

Vannevar Bush's vision for the Memex runs afoul of copyright laws; this might be a way to route around them a bit.

[1] https://www.rewind.ai/


I always liked the Lifestreams concept (look for subheading "Lifestreams" somewhere in the middle, it's a long article): https://www.wired.com/1997/02/lifestreams/

Yes, Lifestreams are very cool.

Particularly the idea of having time as your unique identifier / primary index, so things are always stored automatically and any further organisation is an optional add-on.

https://www.cs.yale.edu/homes/freeman/lifestreams.html
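The core of that idea, time as the only index, with every other organisation computed as a view over the stream, fits in a few lines. A hypothetical sketch, not the Yale implementation:

```python
import bisect
import time

# Sketch of a lifestream: every document is stored only by timestamp.
# Tags, folders, and searches are views computed over the stream later,
# never part of the storage itself.
stream = []  # sorted list of (timestamp, document) pairs

def add(doc, ts=None):
    """Append a document; its arrival time is its only identifier."""
    ts = time.time() if ts is None else ts
    bisect.insort(stream, (ts, doc))
    return ts

def between(t0, t1):
    """A 'substream': every document whose timestamp falls in [t0, t1]."""
    return [doc for ts, doc in stream if t0 <= ts <= t1]
```

Because insertion keeps the stream time-sorted, "what was I looking at last Tuesday" is just a range query; no filing decision is ever required at capture time.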


There's a lot to tell about how things were and how they diverged. Besides the physical aspect and ergonomics, there was something calming about a very small core of software. Powering on an old machine and being thrown into a BASIC shell in one second is relaxing. The modern day suffers from the paradox of choice: too many things to digest, too many variants, too much stuff you never get to understand.

There were also more 'mystical' aspects to old machines. They were slow but fit your brain better; you can't swallow the whole internet. You need time, and those machines, being limited, gave you time to think. They also had their own peculiar mechanics, which made for a nice ritual (I've said this before, but the pleasure of hearing a tape drive ramping up, or an LP deck spinning, feeds your brain with a soothing feeling).


Just go into the terminal. I've settled on zsh after forever with bash, but nothing really changed.

It works on all systems: Mac, Linux, even Windows (with WSL). The same configuration script can be used anywhere, so I put it on git and just pull it whenever I need to configure a new system for temporary or permanent use: https://github.com/Aperocky/unix-setup


Try developing for Android. You will have to install a load of tools before you can even type hello world. Even if you only use a terminal.

Shell != terminal. The terminal is where your interactive shell (or other software) is launched from.

For instance, xterm -e top spawns no shell under top by default.
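The same distinction shows up anywhere a command can be run with or without a shell in between. Python's subprocess module makes it explicit, which may help illustrate the point:

```python
import subprocess

# shell=False (the default): the program runs directly, so shell syntax
# like $HOME is just a literal argument -- analogous to `xterm -e top`,
# where top is the terminal's child and no shell is involved.
direct = subprocess.run(["echo", "$HOME"],
                        capture_output=True, text=True).stdout.strip()

# shell=True: /bin/sh is spawned first, and it expands $HOME before
# echo ever runs.
via_shell = subprocess.run("echo $HOME", shell=True,
                           capture_output=True, text=True).stdout.strip()

print(direct)      # the literal string $HOME
print(via_shell)   # an actual path
```

Without the shell in the middle there is no variable expansion, globbing, or piping; those are features of the shell program, not of the terminal.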


I went into the article under the impression that permacomputing was a form of computation resilient to change over time, something akin to markdown or plain text files. I missed a clearer definition in the text.

It's hypertext, so you follow the permacomputing link located in the first sentence of the second paragraph on the page, where you then find a definition and an extended discussion.

All nice things eventually end, and the current world peace will probably end one day too. If we want to preserve our technological advancements through civilisation-ending disasters, then people should start working on enabling anyone to manufacture chips easily. Something like the 3D printing revolution, but for silicon. Only then will Open Hardware initiatives really make sense.

> current world peace will probably end one day too

Depending on where you live it may have already ended.


yup

Peace is certainly unevenly distributed around the world.


3D printing, once you have the printer, only needs plastic as an input. I think a comparable revolution for circuit manufacturing would need a similar situation: a process that only takes relatively simple inputs.

Note that it has proven hard to maintain advanced circuit-manufacturing pipelines even in relative peace.


I also jotted down my experience so far: http://talk.binarytask.com/task?id=8015986770003767235

eInk has too low a refresh rate to be able to watch video or play games.

A 12V IPS screen is a better alternative.


Sorry, but I don't get it:

>What drew me to old computers, and what still draws me to them after all these years, is the sense that people who used them new had a different faith in them: of computers as a tool for thought, as something that people could use to accomplish something new. The idea of computers as “the bicycle for the mind” felt like a lost future

Didn't we accomplish all that and then some? Isn't a modern smartphone the best "bicycle of the mind" we could've imagined in the 1990s?

Sure, there are disadvantages. I would love a large (cheap) eInk display as much as the next guy. I'm also a big fan of retro computing: I still have my Commodore 64, a 386SX, a slot-1 Windows 98 PC, and more. I like all of them, but I would never give up my modern Ryzen PC with an nVidia 2070 (yes, not the newest, but pretty powerful), NVMe drives, and fast RAM for something with less capability.

We have all this, and at the same time we have single-board computers and 8-bit microcontrollers where they are appropriate.

If not for the current "price gouger's paradise" conditions, I would say we have it the best it has ever been, computing-wise.

The only things I miss from the past are two aspects of software development. First, it was much easier to understand "everything" about a computer back then, when a few tens of kB was a large program. Second, it seems programs were written to work well on current, not future, hardware.

Also, before Windows showed up, there was an almost decade-long period when a typical PC did everything a typical user wanted. There was little point in upgrading the hardware back then. Remember, new computers sold towards the end of that time had a special button to slow them down (hilariously named "turbo")! Only once graphical interfaces became a thing did faster and faster PCs start selling.

There is a certain comfort for a programmer in being able to make software for one machine for a decade. You can really get to know it well... Now? You can't even get to know one Android API well before they change it 180 degrees. This even starts to extend to programming languages. Despite Android running a Java virtual machine, Java (and C) is no longer sufficient to use all features of the platform; now we have to switch to Kotlin to use certain concurrency features, etc. I think the pace of change here has a direct impact on the overall software quality we have now.


I think a major difference between a bicycle of the mind and our smartphones is that many people feel like their smartphone is riding them, not the other way around. I agree with the author. I don't think that all-purpose tools like modern computers are good for single-purpose things like writing [ https://coral.hashnode.dev/computers-are-not-for-writing ]

> Isn't a modern smartphone the best "bicycle of the mind" we could've imagined in the 1990s?

No. Not even close. Smartphones are more akin to a weird combination of shopping mall and the Stasi than anything like what that phrase was supposed to mean.

Compare and contrast with "Dynabook" https://en.wikipedia.org/wiki/Dynabook


Honestly, I think this concept applies to mobile phones for children too: a solid phone that does the basics, without a camera or bloatware. Couple that with limited access to the internet and app store.

The purpose of cell phones, in this context, is to keep children safe. They fail to accomplish the task.


That's a closed-source turd: unusable after a few years, with a shitty interface, atrocious input, and no tinkering.

If anything, kids should have a basic micro-machine with Forth and graphics-plotting support.


Good job constructing a narrative to then attack. You did it!

FWIW, I'm building a simple 32-bit computer system based on Prof. Wirth's Oberon RISC CPU and a simple language called Joy that combines the best parts of Forth and Lisp.

The primary motivation is to avoid unnecessary complexity. But there's also a Permacomputing angle.
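As an illustration of how small a concatenative core can be, here is a generic stack machine sketched in Python. This is not the author's Joy or Oberon system, just the family resemblance: every word is a function from stack to stack, and a program is words applied left to right.

```python
# Toy concatenative evaluator. Joy and Forth are this idea plus
# quotations, definitions, and I/O.
def run(program, stack=None):
    stack = [] if stack is None else stack
    words = {
        "dup":  lambda s: s.append(s[-1]),          # duplicate top of stack
        "drop": lambda s: s.pop(),                  # discard top of stack
        "swap": lambda s: s.extend([s.pop(), s.pop()]),  # exchange top two
        "+":    lambda s: s.append(s.pop() + s.pop()),
        "*":    lambda s: s.append(s.pop() * s.pop()),
    }
    for token in program.split():
        if token in words:
            words[token](stack)
        else:
            stack.append(int(token))  # anything else is a number literal
    return stack

print(run("3 dup *"))     # square a number: [9]
print(run("2 3 4 * +"))   # 2 + (3 * 4): [14]
```

Programs compose by concatenation, which is part of what keeps implementations of such languages so small.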

Permaculture is a school of thought about what might be called the art and science of applied ecology.

From that POV, the main concerns of ecology with regard to computers would be efficiency and reducing the environmental load of making chips, etc.

The other common theme seems to be a kind of post-apocalypse computing (e.g. Collapse OS).

I don't really know where I'm going with this... Simpler computers would presumably be more efficient and easier to make?

- - - -

Really, the crucial thing from my personal POV is the reduction of complexity. I'm pretty sure that that's the primary measure of the durability of our large global interconnected economies: "Make it as simple as possible, but no simpler." :)


A long-lasting device, with similarly long-term, reliable storage, will likely cost a fortune to produce, if it can even be done at all. One of the issues preventing electronics from lasting that long will probably be whisker growth: https://www.semanticscholar.org/paper/Examination-of-tin-whi...
