
You know who wrote a nuanced book on the social relevance of automation? Martin Luther King Jr.

It's called "Where Do We Go from Here: Chaos or Community?" It was his last book, published less than a year before he was killed.




I immediately thought of Kurt Vonnegut's "Player Piano", published in 1952, with a similar theme: the automation of all production (by "taping" workers' actions) and its impact on society. It also briefly mentions a "third industrial revolution" in which even human thought will be performed more efficiently by machines.

I think the story is "Manna", by the founder of HowStuffWorks. https://marshallbrain.com/manna1

It actually ends on a positive note; it's trying to compare two possible ways society could respond to the automation revolution.


> I think the story is "Manna", by the founder of HowStuffWorks. https://marshallbrain.com/manna1

Yes, that's it!

> It actually ends on a positive note; it's trying to compare two possible ways society could respond to the automation revolution.

Yeah, but I stopped reading at that point. Partially because I had something else to do, and partially because it seemed like a utopian fantasy.


If you have not read the sci-fi short story "Manna" by Marshall Brain, you should. It's a 20-minute read that looks back at an automation revolution which pretty much destroys and then remakes civilization. http://marshallbrain.com/manna1.htm

"I am convinced that if we are to get on to the right side of the world revolution, we as a nation must undergo a radical revolution of values. We must rapidly begin [applause], we must rapidly begin the shift from a thing-oriented society to a person-oriented society. When machines and computers, profit motives and property rights, are considered more important than people, the giant triplets of racism, extreme materialism, and militarism are incapable of being conquered."

-Beyond Vietnam (1967), Dr. Martin Luther King Jr.

http://www.americanrhetoric.com/speeches/mlkatimetobreaksile...

It seems futile to silo this as a problem of "tech", when really it's a problem of society at large: that profit motives are largely considered more important than people.


> All this being stated, y'all should check out Horkheimer's essay "The Concept of Man." He wrote it around 1952 (might've been '53 or '57; I'm forgetting the exact date), and it's crazy how prophetic that essay turned out to be. It shows how all our innovation really just led to an amplification of social structures and patterns that were already emerging during the dawn of automation and mechanization. I think it's relevant to your project.

Do you mind sharing a link to it? A quick search didn't return anything close to that title written by Horkheimer.


I'd love to recommend a book on this, but it was never translated into English: http://www.goodreads.com/book/show/8892205-menschen-wie-g-tt...

It was originally written as a satire, but ended up depicting a civilization well past the point where everything needed for basic survival is automated. From there on, you need to teach people to use their capabilities to create whatever they have in mind, be that music, inventions, etc. If you take away the purpose that jobs provide (doing something with your life), you need to offer alternatives, such as the option, and the need, to create.


Right. If you're from a tech background and are looking for books that lead to other books, cybernetics is a way in. A specific example is "The Cybernetic Hypothesis", written by a hard-leftist French collective under a pseudonym, covering the social/anthropological/economic/tech aspects of "all this." The same term will also lead you to Norbert Wiener and his dry engineering cybernetics, although his later books skew toward social impact as well, IIRC.

The domestic terror angle is something I worry about. I'm in no way condoning it, but the Unabomber is a clear example of a kind of tech despair escalating into violence.

So today, when the main advocates for paying attention to this are basically Andrew Yang discussing out-of-work truck drivers and that Jon Stewart/Obama/Bezos dinner thing, it's concerning.

The Unabomber's campaign occurred pre-iPhone and still escalated into something awful.

So, what if tech/cybernetics/whatever the catchall phrase is becomes the driver of this discontent, which then leads to violence?

Well, there's a lot, to put it very mildly, of dehumanizing tech out there, and a population of users has been steered by it for years. Some of them react via collective labor actions, but what will the others do?


Player Piano by Kurt Vonnegut. Labor has been devalued by automation, government is subservient to industry, and people's place in society is determined by standardized test scores (those who score poorly may either join the army or do manual labor, such as repairing potholes).

Migration, automation, and the meaning of life in the new western world.

Someone should write such a book for white males; sadly, we keep leaving this discussion to neo-Nazis and white supremacists online instead of having it ourselves in the public realm...


This article reminded me a lot of stuff written by Mumford, Jacques Ellul, and even Ted Kaczynski (the Unabomber). Their stuff is worth reading.

Industrial Society and Its Future http://wildism.org/rca/items/show/13

Also Ellul's The Technological Society


Here is a true beauty that people should read:

https://www.jfklibrary.org/Asset-Viewer/Archives/JFKCAMP1960...

John F. Kennedy, talking in 1960 about how automation was going to change America.


It really is a shame that he ended up getting violent, because "Industrial Society and Its Future" is one of the most interesting, insightful, and fascinating things I've read. I recommend it to everyone.

IMHO it's a classic example of an author who is excellent at identifying problems but not good at identifying solutions. Unfortunately, almost nobody reads the identification part because the solution part is so unpalatable. For anyone who doesn't know, Ted Kaczynski was the Unabomber, and his solution to the problem of technology was basically to destroy the entire system in a way that leaves humans no ability to resume technological progress; violence was his way of beginning that societal destruction. From a purely theoretical/philosophical view it makes logical sense, but for most people with a sense of compassion and empathy the costs are unacceptable.


I really don't think you can beat Kaczynski's piece. He does have a work he published from prison (Anti-Tech Revolution: Why and How), but I think it predominantly builds on, rather than reiterates, the points he makes in "Industrial Society and Its Future".

I would need to read his second book, as it elucidates his thoughts further, but I think one theme he developed without trying was centralized technologies. It has been a while, but there was a part about power plants where he points out their centralized nature. Further, technology is good, an extension of ourselves, but when a technology goes from a nice tool to have to something mandatory, I think we should be cautious.

Thanks for posting, and sorry for your loss.

I first heard the name TK when reading Bill Joy's "Why the Future Doesn't Need Us".

Fifteen years later I was burnt out and increasingly aware that tech was not a force for good. I went on a three-year journey trying to understand what made TK choose violence. It can't just be the struggle of getting people to pronounce his Polish name correctly, however hard that is. I started by studying his manifesto. I dissected it like a surgeon, then read every book and reference he cited. What would happen if I read every book he read? How does a bright young man go from reading philosophy and science to concluding that man can't be saved?

How does a person go from Harvard to concluding that the only way to make things better is to first make them worse? TK was an accelerationist who believed the end justifies the means. TK felt it was a necessary evil to collaborate with those he disagreed with (fascists, neo-Nazis, etc.) as long as they could help bring down the current world order.

One dominant figure that kept popping up when going through TK's ramblings was Jacques Ellul, a French philosopher not too well known outside France[1][2]. Ellul must have left quite an impact, because TK's manifesto is a homage to Ellul's biggest work: The Technological Society (La Technique)[1].

Ellul doesn't just talk about technology but about the whole domain of what we today call systems thinking. And he opens your eyes to the edge cases and the victims of this thinking in ways that make even heavyweights like Nassim Taleb seem like rookies in comparison.

Circa 2018 I had read everything Ellul wrote, and had also read what TK wrote multiple times. I was depressed, really, really badly. There seemed to be no way out. I still wasn't criminally insane, though. There had to be more than just "self-radicalization by information". Something was missing. It was isolation!

So I went on to teach myself bushcraft, living off-grid for a year in a similar way to TK. "Living off the land", as we call it in infosec (only it's the literal land :)). And this made me understand why TK had so much hate for civilization. It's the same hate the Sentinelese people must have for those visiting their shores[3].

I had no guns to feed myself with game meat as TK did, and I did not spend as much time out there. But after a few months without human contact, I did understand what humanity has lost and how our connection to mother Earth has been severed.

While I never turned criminally insane and always disagreed with violence, I came to understand what radicalized TK. It was isolation. It happens, in less severe ways, to all of humanity via algorithms and screen time.

If you really want the full TK experience, I implore you to live in the woods for a decade and see how you feel about humanity. No laptop, Instagram, or electricity. Will you have any empathy left for the rest of society upon your return?

Jacques Ellul is not an easy read when you're vulnerable and searching for answers. Ellul will push you off the cliff of society and make you do something that today is unthinkable: question technology and technological progress itself.

TK is not comprehensible to anyone functioning within and depending on society, because he hated society and believed it needed to be destroyed. It's pretty hard to have any empathy for that. And because he caused so much pain and suffering, he robbed himself of leaving any legacy or lesson.

[1] The technological society https://archive.org/details/TheBetrayalByTechnologyAPortrait...

[2] https://archive.org/search?query=creator%3A%22Ellul%2C+Jacqu...

[3] https://www.npr.org/2019/08/01/747368557/the-story-behind-jo...

Edit: yes, heavily edited; lots of typos fixed, lots of context cleared up. It still has potential to be misunderstood, of course.


That's not what Player Piano was about at all. Player Piano, his first novel, was actually a remarkably prescient story about a future dealing with the crisis of total mechanical automation, except with the utopian-in-retrospect idea that postwar America would continue its New Deal-meets-General Electric policies so that the biggest problem for people with low socioeconomic status who lose out from automation is societal alienation and loss of the dignity of work, rather than the actual reality of increasing wage peonage and poverty.

I think you might be talking about Galapagos.


I just finished reading Technological Slavery, the book by Ted Kaczynski (the Unabomber). Yes, he was a domestic terrorist who most likely hurt technological reform more than helped it, but the book still makes a lot of good points related to this post.

The more we rely on technology the more powerless and less free we become.


As one of my favorite books, Waldrop's "The Dream Machine" is overdue for a re-read. It's a book about J.C.R. Licklider, but Engelbart definitely comes up again and again, and the Augmenting Human Intellect thread is highly present. At least we no longer reduce Engelbart to "the man who invented the mouse." But yeah, the not-exactly-philosophical but big-idea frame-setting that these folks had is such a valuable contribution. Engelbart had such a clear and sharp objective in mind, which he wrote about so early, and it's such a compelling and humanistic value. Few other tech workers across the decades seem to radiate the same moral and progressive clarity.

Thank you for this call out. This idea, of whether computers are here to mechanize processes that people fit into, or whether computers are tools to augment & enhance our individual agency, is a huge question.

Ursula Franklin has some good material on the technological society. She proposes a couple of different points of view, but I really enjoy the distinction she makes between holistic technologies, where the capabilities of the user are expanded and opened and the user gains control, versus prescriptive technologies, which relate more to macro-level control, to enforcing set-down processes[1].

[1] https://en.wikipedia.org/wiki/Ursula_Franklin#Holistic_and_p...

