
All kinds of things. Personally, in the medium term I'm concerned about massive loss of jobs and the collapse of the current social order consensus. In the longer term, the implications of human brains becoming worthless compared to superior machine brains.



I think it's a great question: "What are the social-political-economic ramifications of the dawn of superhuman machine cognition?" It's too big a question for me; I'm not smart enough to imagine it! I hope at least some humans survive, and I hope it's not in some horrible dystopia.

Yes, extremely concerning. Recent advancements in AI have shown it is possible to read your mind, reconstructing images of what you have seen.

We are headed into a deep tech dystopia. Even more disturbing is the number of people with eyes filled with Utopian dreams of what the AI future is going to bring. Some are eager to become the new Borg or Cybermen.

Society is in a state of anxiety largely driven by technology's effect on socialization. The irony is that many will immerse themselves deeper into tech to try to escape the anxiety. A society death spiral.

I've written a lot more on the topic as well - https://dakara.substack.com/p/ai-and-the-end-to-all-things


I'm way more worried that society will fall apart for reasons not very related to technology.

I'm not sure I think everything is going downhill. It's more that the type of tech we are developing, extrapolated from our current standpoint, points in some scary directions, e.g. authoritarian control and loss of freedoms.

I worry about the ever-increasing ways humans have to escape reality and isolate themselves.

Drugs and alcohol. Video games, soon fully immersive. Streaming videos. Porn. On-demand delivery of everything. Sexbots, coming soon. More remote work.

Someone could literally work, eat, and never leave their house if they wanted to.

It seems like technology is already making people more isolated. I really worry about this.


We are going to be drowning in a sea of autogenerated noise. I think the early excitement is going to fade into a bit of frustration and misery.

It is very difficult to reason about the future as it becomes even more unpredictable each day. Emotional well being requires some semblance of stability for people to plan and reflect about their lives.

I've spent many hours contemplating how this is going to shape society and the outlook is very concerning. My much deeper thought explorations - https://dakara.substack.com/p/ai-and-the-end-to-all-things


" and now the threat of AI."

Probably the icing on the cake for me: it's looking like no job prospects, plus the potential for serious consequences ranging from mass-scale disinformation to extinction.

We need to get our priorities straight as a species.


I can imagine that future, and it is frightening. As a society we imagine our technological progress to somehow imply that we are actually advancing as a species. Recent events suggest otherwise, and putting evolutionary control in the hands of those kinds of people (read: anybody) is likely to be really bad for us all.

I'm already deeply pessimistic about what we have achieved so far when it comes to machine intelligence. It's impossible to stop or reverse this progress, and we're on the fast track to creating tools with the destructive potential of nuclear weapons that are simultaneously available to everyone with enough money.

Don't get me wrong, I'm not talking about mushroom clouds and some kind of machine uprising, the consequences will be far worse and more insidious. We'll see completely new levels of manipulation, oppression and surveillance instead. Any kind of tool will be abused, this tool might turn out to be too powerful for us to handle.


I worry about corporations and/or authoritarians getting even more of an edge. Personally I am not worried about a Terminator/Skynet scenario, more about greed and holier-than-thou people using this technology to cement their position.

Very interesting thoughts. It does seem we're heading towards some kind of technological dystopia, but there is an increasing minority of people who are aware of it. I can see it leading to some kind of schism.

My concern isn't some kind of run-away science-fantasy Skynet or gray goo scenario.

My concern is far more banal evil. Organizations with power and wealth using it to further consolidate their power and wealth, at the expense of others.


It freaks me out.

Regarding A(G)I, automation, robots, etc. Are we on the edge of a revolution that will completely destroy the ‘human touch’?

I feel, and am concerned by, the fact that we're becoming more spoiled, softer, and losing our human touch. Everything is dehumanizing.

More and more tools are built that are supposed to make us do less work, think less, and have an easier time.

Don’t you fear that at some point we will be so far off from what we are now that we lose our touch?


The sci-fi scenarios are a long-term risk, which no one really knows about. I'm terrified of the technologies we have now, today, used by all the big tech companies to boost profits. We will see weaponized mass disinformation combined with near perfect deep fakes. It will become impossible to know what is true or false. America is already on the brink of fascist takeover due to deluded MAGA extremists. 10 years of advancements in the field, and we are screwed.

Then of course there is the risk to human jobs. We don't need AGI to put vast amounts of people out of work, it is already happening and will accelerate in the near term.


I have a bad feeling too...but for different reasons. I fear society just doesn't give a damn anymore. I fear all our warnings will steadily fall on deaf ears and we will eventually become ostracized into oblivion. I fear humanity will embrace a system which pushes profits before people, ego over empathy, and lust above love. I fear elitists will eliminate innovation and erase the integrity of the internet and information. I fear for our future, but I have some hope in knowing their future fears us. Game on.

Selective memory loss.

The dawn of infinite power was heralded by the annihilation of two cities in Japan, and the death of many innocents exposed to open radiation during the development process. The industrial revolution began with pollution that worsened the lifespans of the 'winners' for decades before we figured out how to make these spaces safer. Modern medicine started off with bloodletting, and modern psychiatry started off with lobotomies.

The first few generations that adopt a revolutionary technology see their lives get worse before they get better.

Problem is, AI, Medicine & Automation are all likely to produce revolutions that are bigger and faster spreading than anything we've seen before. I do not wish to be a 'growing child' in an era of increasing adoption of this tech. Growing up with just the internet was confusing enough as is.

More than all of these, the scariest is a technology I've started hearing whispers of: we might 'solve aging' in this century. I hope I get to die before perpetual life is on the horizon.


I feel like there's some sort of parallel with the Machine Apocalypse.

"What if we build intelligent machines that execute their goals at the expense of humanity," we worry.


So what is the scenario that we should be worried about? Because I’ve only heard skynet prophecies.

Or weird nanobot bullshit.

Is there a really boring possible outcome that leads to the destruction of humanity?


I feel the same way, but with existence in general. Authoritarianism is way up, the planet is frying, AI will destroy jobs and supercharge surveillance ... I'm not sure I want to witness the result.
