
  Isotope separator - plague kit
I assume this is downloading nuclear bombs or viruses? And that means that we need AI to monitor us constantly in case some idiot kills everyone?



a nuke and a bioweapon ready to download..

I have to say it.

You wouldn't download a nuke...


There was a (rumored) integer overflow bug in the game Civilization that would cause an AI leader configured for pacifism to start launching nukes.

https://knowyourmeme.com/memes/nuclear-gandhi
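The rumored mechanism is classic unsigned underflow: an aggression score stored in an unsigned byte gets decremented past zero and wraps around to the maximum. A minimal sketch, emulating 8-bit unsigned arithmetic in Python; the function name, values, and the "democracy lowers aggression by 2" rule are all illustrative assumptions, not verified game internals:

```python
def adjust_aggression(aggression: int, delta: int) -> int:
    """Emulate unsigned 8-bit arithmetic: wraps at 0 and 255."""
    return (aggression + delta) % 256

# Gandhi supposedly starts at the lowest aggression setting...
gandhi = 1
# ...and adopting democracy supposedly lowered aggression by 2:
gandhi = adjust_aggression(gandhi, -2)
print(gandhi)  # 255 -- maximum aggression instead of pacifism
```

The same wraparound happens in any language with fixed-width unsigned integers; Python needs the explicit `% 256` only because its ints are arbitrary-precision.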


This is essentially the Colossus argument, from “Colossus - The Forbin Project”

https://en.m.wikipedia.org/wiki/Colossus:_The_Forbin_Project

A machine gains superintelligence and sets new goals of its own, using nuclear weapons to reduce the human population.


Step 1 .. Connect nuclear weapon to network.

Step 2 .. End civilisation.


There are ways other than nukes to kill tens of thousands of people.

Weaponized viruses are one. There are also indirect routes, such as triggering a large-scale war, economic warfare, etc.


This basically worked perfectly with nuclear weapons. Everyone took their development extremely seriously, and we’ve managed to avoid a nuclear apocalypse. If anyone could get on their computer and buy a nuke from Amazon, we would all be dead within the week.

I think the idea is that after a nuclear war, the only two things left are your data and some cockroaches.

One of my big concerns about the future is nuclear war being triggered by a new plague.

My concern is that a huge population allows for "parallel processing" in pathogenic evolution, letting a pathogen find weak vectors in humans, and that the resulting outbreak will be mistaken for a nation-state attack, leading to a nuclear war.


> It exists in a physical (if distributed) location and requires inputs and outputs in electric cables and fibre cables

"Nuke all of civilization right now!"


At some point we're going to just produce all this electricity so some AI can troll itself on the blockchain to decide who is going to trigger the nukes.

That's literally what we do with civilization-ending tools like nukes and research on diseases.

But what is the threat? What is the logical step between GPT-4 and human extinction? I really don't see it - and this is supposed to show, obviously, that AGI is more dangerous than nuclear war?

There are at least two failure cases here:

- a military AI in the hands of bad actors that does bad stuff with it intentionally.

- a badly coded runaway AI that destroys earth.

These two failure modes are not mutually exclusive. When nukes were first developed, the physicists thought there was a small but plausible chance, around 1%, that detonating a nuke would ignite the air and blow up the whole world.

Let's imagine we live in a world where they're right. Let's suppose somebody comes around and says, "let's ignore the smelly and badly dressed and megalomaniacal physicists and their mumbo jumbo; the real problem is if a terrorist gets their hands on one of these and blows up a city."

Well, yes, that would be a problem. But the other thing is also a problem. And it would kill a lot more people.


The threat of nuclear war seems to be rising rapidly:

https://www.lesswrong.com/posts/Dod9AWz8Rp4Svdpof/why-i-think-there-s-a-one-in-six-chance-of-an-imminent

https://www.theguardian.com/world/2022/oct/09/biden-armageddon-russia-nuclear-threat-pentagon

Is anyone aware of a 'Civilisation OS'? E.g. instruction manuals for rebooting civilisation from scratch in the event that some large % of it is destroyed?

I am thinking about something like nested txt files covering everything from how to make a fire to how to smelt aluminium to how pencils are made.

If it doesn't exist already, anyone want to work on it?
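The nested-txt-files idea above is easy to prototype: a directory tree of topic files plus a generated table of contents that works offline. A minimal sketch; the directory layout and file names shown are hypothetical, not an existing project:

```python
import os

def build_index(root: str) -> list[str]:
    """Walk the archive and list every .txt topic file, indented by nesting depth."""
    entries = []
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames.sort()  # deterministic traversal order
        depth = dirpath[len(root):].count(os.sep)
        for name in sorted(filenames):
            if name.endswith(".txt"):
                entries.append("  " * depth + name)
    return entries

# e.g. a hypothetical archive like:
#   civ-os/basics/fire.txt
#   civ-os/metallurgy/aluminium-smelting.txt
#   civ-os/manufacturing/pencils.txt
# indexes into a flat, greppable table of contents.
```

Plain text plus a generated index keeps the whole thing readable with nothing more than a file browser, which matters if the premise is that most infrastructure is gone.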


Putting Java on nuclear subs would allow the Bulletin of Atomic Scientists to turn the doomsday clock back by ten minutes or so, thanks to the infamous InvalidCoordinateArgumentYouCannotTargetTheMoonException. ;-)

Nuclear Annihilation has ALWAYS been a threat. We just forgot about it. All it would take is one mistake, for all the holes in the plates to line up, for one bomb to be loaded instead of a training round and dropped too close to a city. For one president to go crazy at the right time.

We need to get rid of them if we wish to continue as a species.

Aside: I wish HackerNews had a feature that allowed us to flag something as paywalled so it would bring up a little tag.


Nuclear war notification.

Nuclear war may not wipe out mankind, but a superintelligent AI intent on wiping out mankind probably can.
