This basically worked perfectly with nuclear weapons. Everyone took their development extremely seriously, and we’ve managed to avoid a nuclear apocalypse. If anyone could get on their computer and buy a nuke from Amazon, we would all be dead within the week.
One of my big concerns about the future is nuclear war being triggered by a new plague.
My concern is that a huge population allows for "parallel processing" in pathogenic evolution, letting pathogens find weak vectors in humans, and that an outbreak like this could be mistaken for a nation-state attack, leading to nuclear war.
At some point we're going to just produce all this electricity so some AI can troll itself on the blockchain to decide who is going to trigger the nukes.
But what is the threat? What is the logical step between GPT-4 and human extinction? I really don't see it, and yet this is supposed to show that AGI is obviously more dangerous than nuclear war?
- a military AI in the hands of bad actors that does bad stuff with it intentionally.
- a badly coded runaway AI that destroys earth.
These two failure modes are not mutually exclusive. When nukes were first developed, some physicists thought there was a small but plausible chance, on the order of 1%, that detonating a nuke would ignite the atmosphere and destroy the whole world.
Let's imagine we live in a world where they're right. Now suppose somebody comes along and says, "let's ignore the smelly, badly dressed, megalomaniacal physicists and their mumbo jumbo; the real problem is if a terrorist gets their hands on one of these and blows up a city."
Well, yes, that would be a problem. But the other thing is also a problem. And it would kill a lot more people.
Is anyone aware of a 'Civilisation OS'? E.g. instruction manuals for rebooting civilisation from scratch in the event that some large % of it is destroyed?
I am thinking of something like nested txt files covering everything from how to make a fire to how to smelt aluminium to how pencils are made.
If it doesn't exist already, anyone want to work on it?
Putting Java on nuclear subs would let the Bulletin of the Atomic Scientists turn the Doomsday Clock back by ten minutes or so, thanks to the infamous InvalidCoordinateArgumentYouCannotTargetTheMoonException. ;-)
Nuclear annihilation has ALWAYS been a threat. We just forgot about it. All it would take is one mistake, for all the holes in the slices to line up: one live bomb loaded instead of a training round and dropped too close to a city, or one president going crazy at the wrong time.
We need to get rid of them if we wish to continue as a species.
Aside: I wish HackerNews had a feature that allowed us to flag something as paywalled so it would bring up a little tag.