
Here's why I disagree: smart human beings evolved from not-very-smart primates.

That proves there is a natural process by which greater intelligence can be created.

Therefore there is no reason that an even greater intelligence cannot be created with help from man. And for the singularity's sake, super-human intelligence doesn't even have to be an AI. Genetically engineered super-intelligent primates would do the trick as well. The idea is that once you've created something smarter than yourself, no matter how you did it, it will then be able to figure out how to make something even smarter. And so on. That's the singularity.




What if natural evolution has now become quick enough (it does seem to accelerate all by itself, if you plot the evolution of species on a timescale) that the "next step" (superhuman beings) will emerge without any conscious or voluntary input from us?

I'm not saying this is certainly what's going to happen; it's just a (literal) what-if question.


The author's argument hinges on humans creating more intelligent AI, not humans genetically engineering beings capable of evolving greater intelligence than humans. I would also argue that humans engineering new species isn't exactly "natural" in an evolutionary sense. The author even points out that he isn't considering natural evolution in his argument.
