I lean toward the view that, for information-theoretic reasons, the availability of meaningful information (training data) is likely the fundamental constraint on any rapid explosion of intelligence.
That being said, I don’t think you need a god-like superintelligence to be more intelligent than humans. You just need something marginally better that can stay focused longer and doesn’t tire. As to whether that represents a danger to humans, I think it depends on what we do with it and/or what kind of society or environment we embed it within. If we train or prime it to compete and dominate, that’s what it will do. Same as with humans, who are more prone to crime and violence when raised in unstable or abusive homes.
> As to whether that represents a danger to humans I think it depends on what we do with it and/or what kind of society or environment we embed it within.
Agree, and I think this echoes one of the author's best points, which is to question whether engineers who are convinced their creation will be a sociopath are the people best equipped to actually prevent that fate. (Especially, as the author suggests, given how common asociality/antisociality is among the builders.)