
If you're concerned that humans are as smart as it's possible to be

It's not about humans being as smart as possible, though; it's more about being "smart enough" that a hypothetical smarter-than-human AI is not analogous to a nuclear bomb. That is: are we smart enough that a super-AGI, bound by the same fundamental laws of nature, can't come up with anything fundamentally new that humans aren't also capable of coming up with?

then I would recommend reading Thinking, Fast and Slow or some other book on cognitive psychology

I'm reading Thinking, Fast and Slow right now, actually.

And just to reiterate this point: I'm not arguing for this position, just putting it out there as a thought experiment / discussion topic. I'm certainly not convinced it's true; it's just a possibility that occurred to me earlier while reading TFA.


