This is exactly the right question. Further complicated by the fact that everyone has differing operational definitions of the words "consciousness", "awareness", and "sentience".
This is the problem with terms like this. There are definitions in academia/industry that differentiate between awareness, consciousness, qualia, intelligence, sentience, and more. But these distinctions require some pondering to understand and are divergent from general understanding which treats them as roughly the same thing. For example, consciousness for me is the ability for something to take directed action to affect their surroundings. This means that a robot can be conscious. But now the question for what we’re talking about here is: is it aware? It is possible to display consciousness without awareness (see blindsight studies), but to my knowledge it is not possible to be aware without consciousness. And then we get to sentience, which is still very slippery, and often relies on definitions involving “qualia”, which is also quite slippery.
There is the notion of panpsychism: that consciousness defined in the basic sense is extant everywhere, all the time, in many varied forms and scopes. By the definitions above, awareness would be restricted to those systems which could reasonably be considered "cognitive", and sentience would belong only to those who can conceptualize "cogito ergo sum".
This is the correct answer. But left unsaid is the fact that we don't have any objective definition of consciousness yet. I posit that until we do, artificial consciousness is, by definition, impossible.
The definition I use is very similar. I like to think of consciousness or sentience as the ability to model (or predict) one's self.
A pattern-matching entity that can match its own patterns.
I might disagree about the intelligence aspect. It seems to me any entity that is at least moderately interesting (has any kind of behavioral complexity) would need some level of intelligence in order to model itself to a respectable degree.
Can't agree with you there. These terms are not exactly super solidly defined, but most uses of the term "consciousness" I've seen usually refer to something more advanced than basic signal processing. To quote Wikipedia:
> It has been defined as: sentience, awareness, subjectivity, the ability to experience or to feel, wakefulness, having a sense of selfhood, and the executive control system of the mind.
It's a sufficiently vague term that I can't use it to describe what I mean, because it will most likely be misinterpreted. It also conflicts a lot with modern views on what is or isn't aware enough for its experiences to count, e.g., the Turing test and the ongoing concern over whether a given animal is intelligent enough.
Consciousness is a word searching for a meaning. It's telling that this entire thread is people explaining their own personal definition of this reification that we use to separate ourselves from the animals.
Consciousness is the state of being awake, hence able to react normally to sensory stimulation. Computers are nothing but consciousness, unless they're freeing memory.
Look, nobody knows what makes a thing conscious or not, and anyone who claims they do with certainty is talking out of turn. Is this guy being a bit silly? Yeah, I think so. But let's not pretend that anyone on earth has suddenly answered a question that humanity has been thinking about without significant progress for thousands of years.
I think sentience/consciousness can currently only be "known" by the entity in question. I only know that I am conscious. I infer consciousness in other entities based on their similarity to me.
Right -- consciousness is whatever it is that is aware that it is self-aware. In fact, anytime you pinpoint consciousness as any particular awareness, you beg the question. How can you be aware of your awareness?
I believe the broadest consensus defines consciousness as having a subjective experience.
What's unclear, and some deem impossible, is how we can externally detect consciousness. A machine could act and respond in a manner indistinguishable from humans but still be unconscious.
I suppose there is a concept of sentience from the outside and a different concept of internal sentience. The movie "Johnny Got His Gun" by Dalton Trumbo depicts a situation where a badly injured WW1 soldier is considered brain dead by outsiders while he's fully conscious and sentient internally.
I haven't studied neuroscience so I don't know how you define consciousness. I have read Julian Jaynes's "The Origin of Consciousness..." which in my untrained opinion makes a compelling case that consciousness is a hard term to define.
I don't understand. Is consciousness a feeling or an external manifestation? Is it something purely internal, or something also visible on the outside? If it's only internal, then you cannot, by definition, say whether a machine is conscious; but the same goes for every other human. If tomorrow we discovered an alien civilization and started meaningful communication with it, would you question their consciousness?
Personally, I don't think that all the discussion about consciousness is meaningful. Consciousness could be the human feeling of abstract reasoning, like cold is the human feeling of registering a lower temperature. A machine could reproduce abstract reasoning, and asking about its consciousness would be like asking if a thermostat feels cold.
Yes, this complete lack of agreement on terms is possibly the hardest problem in discussing it ;)
I tend to use consciousness as the same as awareness, really: being awake and experiencing. This is where qualia are found, and is something I readily ascribe to most if not all animals, and even possibly other living things.
I find the term self-consciousness, meaning metacognition, to be problematic, as many animals are aware of themselves as beings (e.g. they recognise that a reflection is them). I don't think they can reflect on that much, though.
We support much higher levels of abstraction in thinking than other animals, and we can reflect on our own thoughts, including our self in that model. These are cognitive skills rather than being about awareness.
I would say that an artificial intelligence can possess deep cognitive skills but have no awareness at all. Although it could have that too - I just don't see that as a requirement.