Quoting my previous comment:
"I think it also useful to semantically separate cognitive consciousness (i.e. Knowing and expressing the existence of your thought processes through arguably higher, more abstract thought processes - which might even go on recursively - knowing that I know that i am conscious, etc) from the externally unmeasurable 'conscious experience', (i.e qualia, the awareness of sensory or thoughts at the most essential level, 'seeing' what one sees, etc).
One could imagine a living being with one but not the other, for example qualia without cognitive consciousness (if I had to guess, I would imagine this experience to be similar to being drugged to the point of having no internal monologue, no complex thought process, but keeping your sensations and vision, etc, or being a barely-sentient animal in purely instictual mode of thought and action)
The opposite, cognitive consciousness without qualia - a.k.a the philosophical zombie - or a computer which can argue the existence of it's thoughts without feeling them is, I gather, a more controversial state of being.
What I find interesting is that in separating the two 'consciousness's, the former ends up taking almost all of the importance and the latter none - anything which can be externally measured ends up in the first category (which is a computable logic process), which leaves very little of utilitarian/evolutionary/algorithmic importance in the second. However, in much discourse about consciousness the latter takes a disproportionate role (i.e fear of losing your unmeasurable consciousness when teleporting, etc, though the cognitive consciousness, being by definition a logical and measurable process is theoretically preserved)"
In this case, although the article, to get quick views, appeals to our love for the mystical attributes of the 'conscious experience phenomenon', the paper actually studies the (in my opinion much more) interesting empirical characterisation of the level of cognitive consciousness in insect 'brains' - i.e., how elaborately their software models its own processes.