
You have to Google and read those thought experiments to see why. You may not be convinced (I'm not), but they give good reasons. We have mechanistic explanations for all of those organs, and even if we lack some explanation, we know one is possible in principle. They argue this isn't the case for consciousness.



Agreed.

The only proven case of consciousness we have (i.e., consciousness arising from Big Bang gases and getting to this conversation) incorporated sense-motor-environment-sense loops. I don't know if it can happen without such interactions, but the one route we know worked used them.


Before you dismiss this as another kooky consciousness theory, please let me add that this is the only mechanistically precise one out there.

It doesn't seem that unrealistic to me. The staggering number of elements making up the system we call a brain is likely to lead to some truly unintuitive solutions.


These kinds of experiments always make me wonder why we experience consciousness, and what the evolutionary advantage of having it at all might be.

I'm sorry, you lost me.

Is your point that we can only demonstrate consciousness through pharmaceuticals or surgery?


I don’t think it would be verifiable, but the default hypothesis is that we don’t assume some as-yet-unseen way for something to interact with our material bodies.

Also, I don’t think it is unreasonable to imagine “consciousness” happening as an emergent feature of sufficient complexity. Remember, we have roughly 86 billion neurons with orders of magnitude more connections between them.


Quite an interesting theory. For a moment I was inclined to believe that consciousness can be computed, but I guess I'd have to agree with most of your arguments.

It seems like what should follow from this is rejecting the idea that the brain should be under conscious control. Brains just don't work that way. Why not find a way to make use of that?

That is speculation.

We have absolutely no idea how consciousness is produced from electrical and chemical signalling within a large ball of fat.

We know that it is (because we perceive our own), but we can't map from one thing to another.

If consciousness is merely neural activity in a brain, then why can't we simulate really simple brains?
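To give a sense of what "simulating really simple brains" even means at the lowest level, here is a minimal leaky integrate-and-fire neuron sketch. All parameters are illustrative assumptions, and real biological neurons are vastly richer than this; the point is only that single-neuron dynamics are easy to simulate, while the thread's question is why that doesn't scale to the whole phenomenon.

```python
def simulate_lif(input_current, dt=1.0, tau=10.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron; returns the time steps that spiked."""
    v = v_rest
    spikes = []
    for t, i in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating input current.
        v += (dt / tau) * (v_rest - v) + i * dt
        if v >= v_thresh:  # threshold crossing: emit a spike and reset
            spikes.append(t)
            v = v_reset
    return spikes

# A constant drive above the leak makes the neuron fire regularly;
# zero drive never fires.
regular = simulate_lif([0.15] * 200)
silent = simulate_lif([0.0] * 200)
```

With the constant drive the potential climbs toward its fixed point, crosses threshold, resets, and repeats, producing a regular spike train; with zero drive it stays at rest forever.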


Very nice reasoning. It still doesn't explain how a paramecium, without neurons (so no neuron aggregate there), can experience (a sub-human) consciousness.

Ref: https://www.youtube.com/watch?v=L5OMuFMFUIU


These are all good questions, and answering them would probably let our realised thought experiment give more insight into which aspects are necessary for consciousness and which are just biological baggage.

One possibility about the machinery of consciousness that would help address the Hard Problem is that the brain is not creating consciousness from scratch, but tapping into some currently misunderstood or unknown physical phenomenon. It's hard to see how information processing alone (which can be done by monks with paper and pencil, if incredibly slowly) can give rise to subjective experience.

It seems that this phenomenon, whatever it is, plays a central role in sensory perception, and there's reason to think that it's present even in animals with simple brains. So I suspect that we're looking for some kind of simple operation that can happen on the scale of a small number of neurons, maybe even a single neuron.

This is all speculation of course, informed by some knowledge and intuition, but speculation nonetheless. But it's the only way to push the frontier, and the unwillingness to engage with consciousness as a matter of serious study seems to be a major failing of brain science and AI.


  I still fail to see why having a single neuron that
  circumnavigates the brain should lead to consciousness.
Me too, but what an experiment it could lead to, and the questions it would answer.

Reasons it makes sense include the singular perception of self. The sense of being in one place and perceiving one experience no longer seems mysterious. If there's one container and one self, then the idea that the core of our living self ultimately resolves down to a single-celled animal, depending on a collective of subordinate cellular tissues, wraps up so many other details in an easily understood package.

Of course it raises the question of what mechanism drives such king-making. Is the sheer size of the neuron a factor? Is status determined as a zygote? Does status last from cradle to grave? If there's one master neuron at a time, does the master neuron's identity get exchanged or relayed to other neurons throughout life? How would hand-off work? What if there are conflicts? I'm sure there's no simple answer to any of these questions, and the reality of such details will remain as complex as ever, if this is even the right track to follow...

But if it were true, maybe it opens the door to mastering one's entity of self, such that a person, or any similar organism might be capable of existing temporarily in a petri dish, lunching on agar, while the body goes in for a tune-up.


No, it isn’t at all counterintuitive. E.g. my brain is a system of neurons and that system experiences consciousness. That is perfectly intuitive.

I think they're arguing that consciousness is an emergent property of a sufficiently large network, rather than being located in a particular cerebral structure.
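The idea that higher-level behavior can emerge from simple local interactions is easy to demonstrate outside neuroscience. A classic (and admittedly loose) analogy is Conway's Game of Life, where a "glider" that travels across the grid emerges from purely local cell rules, with no individual cell knowing anything about the glider. This is only an illustration of emergence in general, not a claim about consciousness.

```python
from collections import Counter

def step(live_cells):
    """Advance one Game of Life generation; live_cells is a set of (x, y)."""
    # Count, for every cell adjacent to a live cell, how many live neighbors it has.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
after = glider
for _ in range(4):
    after = step(after)
# After four generations the glider reappears translated by (1, 1).
```

The rules mention only a cell and its eight neighbors, yet the pattern as a whole "moves" diagonally, which is the sense of emergence the comment is pointing at.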

I resist the idea that consciousness can be preserved like this, or indeed in any way. I think the litmus test for understanding consciousness is making something that is conscious. I don't think anyone has done this.

Why are these scientists so desperate for a Magical Quantum Solution to consciousness? Shouldn't they first disprove the much more likely hypothesis that consciousness is an emergent property of the (macro level) structure and operation of the brain?

I've read about something similar called split-brain, where there can be separate consciousnesses in the brain, and maybe elsewhere. Interesting experiments have been done on it, and it all seems so weird. It's very tricky figuring out this consciousness thing from the inside.

There's a hint of "god of the gaps" about that argument.

I don't think we know enough about consciousness to even speculate usefully about it.

See e.g.

http://www.telegraph.co.uk/news/science/science-news/1114444...

And I think uploading has become a bit of a cliche now. If brain copying were possible, then brain merging and direct experience/memory sharing would likely be possible too.

What if instead of worrying about uploading brains for individual survival, we started turning into a connected colony organism? We already are in physical ways, but our consciousness is lagging behind.


Thought experiment 1

We all start as a single cell. What biological event starts consciousness? Keep in mind that you can't be half conscious; being half conscious is still being conscious.

Thought experiment 2

Look at this random symbol: #. What are you conscious of? Chances are, you're conscious of things, of objects, of elements, of concepts: things that have some unity to them. But things are mental constructs, typically socially shared. So much of the content of consciousness arrived after birth, once we could learn these mental constructs.

Thought experiment 3

What is more complex: a single neuron (with thousands of mitochondria and other organelles, and thousands of synapses) or a worm with only a few hundred neurons? Similarly, what is more complex: your own body or the company you work for?

