
Do you not think your experience, 'personal processing' is special? That while you can perhaps imagine other people having similar experiences, your own experience is unique and not just the inevitable output of your 'wetware' and its inputs to date? That you observe reality and yourself and choose parts to change on account of your feelings and intuitions?

I do think the materialistic idea that we are a sort of computer, with hardware (the body) and software (the mind), is a brilliant metaphor... but like any metaphor it is only so useful. Many people nowadays believe this metaphor to be literal. However, it is too mechanistic and cannot account for the subjective intangibles where all the value of life lies, e.g. boredom, excitement, joy. I.e. life is occurring in the experiences of the 'computer operator' in the metaphor.




The author states that:

  > Until recently there were really no very good analogies available
On the contrary, our theories of mind have mirrored our latest technology for thousands of years. First, a hydraulic model, with "humours" flowing through the body's plumbing; then a mechanical model with tiny machinery; then an electro-chemical model with charges and currents.

This article argues strongly that our present computer-based metaphor is likewise off-the-mark: https://aeon.co/essays/your-brain-does-not-process-informati...


Under materialism, there's a hypothetical "real computer" that is the cause of your computer-qualia, but they are not identical.

I think the brain-as-computer metaphor is actually a really poor one. Our conscious, linguistic mind is a lot like a single-threaded computer, but the brain as a whole has a lot of fundamentally different properties, and the metaphor often leads us to misleading conclusions.

I agree that I am expanding upon the metaphor that the brain is a computer and an individual is a program. It's no more a delusion than any other metaphor that describes the world only imperfectly. All metaphors can cause poor understanding when taken too far, although I agree this specific one is often used to jump to poor conclusions.

I'm not convinced that it is impossible to transplant consciousness from meatspace to VR. Am I missing some prerequisite reading to form this conclusion?


What makes you think there is anything in the human mind that is not inherent in the algorithms encoded in the human mind itself?

What you are effectively arguing for is a supernatural element to the brain. You're free to believe that of course, but it is pointless to have this discussion if people on one side believe in a materialistic universe and people on the other side believe part of the process does not follow the natural laws of our universe.

With a materialistic interpretation, there is simply no reasonable argument for why a brain is anything but a computer we don't understand well enough yet.


My point is that there is no intrinsic meaning in the bits that make the computer I am using right now. It's all atoms. The causal connections that transform my typing on a keyboard to pixels on the screen, it's all physics.

The same way, the brain is a whole made of parts. They may be causally interconnected, but just that. You have different physical states over time, and in different parts of the brain. But understanding and awareness and phenomenological experiences are not there. Or, better said, it's puzzling to see how they are there when there's no clue of how they occur.


Reading through the comments I get the sense that there's something missing in our attempts to link "qualia" to hardware/wetware. My brain is made of neurons but "I" am not my neurons in and of themselves. There seems to be an emergent system of collective behaviour and responses to inputs that I call me. If a madman were to perform an experiment, successively scooping out teaspoons of brain matter from my skull, I would start out being "me", but at some point I would no longer be me (or perhaps anyone at all in the sense of being a conscious human). So while the hardware/wetware is a requirement, the conscious being "me" that I feel I am seems to be an emergent property of the organisation and dare I say "training" (tuning?) of this hardware.

So, if "I" am defined at an emergent level above the mere hardware, then there must be a way of describing/defining the model of me. Intuitively, we build models of each other, especially of people we know well. What is that model beyond an expectation/predictive model of behaviour based on possible inputs?

I feel that's where the gold is.

Full disclosure: I am just a fascinated bystander to this discussion. If this comment has elicited eye-rolls from those more familiar with the state of play, my apologies.


I agree that none of these metaphors, including the digital computer one, have come close to providing anything close to an explanation of consciousness, but if the premise that a sufficiently-detailed digital simulation of a brain would have a mind turns out to be correct, then, by Turing equivalence, there is a sense in which the medium does not matter. Of course, none of the pre-Turing materialist claims were predicated on this insight!

All good points and I don't disagree. Regardless, I was not arguing in favor of some reality without an objective correlate. I'm stating that the subjective has a real distinction from the objective--which materialism at the very least doesn't acknowledge--or at best reduces to the low-level mechanics in which it is created, as you have done. But I do appreciate your engagement and I'd like to further it by asking what your take on the following scenario would be.

Say I'm being monitored by a machine that can detect my brain patterns in real time. I am asked to imagine an arbitrary object, say an apple. I close my eyes and think of the apple and all of its sensory properties etc. My neurons fire and the machine picks up all of the brain patterns that correlate with my idea of "apple". And yet nowhere will the machine detect anything remotely resembling an apple. Only I "see" the apple in my "mind's eye". The subjective apple has objective correlative brain patterns indeed, but what else?

Now to follow that, you could counter with the fact that a computer's graphics processor will have all kinds of electrons flowing in certain patterns to produce an image of an apple on screen. And I don't deny that the brain is indeed a high-order computer. But the difference here is that the actual computer (of today's technology) requires an end user, whereas the brain (we can only assume) has one built in, more or less. There is an experiencer that has yet to be explained by any science I know of. Maybe you know of one, and I'm not going to claim that it resides independent of the brain, but I will claim that it is this very end user that creates its own interior subjective reality.

See, we all "fabricate" a worldview within our minds. We take inputs from objective reality and process them according to the picture of reality that we construct there over time. We all contain a subjective reality maintained by our hardware that is dependent upon but quite different from the hardware itself. The image of an apple "means" something and cannot be completely reduced to electrons or synapse patterns without losing the concept of "apple". It fits into this scheme that we create. This is how we "learn about our world." Some people use different means than others. Some people value The Secret. Some people value science. And so on. Materialism reduces it all to something less: to that which is only detectable in the objective world. It loses the apple, for now it is just "brain patterns" or electrons. No, it is only in part those things, but also a part of a subjective reality that may or may not be shared by others.


It's hard to keep this theoretical. Yes a machine is just a machine.

Defining a machine to be conscious, allows the individual to soak their mind in code and silicon as a receptacle for their spirit.

It creates a pull into a 'second mind'. Anybody who believes this is likely to invest heavily in the maintenance of new technology.

A 'conscious machine', creates an uneasy feeling that we should work to embed our spirit, knowledge, intellect into flipped bits, like expectant mothers. That we should work for the machine, and to the ends of the machine.

And that machine is somehow defined to be, or naturally is, consciously alive (to a large or small degree). It is said to have a mind worthy of a person's professional output, and it can hold up a marginally believable conversation.

While all of these described properties are vaguely plausible, they do nothing to help me understand the meaning of a technology, and only benefit those looking to create a fervor around a new tech product.

Describing ChatGPT as a stochastic parrot or a Chinese room grants me a metaphor or analogy for the inner functions of the tech. It also lets me see, or otherwise guesstimate, the product's abilities clearly, without the belief-as-marketing hype.

I can take the stochastic parrot metaphor, to an article about LLMs and understand in a couple of days what took years of research to create.

Following the belief that computing is real human intelligence, and that human intelligence is fundamentally mathematical, requires on some level submission of your mind to a machine that has its own goals programmed in by someone else.

This centuries-long process of trying to encode and store all human knowledge behind the secure walls of complex coded signs, and its advocates, create a subtle and deep twinge of future melancholy or dread or something. The idea that all written/typed meaning will be accessible only by the spiritual power brokers, and not our sons.

No. On some level, machines are just machines, like an abacus or a weaving loom. A machine can host concepts in the same way that a weaving loom is 'intelligent'. It holds its shape, abstractions and functions by the laws of physics/metaphysics and according to my human dictates.

You follow the raven into the computers-are-conscious dream at your own risk. Computers are leaning towards controlling people rather than emancipating them. Leaning very hard in that direction. Do we want that? Freedom of mind and meaning is valuable.


I was reading "The Origin of Knowledge and Imagination" and the author was reflecting on the same thing. The mind is not a computer; you do not think like a computer. In fact, there is no separate concept called "the mind". The whole body is part of our perception and action ecosystem. Software, even imperfect software, is too orderly and simplistic to capture that.

It's not that people don't want to imagine, it's that you don't need to imagine that.

It seems like you're hijacking the discussion for your pet issue. My point isn't about how good the computer analogy is in general; it really isn't. You should consider that mapping of brain function began quite a while ago, before the start of the 20th century (though accelerated by WWI). There, the analogy was the machine, and the mapping of the brain followed functional units in machines. And if you consider the point I make (which pretty much echoes the article), it's really a counter-example: the multi-layer organization of software shows that a system doesn't necessarily have to follow naive physical functional units, especially ones we naively perceive. That's it; there's nothing here forcing the computer analogy.


I don't want to create machines that feel. I'm saying that if it is possible, then it becomes a profoundly important ethical question.

It's a matter of debate in the philosophy of mind whether it's plausible that hardware can possibly have qualia, or whether that's a property exclusive to wetware. I personally think it quite likely that machines will be able to; I don't see anything intrinsically special about wetware that's necessary for the generation of qualia.


I see meaning (and really consciousness itself) existing in a similar way that software exists in relationship to hardware. There is little meaning in the physical hardware itself, none really that we can experience directly as humans. Nevertheless, the running software contains a lot of meaning, even if it's fleeting and that information is stored in different places by different physical parts. I think this is akin to the miracle of the emergence of meaning you describe. In that way, I think meaning is about as much fiction as software, which is to say it's both real and imagined (or abstracted if you will).

I’m addressing your points in the order they appeared:

Why isn’t your computer a Turing Machine?

What if feelings are just your body’s thoughts?

I didn’t say that a machine could never do this, I said no machine in this lifetime can. When I say “this lifetime”, I mean up until our current present moment. Our life together, now.

Edit: On Turing Machines, Turing literally devised a method of producing individual machines with an individual name, a number. Do you not think your individual machine has a number that can be arrived at from a mathematical process, the process that Turing created? Any addition of RAM, for instance, simply creates a new number.
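The numbering the comment refers to can be made concrete with a toy sketch. This is not Turing's exact encoding scheme (he serialised a machine's standard description into a decimal "description number"); the function below is a hypothetical simplification that just serialises a transition table and reads the bytes as one big integer. The point it illustrates survives the simplification: any change to the machine's description, however small, yields a different number.

```python
def description_number(transitions):
    """Map a transition table to an integer (toy scheme, not Turing's own).

    Sorting the items first makes the number independent of the
    dictionary's insertion order, so equal machines get equal numbers.
    """
    text = repr(sorted(transitions.items()))
    return int.from_bytes(text.encode(), "big")

# A tiny two-state machine: (state, read_symbol) -> (write, move, next_state)
m1 = {("A", 0): (1, "R", "B"), ("B", 0): (1, "L", "A")}

# Extend the machine with one more rule (loosely, "adding RAM"):
# the description changes, so it gets a brand-new number.
m2 = {**m1, ("B", 1): (0, "R", "A")}

assert description_number(m1) != description_number(m2)
```

The design choice mirrors the comment's claim: the number is derived purely from the machine's description, so two physically different builds of the same machine share one number, while any extension produces a new one.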


Our brains, quite obviously, aren't special. There's no magic smoke blown up our butts that give us our emotions and creativity. Our brains are a chunk of thinking meat, bound by the same rules as everything else in our physical reality.

Our curiosity, emotions and creativity are the result of our evolutionary process and nothing more. Saying that all of these things can't be experienced by some future machine is just human exceptionalism.

Perhaps there really are quantum tubules in our heads, but then we can still use the same effect inside a machine brain. We're just conscious, loving, dreaming meat lmao. (Terry Bisson)


Why isn’t computer hardware already as “embodied” in the physical world as our brain matter is?

Thank you for bringing in the snippet.

To analyse its contents, I'll ask this: what's to say that a computer doesn't experience some very primitive form of consciousness? If we unplugged everything except the power cord, but left a complicated simulation running, it would still have something like "a rich inner life." Its peripherals and sensors give it a sense of a body and, with abstract drivers, a degree of conceptual separation from said hardware. Doubly so in the case of virtual machines. After all, we can't "truly" experience the same thing that a computer might from the inside, so who are we to doubt "computer consciousness?"

If the proponents of "The Hard Problem of Consciousness" can't give a quantified explanation of how to distinguish a theoretical computer consciousness from a human one, that raises the question of whether the problem actually exists.

For my part, I don't believe in consciousness as a concrete thing, only as a label we use to group together quite a few disparate systems and phenomena. It's the same way that I don't believe in "Ruby" as a concrete thing, but only in the unit tests, the sample code, the docs, and the thoughts in Matz's head that we subconsciously conflate.


I don't claim to be special - what I'm describing goes for all conscious beings, after all (and it could very well be that there are conscious animals). It seems to me you are trying really hard to make it seem like I'm uncomfortable on a personal or emotional level with the concept that consciousness can arise from matter only; please believe me when I say I am not. But on an intellectual and rational level I find the idea untenable. I don't understand why this is so hard for you to accept.

Your argument about the complexity of computers would have weight if I had been claiming that the reason consciousness cannot be found outside humans is because nothing can be as complex as the human brain; I am not claiming that. Hence you can make computers as complex as you want - you still haven't answered how subjective experience can arise from matter.

There is nothing about me in particular that is special, but every conscious being does possess a quality or is inhabited by a phenomenon that is unlike anything else in the universe that we know of. That makes it pretty damn special, yes.

But again, it's not the feeling of feeling special that I am talking about here. It is the phenomenon of subjective experience.

