> This is not true. Computing is a system where the physical hardware (the computer) is following rules that are explicitly encoded in itself - you can analyze the system and discover where and how the program it is following is encoded (as we did with DNA).
1) No. The rules are just physics. Electric signals going through physical material.
2) Why is this even relevant? I can say the same about water pipes in a sewage system. The way the water moves through the pipes is encoded in their design and connections. You can inspect it to figure out how it behaves.
I've yet to see a model of consciousness that does not imply that the sewage system is conscious.
> Consider a big supercomputer doing finite-element analysis. Or a big network of packet routers.
There's another perspective here. We always argue from a very anthropocentric point of view and measure experience by the actions and reactions a human would perform: asking questions, seeing something and saying something based on what we see, etc. The supercomputer you described can hardly be considered conscious with regard to such actions. However, it might be very conscious with regard to a different set of actions: turning off some switches that allow intercommunication between the different CPUs (it might react by rerouting packets), raising the room temperature (it might react by spinning up the fans), removing some RAM (it might react by allocating more data on the hard disk), and so on.
> In conclusion I want to shatter our presumed consensus: computers do not compute. Not even computers. We compute. Computers just help us along.
Define "us". Define "compute". Seeing as the former is very fuzzy, and the latter is suggested by the article to be undefined, trying to use this to "prove" that brains are not computers is, well, nonsense.
Consciousness is a side-effect of electrochemical interactions. Nothing more, nothing less. Trying to believe otherwise - that consciousness is some "special snowflake" that can somehow exist independently of the machine which creates it - is about as foolish as believing a magical sky-wizard sculpted mankind from clay. Whether this counts as "computation" depends on how "compute" is defined.
> It leads me to think there's no possible way we're not living in some type of computer simulation.
I find this line of reasoning inadequate. In the 19th century, would you not be compelled to believe that the universe was made of mechanical pulleys, levers, and pipes because the universe operated with the perfection of a well-designed machine?
OK, so there are some phenomena that correlate with (bear a resemblance to) how computers behave. Well, there are phenomena that correlate with a great many things; we shouldn't be surprised that computers and information theory are among them.
> Let us say that we have a computer system so complicated that it is capable of mimicking a human being with perfect accuracy. You would probably say that that computer is not a being that has consciousness, because it is merely doing calculations to replicate behavior.
Actually, I would seriously consider the proposition that it is intelligent, especially if it wasn't mimicking one specific person, and would be inclined to take Occam's razor to the proposition that it is actually faking it in ways that cannot be explained.
> No one would ask whether the water-channel system 'groks' things
I would. It's very common to describe the flow of electricity as similar to the flow of water. If it's electricity in my brain that allows me to understand, why couldn't there be an analogous system involving water which also understands?
Any substrate which supports the necessary computational properties ought to be sufficient.
> I don't think you can say that life is built on computational processes unless you use a definition of "computation" that is so vague and all-encompassing that it becomes effectively meaningless.
I mean, there's nothing physically stopping us from simulating a brain, right? It's a finite object with a finite amount of physical ingredients, and therefore with a finite amount of computing power we can simulate what it does. To me personally, that's a computational process. Maybe that's an overly broad definition of computation, but I think these debates tend to be about whether there is something fundamentally different about "life" (by which I assume you include consciousness). But maybe that's not what you're saying.
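To make "simulating a brain with finite computing power" concrete at the smallest possible scale, here is a toy sketch of simulating one biological component: a leaky integrate-and-fire neuron. All constants and the model itself are illustrative simplifications, not a claim about how a real brain simulation would be built.

```python
# Minimal leaky integrate-and-fire neuron: a toy illustration of
# simulating a biological component with finite arithmetic.
# All constants are illustrative, not physiologically calibrated.

def simulate_lif(input_current, v_rest=-70.0, v_thresh=-55.0,
                 v_reset=-75.0, tau=10.0, dt=1.0):
    """Return spike times (in steps) for a list of input currents."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Euler step: membrane voltage leaks toward rest, driven by input.
        v += dt * ((v_rest - v) / tau + i_in)
        if v >= v_thresh:      # threshold crossed: emit a spike, reset
            spikes.append(t)
            v = v_reset
    return spikes

print(simulate_lif([2.0] * 50))   # constant drive for 50 steps
```

The point isn't fidelity: it's that the neuron's behavior, once described by equations, becomes an ordinary finite computation.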
> He points out that "if we are to suppose that the brain is a digital computer, we are still faced with the question 'And who is the user?'"
What does that question even mean? I think it seems deep because we humans have a tendency to ascribe some sort of supernatural aura to our lived experience. Life is something incredible but that (at least to my knowledge) is not uncomputable...
> There is no computational process that will produce the sensations of colour or sound or touch.
Got one: the brain!
> At best you will have some representation that requires having actually experienced those sensations to understand it.
> I don't know, but it doesn't matter, just like it doesn't matter if the code on your computer is on a HDD or SSD.
But it does matter! You arbitrarily dismiss the brain as the "wrong level of abstraction", then bring up the computer, whose internals we know and understand, as an example of a parallel system.
If both are the same, where's the hard drive in the brain? Which part of the brain is the CPU dedicated to consciousness? We know regions of the brain are specialized in function, as in computers.
> But nevertheless, until people realize that this is what they should attempt, or research how it can be done, I feel we won't make any progress on the nature of consciousness.
This reminds me of the famous Andy Grove fallacy. To paraphrase:
Engineers tend to apply the knowledge and concepts of their field to biology because they think engineered and organic systems are the same. In doing so they miss the elementary difference: engineering is about creating and developing new systems, while biology is about researching and understanding existing systems we have no prior knowledge of.
> Then, later, I define an incredibly complex abstract computer, such that the sand falling out of the bucket and shifting along the ground exactly represents a self-aware AI within my defined computer going through its states.
Is it possible though? There is likely no equivalence between all arbitrary dynamical systems, including those with very high entropy such as a bucket of sand, and systems capable of performing what we call computation.
Otherwise building a computer would be much easier than it currently is.
> In reality there are multiple systems that work together over multiple timescales to produce the behaviors we observe. Some of those systems can have their contributions mimicked by other interventions. Because of this complexity you can never say 'it's really about X', the best you can say is 'X plays a major role' or 'X contributes Y percent to this observed phenomenon'.
You can say the same thing about computer systems - as long as you don't understand the underlying logic. If you don't understand that the chemistry of transistors doesn't matter as much as the C code, you can say exactly the same critique about how a thinkpad works: "So while applying an external electrical voltage can act in a similar manner as causing a neuron to fire, it is far less precise than the calcium and sodium channel mediated depolarization which implements normal firing. Said another way 'bioelectricity' is not simple....In reality there are multiple systems that work together over multiple timescales to produce the behaviors we observe. Some of those systems can have their contributions mimicked by other interventions."
Once you do understand the logic - the 'why' of von Neumann machines and JavaScript and transistors - it's clear that your claim isn't true and there is an underlying logic. The trouble is, until we positively identify that logic, we can't know whether it exists, and we're stuck debating the bio-equivalent of the fundamental computational significance of a CPU's clock speed.
> I don't have much justification, but I believe it's false. There is a lot more to the universe than computers. Computers cannot simulate even elementary physical systems accurately.
Correct me if I'm wrong, but wouldn't a system that computes a non-recursive function be some sort of box where, given the same inputs, you invariably obtain the same outputs; however there is no way of finding any mathematical or algorithmic expression that could, in any finite amount of time, predict the same result? To find yourself in that situation, you need a physical process that is intrinsically non-computable; being unable to compute the sum of computable processes is not enough.
> "Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.
Humans, on the other hand, do not – never did, never will."
In addition to neuroscience, going deeper to the atomic level and beyond, humans really do operate on symbolic representations of the world: representations that are stored as states of physical elements. And the algorithm that guides everything we do, would be the laws of physics.
> Brain doesn’t perform computations in any CS accepted sense (neither it is a turing machine nor it has any encoded program to execute any defined algorithm).
That's a big statement. Anything that can be effectively described with math, can be described using Turing machines and algorithms. Any Turing-complete system is equivalent. As far as we know, all of physics can be described by equations, therefore is (theoretically) computable. What makes you think brains are special? Are there any other physical systems that you think are uncomputable?
The only arguments I've found for why the brain can't be described by algorithms fall back on pseudo-scientific claims about the magic of quantum mechanics, which I find very unconvincing (quantum algorithms are still algorithms, and describable through math, after all). Do you have a better one?
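The equivalence claim above is very concrete: any rule-following system you can specify this precisely can be written down as a Turing machine and run by a general simulator. A toy sketch (the machine and its encoding are chosen arbitrarily for illustration):

```python
# Toy Turing machine simulator. The machine below flips every bit on
# the tape and halts at the first blank -- trivial, but the simulator
# itself runs any machine given as a transition table.

def run_tm(transitions, tape, state="start", blank="_", max_steps=10_000):
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        # transitions: (state, symbol) -> (new_state, write, move)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += {"L": -1, "R": 1}[move]
    return "".join(cells[i] for i in sorted(cells) if cells[i] != blank)

flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_tm(flip, "0110"))  # -> "1001"
```

The simulator is ten-odd lines; all the substance lives in the transition table, which is why "can it be expressed as a Turing machine?" reduces to "can it be described by precise rules at all?"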
> Would you call an ants colony a computer? A tree? A government? But they all obviously processes information and seemingly perform computations, don’t they?
Some of your other examples are more complicated, but all of them can be described through the computational lens and modeled as Turing machines. You will find many scientific papers filled with equations trying to describe the algorithms behind each of them.
> We as humans use numbers and algorithms (which are really figures of speech, not things), to explain physical reality, not the other way around.
The numbers and algorithms here aren’t what anybody is saying could be sentient. Those numbers and algorithms are representations used to coerce physical objects into doing computation. Processors don’t care about the algorithms you see on your computer screen. So then the questions are “can computation be sentient”, or “are sentient things sentient by way of computation”, and I think that’s much, much harder to answer.
> The patterns we coax out of the machine are only meaningful to us, as humans, after we have interpreted the data and turned it into information.
I disagree. The number of processes that do something computation-like out of the space of all processes is vanishingly small. That we can input some informative sequence of bits into a computer, and the computer transforms that sequence into output which is then a different informative sequence of bits, tells us that computers are intrinsically information processors, and that information is not in general subjective.
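One way to see that the processing is not merely subjective: the transformation a machine applies is a fixed, invertible function of bit patterns, and that invertibility (no information lost) holds regardless of what meaning we attach to the bits. A small sketch, with an arbitrary XOR transform standing in for any reversible operation:

```python
# The transformation a machine applies is a fixed function of bit
# patterns; its properties hold regardless of what the bits "mean".
# The same XOR transform acts identically on bytes we might read as
# text and bytes we might read as a number.

def transform(data: bytes, key: int = 0x5A) -> bytes:
    return bytes(b ^ key for b in data)

text = "hi".encode()
number = (1234).to_bytes(2, "big")

# Applying the transform twice recovers the input exactly, so no
# information is lost -- whichever interpretation we choose.
assert transform(transform(text)) == text
assert transform(transform(number)) == number
```

Interpretation is ours, but the preservation of information through the transform is a property of the machine, not of the interpreter.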
> How is it that we can so easily describe some things about our brain states, that we'd never realize by physically looking at what brains do and how they behave.
As a software developer, this should be simple. Programs have their own state that you can understand without knowing the exact formulations of electrons in the processor that ultimately make up that state.
X = 5, sure, but good luck figuring that out by looking at the motherboard of a running computer.
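The levels-of-abstraction point can be made in two lines of code. The same fact, "X is 5", is trivially readable at the software level and already opaque one level down (the address and raw encoding shown here are CPython details, used only as illustration):

```python
# "X = 5" is a fact about program state, fully visible at the
# software level of abstraction.
X = 5
print(X)                         # the semantic state: trivially readable

# The same fact, one level down, is just a memory address and a raw
# byte pattern -- already far less legible, and we are still many
# layers above transistors on a motherboard.
print(id(X))                     # CPython: the object's memory address
print(X.to_bytes(4, "little"))   # one possible raw encoding of the value
```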
> So you're claiming that organisms such as ourselves cannot be run on a Turing machine? Any particular reason/evidence?
This is just my thinking: a Turing machine, or equivalently a digital circuit, doesn't want anything. It's an artificial construct, effectively a simulation, that can exist as an abstract mathematical idea.
If there is a real world where real stuff happens, real things have their own goals - generally increasing entropy or minimizing energy. Although it seems like we should just be able to capture this mathematically, in doing so we remove the actual desire... This sounds a bit nutty, maybe, but what else is the difference between reality and simulation? An abstract calculation - anything equivalent to a Turing machine - has no skin in the game. Something that happens in the universe does - and this could include consciousness, as a property of matter, which could for example be effectively the sum of the compulsions of the composing matter to minimize its energy (you can look this up; there are theories of consciousness that posit it's a property of matter).
If there is no difference between reality and a simulation, I'm wrong: we can all be represented on a universal computer, and so as abstract math, and in some sense existence is meaningless - it just follows from math. My experience doesn't support this, but presumably I would have evolved a blind spot if my existence actually were meaningless.
TL;DR: I think there is some undiscovered difference between reality and simulation, which I would guess relates to desire/consciousness, and which means we can't simulate consciousness or real intelligence.
I'm a scientist; I know the above doesn't withstand any scrutiny. I'm just trying to share my speculation because you asked.
> They're effectively just mathematical functions (albeit extremely complicated ones), which simply take inputs and return outputs without any intervening subjective experience.
So are human brains, which are subject to the laws of physics, and which work just as mechanistically as any computer.
Unless you hold a dualist view that the brain accesses a spiritual realm outside of the physical world, then the fact that a computer operates mechanistically does not mean that it lacks consciousness.