A wire on its own is not computational; neither is an amino acid. But both make up computational systems. It seems like a bit of an arbitrary distinction.
I do not see how this is a bad thing. It seems incorrect at some level, but claiming that things that compute are somehow categorically different is an appeal to magical thinking.
I would agree that claiming a rock computes by not simply vanishing from one Planck time to the next is not satisfying. This leads me to think that computation has much more to do with whether a particular being has reached a thermodynamic local minimum than anything else (lava does not compute since, despite being far more active than a rock, its behaviour can be explained by the fact that it is a couple thousand degrees hotter than a normal rock). Energy dissipation also does not fit the bill, since stars dissipate energy but do not compute.
Unfortunately, the thinking surrounding things like proteins looks incredibly similar: their behavior changes as a function of pH and temperature, and most arguments that a protein computes are based on defining a function for that protein. This gets us nowhere, but it does suggest that it may not be possible to define computation in a way that excludes systems dissipating energy to reach thermodynamic local minima.
As someone who has a PhD in biochemistry and has been coding since age 5, I find the headline ridiculous. You won't be computing in biology. Consider: a transistor is about 50-100 atoms wide. A protein, which cannot itself even be a minimal unit of compute (maybe a molecular transistor like NiFe hydrogenase has a shot), is already bigger than that.
The things described in the article are not biology used for computation, but principles from CS applied to biology, which has some validity. In one case I achieved strides in protein design by reducing the engineering cycle from months to days. Imagine hitting "compile" and having to wait 72 hours to know your result. That's biology, even with a fast organism like yeast or E. coli.
Again I agree with you, because I had a similar experience. But, again, my conclusion is different from yours.
You write that we should not talk about biochemistry as computation, as far as I understand. Instead I'd say that we have not studied enough how nature does computation without programmers or even human friendly semantics.
It's still computation, involving space and physics. Too complex to simulate efficiently (for now), but not big enough that the emergent behaviour is simple, as it is for a gas.
Except that it doesn’t represent logic, it represents strings of amino acids that form proteins that fold up in unpredictable ways and interact with lots of other things to do stuff that half the time is weird and inefficient and doesn’t make sense.
I just assume poor phrasing. Proteins are much more complex than logic gates, so the universe is doing some "free" computation for you. I.e. you've pushed more computation to the runtime, and out of your source code.
Computational chemistry has been going on for decades. Attempts at modeling how molecules behave are still quite primitive. We're talking about modeling the behavior of an object comprised of a few dozen atoms. That's it, pretty simple, right? Well, the models aren't that good at predicting molecular behavior.
Let's move up a step now. Computational chemistry is used heavily by the drug industry. Get an x-ray structure of a protein (maybe a few hundred to a few thousand atoms) and see if it binds to a drug. Wow, now it's getting complicated. How successful is it? Not very. I can remember a computational chemist saying "oh hey, the model says if you replace X with Y, you'll increase binding by 10x". So we tried it, and guess what? The binding was worse.
Now we move up to a biological system. Now we have hundreds (if not thousands) of proteins floating in a matrix of water and ions. We have DNA strands of millions of base pairs, of which we actually know what maybe 10% do. We also have small signalling molecules that do something we understand, but that probably also do 10 other things we have no idea about.
It is very impressive how far biological "design" (genomics) has come so far, but right now the tools are incredibly blunt and the analysis is incredibly crude. I have no doubt our understanding will improve immensely over the coming decades, but I would guess we understand less than 1% of what's going on inside of complex living organisms.
I do not agree at all. I think that there must be one (or a few) simplest possible computing systems, and that it would be both informative and interesting to figure out the basis upon which biological computing is premised.
From my understanding, they do have this same kind of computational framework, but in a biological context. For example, multiple RNA strands can be processed by ribosomes at once - they are essentially reading code to produce outputs (multi-threading/processing?). Bacterial populations will transmit information via chemical secretions and coordinate in similar ways (as do slime molds). I feel like that complexity already exists in nature.
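The multi-threading analogy above can be sketched (purely illustratively, with a toy three-codon table rather than the real genetic code) as a thread pool in which each worker "ribosome" independently reads one mRNA strand:

```python
# Illustrative sketch of the ribosome/multi-threading analogy:
# several worker "ribosomes" each read one mRNA strand concurrently.
from concurrent.futures import ThreadPoolExecutor

CODON_TABLE = {"AUG": "M", "UUU": "F", "GGU": "G"}  # tiny toy table, not the full genetic code

def ribosome(mrna: str) -> str:
    """Read codons three bases at a time, like a ribosome scanning an mRNA strand."""
    return "".join(CODON_TABLE.get(mrna[i:i + 3], "?")
                   for i in range(0, len(mrna) - 2, 3))

strands = ["AUGUUU", "AUGGGU", "UUUGGU"]
with ThreadPoolExecutor(max_workers=3) as pool:
    proteins = list(pool.map(ribosome, strands))
print(proteins)  # -> ['MF', 'MG', 'FG']
```

Of course, the analogy only goes so far: real ribosomes are physical machines with no shared scheduler, and many of them can even read the same strand simultaneously.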
Hey, I am glad that you are an expert and your criticism is of this nature! With my limited knowledge, I see exactly what you mean and I agree with you. There is another difference, however, between the two computes that I think you have omitted in your comment. The cell's inner complexity shares the same information space as the organism's "computes," unlike a computer's CPU (and operating system), which is defined and built by a file entirely outside of the 10mb limit I've set. The cell must fit all of it right next to each other.
Doubtful. You can do computing with DNA (http://dna.caltech.edu), but it is still debated whether the actual role of DNA is being a code (some recently weasel out of that stance by saying it's an “app”, as if there's a difference, machine code being a code). Coding theory applied to DNA yields inconclusive results. Galois theory usually has power over any kind of information encoding, cryptographic, computable or not. One constructed for DNA convinced mathematicians it's not the way to go at all. If it's computation, it's nothing like what we mean by any model of computation; you may as well say it's magic instead of making stretched analogies.
About OpenBSD and evolution, that's so fetch... these folks invented the attack in the first place; they did not evolve a response to some market force in the early aughties. Broadly, I don't think designed systems are in the business of evolving; otherwise living organisms could perhaps have evolved an electro-hydrostatic power system instead of a hydraulic one with a pump as a single point of failure.
“It goes on about complexity and emergence, but why would complex interactions not emerge from a computer simulation just as they do in the real biochemical system?”
I think the point he is making is that if your goal is to simulate the human brain you also have to simulate and thus understand all the little details of biology because transistors don’t magically have the same properties as proteins.
> But this lucid vision of circuit-like logic, which worked so elegantly in bacteria, too often fails in more complex cells. “In bacteria, single proteins regulate things,” said Angela DePace, a systems biologist at Harvard Medical School. “But in more complex organisms, you get many proteins involved in a more analog fashion.”
That nature computes (many, if not all natural processes can be thought of as computational in some respect), and that many computational systems can be thought of as at least analogs of physical or biological processes.
I would not call gene systems computers. They may be programmable, but they do not generally do computation (that's not to say you can't force a gene system to do very crude computation).
The amount of computational power in biological systems is simply staggering.
In extremely simple organisms like roundworms, there are on the order of hundreds of neurons; for most insects you're in the 10k-1M range.
A honeybee contains one million neurons, which are computational devices that we have a hard time fully and accurately mapping, and something like a billion connections between them.
Each of those neurons contains the entire genome for that honeybee, around 250 million base pairs. Those code for all of the ~thousands of proteins that make up a honeybee - proteins are made up of sequences of amino acids which arrange themselves into shapes with different molecular interaction properties. Figuring out that shape given the amino acid sequence is so computationally difficult that it spawned the Folding@Home project, which is one of the largest collections of computing power in the world.
The process of translating from DNA through RNA to a protein is itself substantially harder than it sounds - spend time with a bioinformatics textbook at some point to see some of the features of DNA, such as non-coding regions in the middle of sequences that describe proteins, or sections of RNA which themselves fold into functional forms.
None of this is even getting down to the molecular level, where the geometry of the folded proteins allows them to accelerate reactions by millions or trillions of times, allowing processes which would normally operate at geological scales to be usable for something with the lifespan of a bacterium.
The most complex systems we've ever devised pale in comparison to even basic biological systems. You need to start to look at macro-scale systems like the internet or global shipping networks before you start to see things that approximate the level of complexity of what you can grow in your garden.