> "Brains have memory" is a bit of an understatement of what they actually do.
...but I don't think anyone thinks that "brains have memory" is a complete statement of what they actually do. It's not a complete statement of what computers do either.
For Robert Calin-Jageman, it’s exciting that forgetting
seems to be a biological process like digestion or excretion
because that means it can, at least in theory, be ramped up
or down.
Huh? Of course it's biological. Why would anyone believe otherwise, unless they made naive deductions from the metaphors du jour: computers and neural networks? Perhaps the journalist is misinterpreting the researcher and was indeed working backward from modern metaphors, concluding that it's somehow novel to think the mechanisms intrinsically corporeal.
No, you're missing the point. Our brains don't work like von Neumann machines with efficient architecture. What you can and cannot remember is a still poorly understood function of many variables, including the various neurotransmitters and hormones that get released when you browse your phone in bed, when you get a reaction to your online posts, when you get a notification...
We might be messing with our brains more than we think, that's the point of this article.
Ya, cool metaphor. Boids meets reactive programming.
In a book stuffed with ideas, Lingua ex Machina: Reconciling Darwin and Chomsky [2001] had two metaphors that have stuck with me.
There are Darwinian processes within the brain. Like your ear hears some noise, could be interpreted as either "cap" or "cat", those two possibilities fight it out until one wins.
There's a song, some kind of pattern sequence, weaving back and forth, which might be how parts of the brain communicate internally, which might be how memories are encoded and retrieved (and may also explain why memories are changed by being recalled).
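The "Darwinian competition" metaphor can be sketched as a toy winner-take-all loop (my own construction, not anything from the book): candidate interpretations of an ambiguous sound mutually inhibit each other until one dominates.

```python
# Toy sketch of competing interpretations ("cap" vs "cat"): each
# candidate is suppressed in proportion to its rivals' strength,
# so a small initial edge amplifies until one interpretation wins.

def compete(scores, inhibition=0.3, threshold=0.9, steps=100):
    """Iterate mutual inhibition between candidate interpretations."""
    scores = dict(scores)
    for _ in range(steps):
        total = sum(scores.values())
        # Suppress each candidate by a fraction of its rivals' mass.
        scores = {k: max(v - inhibition * (total - v), 0.0)
                  for k, v in scores.items()}
        # Renormalize so scores stay comparable across iterations.
        total = sum(scores.values()) or 1.0
        scores = {k: v / total for k, v in scores.items()}
        winner, share = max(scores.items(), key=lambda kv: kv[1])
        if share >= threshold:
            return winner
    return winner

# "cat" starts with a slight evidential edge and eventually wins.
print(compete({"cap": 0.48, "cat": 0.52}))
```

The inhibition and threshold values are arbitrary; the point is only that rivalry plus amplification yields a single surviving interpretation.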
Having no scientific background and inclined to do computer analogies, I tend to think that our brain works like a RRD database [1], that is, we don't completely discard old information; instead, we store an aggregation of the said information as new information comes in. Then, these aggregations become our knowledge about a given subject; we don't need to store every detail, as this aggregated view is enough to help us understand new information.
Of course, if we ever need to reconstruct every detail of old info, we simply cheat, in the confabulation sense [2], possibly with disastrous consequences [3].
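The RRD analogy can be sketched in a few lines (a hypothetical toy model, not real RRDtool code): recent samples are kept verbatim, while older samples are collapsed into coarser averages as new data comes in.

```python
# Sketch of the "aggregate old data" idea: detail is kept only for a
# recent window; everything older survives only as bucket averages.

from statistics import mean

class RoundRobinMemory:
    def __init__(self, recent_size=4, bucket_size=4):
        self.recent = []          # verbatim recent samples
        self.archive = []         # averaged buckets of older samples
        self.recent_size = recent_size
        self.bucket_size = bucket_size

    def add(self, value):
        self.recent.append(value)
        # When the recent window overflows, collapse the oldest
        # bucket_size samples into a single average and archive it.
        if len(self.recent) > self.recent_size:
            bucket = self.recent[:self.bucket_size]
            self.recent = self.recent[self.bucket_size:]
            self.archive.append(mean(bucket))

mem = RoundRobinMemory()
for v in range(1, 9):             # feed in samples 1..8
    mem.add(v)
# Samples 1..4 now survive only as their average; 5..8 stay verbatim.
```

Reconstructing "every detail" from `mem.archive` is impossible, which is roughly where the confabulation cheat comes in.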
I do not suppose that human memory is like a computer. Thinking about how it works, I believe it works a bit like an LLM: you formulate a prompt and get memories. It is an automatic process, and you do not see the details, such as what prompt was used.
And suppose at some point you've learned how to a) store memory in a categorized form and b) use prompts formulated in a language (or at least in the categories of a language); then you can "forget" the old way of generating prompts.
> "Remembering" trauma could also mean that the brain's structure changes in some way as a result of trauma without necessarily recording how the event exactly went.
This is too vague to my taste. Isn't any memory a brain's structure change?
Doesn't that sentiment just come from being used to our own limitations?
Making a computer that forgets is like making ever more fantastic cars that run no faster than a horse because some people feel uncomfortable with them going faster.
I personally hate forgetting, and I'd very much like it if I could call up a computer system and recall every bit of my life: everything I've seen, heard, and any data from any sensors that were in my vicinity.
Thanks for sharing this; I've certainly been thinking along the same lines.
To add to this, I've known many folks who can accomplish certain tasks almost automatically and creatively. If you asked them to recall exactly what they did to achieve it, they couldn't. And this usually isn't action on concrete information, either, but on intuition alone.
If humans worked primarily on memory we'd have been toast a long time ago. There's too much variation in the natural world to confront it solely on the basis of memory. I'd say we're more experientially oriented as opposed to memory oriented
Right, I mean memory is sufficient for most sorts of things, but if you want to do the sort of retrospective debugging and analysis the article talks about, you either need to be born a savant or rely on some external system. Eventually, when we have good neural interfaces, I think this will be done automatically, but that may or may not happen in any of our lifetimes.
Natural selection is good for ensuring you make it to a bit past reproductive age, but for everything else it's really hit-or-miss. As one would expect. We're just the first known lifeforms that seem to have an imperative beyond replication.
Except it's like both, and unlike both, and like and unlike many things. Sui generis.
I don't need to think about what my brain is like to figure out how to use it. My brain is the only thing in the world I really know how to use at all. The feeling guides the use.
If I feel like I'm forgetting lots of important things, that feeling is what should guide my intention to write things down. Not some weird metaphor like, I thought I was a database, but really I'm a cache.
Imagine someone saying, "I gained weight because I thought I was a garbage disposal, but now I realize I'm a community garden." This is wrongthink.
I mean, a cache is a database, so the metaphor isn't even useful.
So many different answers. We really don't know much about how our brain encodes data, and we are so far away from actually reproducing it. Scary when you think each of us has a copy of this memory structure.
Technically, our brains do get full, even this article basically says so.
It's just that we don't experience error messages or shut down when that happens - instead, old memories are replaced with new ones.
The total capacity is limited, but the ability to create new memories is unhindered.
I wish we could choose what we forget, but it seems it's possible to choose what you remember - just relearn that stuff or recall it more often, then it will be at the top of the search results, so to say :-)
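The "limited capacity, recall keeps things fresh" idea maps loosely onto an LRU (least-recently-used) cache; here's a minimal sketch (an analogy of my own, not anything from the article):

```python
# A fixed-capacity memory where recalling an item refreshes it and
# the least-recently-recalled item is the one silently overwritten.

from collections import OrderedDict

class ForgetfulMemory:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def remember(self, key, value):
        self.items[key] = value
        self.items.move_to_end(key)        # newest is freshest
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # quietly forget the stalest

    def recall(self, key):
        self.items.move_to_end(key)        # recalling refreshes a memory
        return self.items[key]

mem = ForgetfulMemory(capacity=3)
mem.remember("breakfast", "toast")
mem.remember("meeting", "9am")
mem.remember("song", "looping tune")
mem.recall("breakfast")                    # rehearse this one
mem.remember("errand", "buy milk")         # "meeting" gets overwritten
```

No error message, no shutdown: new memories always fit, and what you rehearse stays "at the top of the search results."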
I’m a mathematician and tend to take things literally. I should not have mentioned the infinite-tape part. What I should have said is that, according to the article, we don’t store memories in the way that a Turing machine does. There is no tape as such, and there is no set of rules that the brain abides by in determining the next step, so to speak.
I gathered that the quote I referenced means that the state of a brain at time t is not sufficient to reconstruct memories or other meaningful information. The fundamental point of contention between you and others criticizing the article appears to be that you all believe there is a storage mechanism in the brain analogous to a computer's. I gather the author claims this is not so. Information is not stored in neurons in such a way that one “retrieves” it by accessing a storage location.
I don’t know enough about this stuff to intelligently comment on the veracity of it. I just know that someone far more knowledgeable than me and just about everyone else commenting says that our intuition about how this stuff works is wrong. That alone is worth causing me to reconsider my intuition on this stuff.
>Hand the same input to the same mind and you'll get a different output every single time, unless the mind willed itself to act rationally.
Are you sure? Remember that memory also counts as an input if it's used in a computation; it seems to me that this applies to both humans and computers.
For a harrowing account of what a mind may do when exposed to very nearly the same inputs, you may be interested in one segment from this Radiolab episode: https://www.wnycstudios.org/story/radiolab-loops
It describes a patient with transient global amnesia who has a looping conversation with her daughter. (There's a link to a video of the conversation on that page as well.) Under normal circumstances this wouldn't happen, as once you've had a conversation you also have memories of having that conversation. But if you're unable to form memories...
The human brain also forgets, which may be a feature instead of a bug. Also, beyond compression, brains are simulation machines: they imagine new scenarios. Curious to understand whether ML provides anything analogous to simulation that isn't rote interpolation.
> Eventually we may be able not just to play back experiences but also to index and even edit them.
Like most things this may have unintended consequences. I think our ability to forget is an important "feature" of cognition. What would happen if we were unable to forget even petty squabbles between friends, loved ones, supposed enemies? How far could this escalate? Our ability to forget and put things behind us may be the reason we're still around.
Computers aren't humans and LLMs aren't human brains.
We have no way to reconstruct memories from a preserved brain (yet). The exact ways in which humans form memories and store information aren't even known yet; we're still drilling into the specifics from higher-level concepts.
Modeling the human brain as nodes with weights ignores a lot of biological processes. Blood/oxygen flow, hormones, neurotransmitter decay, physical locality, chemical delays, and interference from things like myelin sheaths all affect the synapses in ways that are only partially mirrored by computer simulations of neural networks. Unlike neural networks, human brains also don't work on a single clock signal triggering input and output from all nodes in discrete steps.
Human memories are also not just "data in, weights out". They are heavily modified by things like mood, concentration, language(s) spoken, context, and emotional triggers. There's no way to feed a dictionary into a brain. Memory preservation consists of multiple stages, with differing memory types, involving various brain segments with dedicated functionality that can actually grow back due to neuroplasticity in some cases.
Efforts are being made to emulate living cells on computers, but LLMs aren't that. Inversely, efforts are also made to feed brain cells artificial signals and train them to play video games, which results in different behaviour compared to the systems we use for LLMs or other AI systems.
What if our brains are not easily shaped? And maybe our brains are good at forgetting experiences?