
593 points by gmays | 3 comments
earless1 ◴[] No.45772465[source]
So biological garbage collection pauses, then? Skip sleep, and the brain tries to run GC cycles during runtime, causing attention and performance latency spikes. Evolution wrote the original JVM.
replies(5): >>45772560 #>>45773351 #>>45776679 #>>45777047 #>>45778878 #
layer8 ◴[] No.45772560[source]
Luckily it doesn’t clear all unreferenced memory, though.
replies(5): >>45772666 #>>45772718 #>>45773046 #>>45773081 #>>45773625 #
ghurtado ◴[] No.45773625[source]
I realize you're making a joke, but there is no such thing as "unreferenced memories", as in, something that is no longer in use and has been removed from the brain.

Every memory your brain has ever produced is still there, even if most are beyond conscious access. Memories quite literally become a permanent part of you.

A lot of people mistakenly think of human memory as a sort of hard drive with limited capacity, with files being deleted to make room for new ones. It's very much not like that.

replies(4): >>45773664 #>>45773783 #>>45773860 #>>45773941 #
pdonis ◴[] No.45773664[source]
If you are implying that human memory has infinite capacity, that's not possible. The human brain is a finite, physical thing. It can't store an infinite amount of data.

If you just mean that human memory has a finite capacity that's much larger than anyone has come close to reaching by storing the memories of a normal human lifetime, that might make sense.

Do you have any references for your statements about memory? I'm not familiar with whatever science there is in this area.

replies(3): >>45773815 #>>45774224 #>>45775057 #
1. jjk166 ◴[] No.45775057[source]
The claim that everything is there does not imply infinite, or even large, capacity.

Consider an exponentially weighted moving average - you can just keep putting more data in forever and the memory requirement is constant.
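
A minimal sketch of that in Python (the function name and the alpha value are just illustrative, not anything from the comment above):

    # One running value absorbs an unbounded stream of observations,
    # so memory use stays constant no matter how much data arrives.
    def update(ewma, new_value, alpha=0.1):
        return alpha * new_value + (1 - alpha) * ewma

    ewma = 0.0
    for x in [3.0, 5.0, 4.0, 10.0]:  # could just as well be an endless stream
        ewma = update(ewma, x)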

The brain stores information as a weighted graph, which basically acts as lossy compression. When you gain more information, the graph weights are updated, essentially compressing what was already in there further. Eventually you reach a point where what you can recall is useless, which is what we would consider forgotten, and eventually the contribution of a single datapoint becomes insignificant, but it never reaches zero.
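
To make "insignificant but never zero" concrete with the same illustrative EWMA sketch: an observation folded in k updates ago still contributes a factor of alpha * (1 - alpha)**k to the current value.

    # Contribution of a single old datapoint after k further updates,
    # using the same illustrative alpha = 0.1 as above.
    alpha = 0.1
    for k in (1, 10, 100):
        print(k, alpha * (1 - alpha) ** k)
    # 1    0.09
    # 10   ~0.035
    # 100  ~0.0000027  (tiny, effectively "forgotten", but never exactly zero)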

replies(2): >>45775647 #>>45779822 #
2. pdonis ◴[] No.45775647[source]
> The claim that everything is there does not imply infinite, or even large capacity.

It implies enough capacity to store everything. But what you describe is not storing everything.

> lossy compression

Which means you're not storing all the information. You're not storing everything.

> When you gain more information, graph weights are updated, essentially compressing what was already in there further.

In other words, each time you store a new memory, you throw some old information away, which is exactly what the person I was responding to said does not happen.

3. balex ◴[] No.45779822[source]
And this description is based on what?