
190 points | baruchel | 1 comment
zerof1l ◴[] No.44421424[source]
Here's the gist:

For nearly 50 years theorists believed that if solving a problem takes t steps, it also needs roughly t bits of memory: 100 steps, about 100 bits. More precisely, the best known bound was about t/log(t).

Ryan Williams showed that any problem solvable in time t needs only about sqrt(t) bits of memory (more precisely, on the order of sqrt(t·log t)): a 100-step computation could be compressed and solved with something on the order of 10 bits.
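
To get a feel for the scale of the improvement, here's a rough back-of-the-envelope in Python. The bounds are asymptotic and constants are ignored, so the numbers are only indicative:

    import math

    # Rough scale of the two space bounds for a computation of t steps.
    # Constants are dropped; these are orders of magnitude only.
    for t in [100, 10_000, 1_000_000, 10**9]:
        old = t / math.log2(t)             # the long-standing ~t/log t bound
        new = math.sqrt(t * math.log2(t))  # Williams' ~sqrt(t log t) bound
        print(f"t = {t:>13,}   old ~ {old:>14,.0f} bits   new ~ {new:>10,.0f} bits")

The gap widens quickly: at a billion steps the old bound is tens of millions of bits, the new one a few hundred thousand.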

replies(7): >>44422352 #>>44422406 #>>44422458 #>>44422855 #>>44423750 #>>44424342 #>>44425220 #
heavenlyblue ◴[] No.44422406[source]
Why would it need as many bits of memory as steps?
replies(2): >>44422581 #>>44423096 #
1. mjburgess ◴[] No.44422581[source]
The basic intuition for the two extremes is:

* High space requirement: we need a bucket per step to track the state

* Low space requirement: we only need a single bucket, if we can mix/unmix the various components of the state

High-space-requirement algorithms will be those with a large amount of "unmixable" complex state. Low-space-requirement algorithms will be those with either a small state or an easily mixed/unmixed state.
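
A toy sketch of the two extremes, taking "bucket" to mean one stored value (this only illustrates the bucket count, not the clever mix/unmix trick):

    # High space: keep a bucket per step, i.e. the whole history of states.
    def high_space(xs):
        states = []
        total = 0
        for x in xs:
            total += x
            states.append(total)  # remembers every intermediate state
        return states[-1]

    # Low space: each step's state is folded into a single bucket.
    def low_space(xs):
        total = 0
        for x in xs:
            total += x
        return total

    assert high_space([1, 2, 3, 4]) == low_space([1, 2, 3, 4]) == 10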

Example of mixing: suppose we need to store an odd number together with a sign (+ or -). We can store a single integer and let its parity carry the sign: an odd stored value n means +n, and an even stored value n means -(n-1).

So a stored 7 means +7, and a stored 8 means -7.
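
A minimal sketch of that encoding (the encode/decode names are just for illustration):

    # Pack a sign and an odd magnitude into one stored integer.
    # Odd stored value n means +n; even stored value n means -(n - 1).
    def encode(sign, odd):
        assert odd % 2 == 1, "only odd magnitudes are representable"
        return odd if sign == '+' else odd + 1

    def decode(n):
        return ('+', n) if n % 2 == 1 else ('-', n - 1)

    assert encode('+', 7) == 7 and decode(7) == ('+', 7)
    assert encode('-', 7) == 8 and decode(8) == ('-', 7)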

I don't know if this example is really that good, but I believe the very basics of the intuition are correct: it's to do with how we can trade "interpretation" of the data (computation) for the amount of data, in such a way that the data can be mixed and unmixed through the interpretation.