
190 points baruchel | 6 comments | | HN request time: 0.218s | source | bottom
zerof1l ◴[] No.44421424[source]
Here's the gist:

For nearly 50 years theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory: 100 steps, roughly 100 bits. To be exact, t/log(t) bits.

Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
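As a rough numerical illustration (constants and lower-order terms omitted, so the figures are only indicative of growth rates, not actual memory counts):

```python
import math

# Compare the old t/log(t) space bound with the new sqrt(t) bound
# as the time budget t grows. Purely illustrative: real bounds
# carry hidden constants this sketch ignores.
for t in (100, 10_000, 1_000_000):
    old = t / math.log2(t)   # pre-2024 folklore bound
    new = math.sqrt(t)       # Williams' new bound
    print(f"t={t:>9}: t/log t ~ {old:.0f}, sqrt(t) ~ {new:.0f}")
```

The gap widens quickly: at a million steps, the old bound allows on the order of 50,000 bits while the new one needs only about 1,000.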

replies(7): >>44422352 #>>44422406 #>>44422458 #>>44422855 #>>44423750 #>>44424342 #>>44425220 #
1. heavenlyblue ◴[] No.44422406[source]
Why would it need as many bits of memory as it takes steps?
replies(2): >>44422581 #>>44423096 #
2. mjburgess ◴[] No.44422581[source]
The basic intuitions for the two extremes are:

* High space requirement: we need a bucket per step to track the state

* Low space requirement: we only need a single bucket, if we can mix/unmix the various components of the state

High-space requirement algorithms will be those with a large amount of "unmixable" complex state. Low-space reqs will be those with either a small state, or an easily mixed/unmixed state.

Example of mixing: suppose we need to store an odd number together with a sign (+/-). Then one stored number suffices: an odd stored value means +value, and an even stored value means -(value - 1).

So 7 = +7, 8 = -7

I don't know if this example is really that good, but I believe the very basics of the intuition are correct -- it's to do with how we can trade "interpretation" of data (computation) against the amount of data, so that data can kinda be mixed/unmixed in the interpretation
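A minimal sketch of that encoding (hypothetical helper names, just to make the mix/unmix idea concrete):

```python
# Pack a sign (+1/-1) and an odd number into a single stored integer:
# odd stored values decode as positive, even values as negative.
# A tiny toy example of trading interpretation work for storage.

def mix(sign, odd):
    assert odd % 2 == 1 and sign in (+1, -1)
    return odd if sign == +1 else odd + 1   # even encodes the negative

def unmix(stored):
    if stored % 2 == 1:
        return (+1, stored)          # odd  -> +stored
    return (-1, stored - 1)          # even -> -(stored - 1)

# 7 round-trips as +7, and 8 decodes as -7
```

Two pieces of state fit in one "bucket"; the cost is the extra computation to decode on every access.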

3. TrueDuality ◴[] No.44423096[source]
Bits are a bit misleading here; it would be more accurate to say "units of computation". If your problem space operates on 32-bit integers, this would be 32 bits * number of steps; these papers work with individual bits as the smallest unit of computation we can commonly reason about.
replies(1): >>44423222 #
4. mort96 ◴[] No.44423222[source]
Wait but if "bits of memory" here was supposed to be "units of computation", that means that:

* The old assumption was that if you require t steps to complete, you need t/log(t) units of computation

* This new proof shows that if you require t steps to complete, you need sqrt(t) units of computation

Surely this doesn't make sense? Under any definition of "unit of computation" I would intuitively assume, computing t steps requires something proportional to t units of computation...

replies(1): >>44423637 #
5. nyrikki ◴[] No.44423637{3}[source]
There are a few nuances here that are difficult to explain fully, but note this from the paper.

> Our simulation reduces the problem of simulating time-bounded multitape Turing machines to a series of implicitly-defined Tree Evaluation instances with nice parameters, leveraging the remarkable space-efficient algorithm for Tree Evaluation recently found by Cook and Mertz [STOC 2024].

The "time-bounded multitape Turing machines" with bounded fan-in means that that particular abstract model has access to the bits of those tapes current head position.

Mapping the quirks of the various abstract models can be tricky, but remember having access to the symbol currently under the tape head is 'free' in TMs.

It is a useful abstraction, but it doesn't directly map to a physically realizable machine.

It is still an interesting result for trees in physically realizable machines.
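For a feel of what a Tree Evaluation instance looks like, here is a toy instance with a naive recursive evaluator (hypothetical encoding; the point of Cook and Mertz's result is precisely that far less space suffices than this recursion's call stack uses):

```python
# Toy Tree Evaluation instance: leaves hold values, each internal
# node holds a function combining its two children's values, and we
# want the value at the root. Naive recursion keeps one frame per
# tree level; Cook and Mertz [STOC 2024] show this can be done in
# much less space.

def evaluate(node):
    if len(node) == 1:             # leaf: (value,)
        return node[0]
    f, left, right = node          # internal: (function, left, right)
    return f(evaluate(left), evaluate(right))

# height-2 example over values mod 4
add4 = lambda a, b: (a + b) % 4
tree = (add4, (add4, (3,), (2,)), (add4, (1,), (1,)))
```

Here `evaluate(tree)` computes ((3+2) mod 4 + (1+1) mod 4) mod 4.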

replies(1): >>44424050 #
6. conradev ◴[] No.44424050{4}[source]
Cook and Mertz call the wider area they work on “catalytic computing”: https://www.quantamagazine.org/catalytic-computing-taps-the-...