
190 points | baruchel | 2 comments
zerof1l ◴[] No.44421424[source]
Here's the gist:

For nearly 50 years, theorists believed that a problem taking t steps should also need roughly t bits of memory: 100 steps, about 100 bits. To be exact, the best known bound was t/log(t).

Ryan Williams found that any problem solvable in time t can be solved with only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
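
For a rough sense of scale, here is a small illustrative snippet (my own sketch, not the formal statement; if I recall correctly the paper's precise bound is on the order of sqrt(t·log t) space):

    import math

    # Illustrative only: how the old and new space bounds scale with time t,
    # ignoring constant factors.
    for t in [10**4, 10**6, 10**8]:
        old = t / math.log2(t)             # classic simulation: ~ t / log t space
        new = math.sqrt(t * math.log2(t))  # Williams-style bound: ~ sqrt(t log t)
        print(f"t = {t:>11,}   t/log t ~ {old:>12,.0f}   sqrt(t*log t) ~ {new:>8,.0f}")

Even at t = 10^8 that is a few tens of thousands of bits versus a few million, which is why people call it a quadratic (square-root) improvement.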

replies(7): >>44422352 #>>44422406 #>>44422458 #>>44422855 #>>44423750 #>>44424342 #>>44425220 #
cubefox ◴[] No.44422855[source]
> Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory

At least, at most, or on average?

replies(2): >>44423326 #>>44423662 #
1. jeanlucas ◴[] No.44423662[source]
In the best case
replies(1): >>44428343 #
2. utopcell ◴[] No.44428343[source]
This is a worst-case result. In the best case, it is easy to see that the gap can be exponential. Imagine a program that increments a counter until it reaches value n. This "program" would only need O(logn) memory: mainly the counter itself.