zerof1l:
Here's the gist:

For nearly 50 years, theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory: 100 steps, roughly 100 bits. More precisely, about t/log(t).

Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
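
To put rough numbers on those two bounds, here is a quick back-of-the-envelope comparison (a sketch only: constants are ignored, and the paper's precise new bound is on the order of sqrt(t*log t) rather than plain sqrt(t)):

    # Rough comparison of the old and new space bounds quoted above
    # (constants ignored; the new bound is really ~sqrt(t*log t)).
    import math

    for t in (10**4, 10**6, 10**9):
        old = t / math.log2(t)   # ~ t/log t : the long-standing bound
        new = math.sqrt(t)       # ~ sqrt(t) : the new result, simplified
        print(f"t = {t:>13,}   t/log t ~ {old:>14,.0f}   sqrt(t) ~ {new:>8,.0f}")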

andsoitis:
> Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.

Does it require one to first have solved the problem in uncompressed space?

jasperry:
As I understand it, this result is about algorithms that solve any instance of a given computational problem (like finding the shortest paths through a graph). The specific problem instance (the graph) is the input to the algorithm. The result shows how to take an algorithm and construct another algorithm that solves the same problem in less space, by transforming how the original algorithm reads and writes memory. Now you have an algorithm that still solves any instance of that problem but uses only about sqrt(t) space.

So, to answer your question more directly: no, you don't have to have solved a specific instance of the problem, but you have to already have an algorithm that solves it in general.
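
To make the "transform the algorithm, not a particular run" point concrete, here is a minimal toy sketch of the much older and simpler idea of trading recomputation time for memory via checkpoints. It is emphatically not the construction from Williams' paper (the actual construction is far more involved); it only illustrates how storing about sqrt(t) intermediate states instead of all t of them still lets you recover any step of a deterministic computation, at the cost of re-running at most about sqrt(t) steps. The step function below is a made-up stand-in for one step of an arbitrary deterministic computation.

    # Toy sketch (not the paper's construction): checkpoint-and-recompute.
    # We run a deterministic t-step computation once, keep ~sqrt(t) evenly
    # spaced states, and answer "what was the state after step i?" queries
    # by re-running at most ~sqrt(t) steps from the nearest checkpoint.
    import math

    def step(state):
        # Hypothetical one-step transition function (a 64-bit LCG), standing
        # in for a single step of some deterministic computation.
        return (state * 6364136223846793005 + 1442695040888963407) % (2**64)

    def build_checkpoints(start_state, t):
        """Run the t-step computation once, keeping ~sqrt(t) evenly spaced states."""
        interval = max(1, math.isqrt(t))
        checkpoints = {0: start_state}
        state = start_state
        for i in range(1, t + 1):
            state = step(state)
            if i % interval == 0:
                checkpoints[i] = state
        return checkpoints, interval

    def state_at(checkpoints, interval, i):
        """Recover the state after step i by recomputing from the nearest checkpoint."""
        base = (i // interval) * interval
        state = checkpoints[base]
        for _ in range(i - base):
            state = step(state)
        return state

    t = 100
    checkpoints, interval = build_checkpoints(12345, t)
    print(len(checkpoints))                        # ~sqrt(t) stored states, not t
    print(state_at(checkpoints, interval, 57))     # any step is still recoverable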