
190 points baruchel | 11 comments
zerof1l ◴[] No.44421424[source]
Here's the gist:

For nearly 50 years theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory: 100 steps, roughly 100 bits. To be exact, t/log(t).

Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
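For reference, the two simulation bounds being paraphrased here can be written as follows (the "about sqrt(t)" above drops a log factor; these are the statements as usually quoted, not an exact transcription of either paper):

```latex
% Classic simulation (Hopcroft–Paul–Valiant, 1970s):
\mathrm{TIME}(t) \subseteq \mathrm{SPACE}\!\left(\tfrac{t}{\log t}\right)
% New simulation (Williams, 2025):
\mathrm{TIME}(t) \subseteq \mathrm{SPACE}\!\left(\sqrt{t \log t}\right)
```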

replies(7): >>44422352 #>>44422406 #>>44422458 #>>44422855 #>>44423750 #>>44424342 #>>44425220 #
1. taneq ◴[] No.44422458[source]
Huh? What did any of those theorists think about the phrase "time/space tradeoff"?
replies(2): >>44422643 #>>44422948 #
2. mhuffman ◴[] No.44422643[source]
They were likely the ones that invented the phrase.
replies(2): >>44423188 #>>44423268 #
3. bubblyworld ◴[] No.44422948[source]
That phrase is usually about a specific problem, right? This is more like: given an arbitrary algorithm (you don't know which up front), can you optimise its memory usage? Seems kinda surprising to me that you can.
4. taneq ◴[] No.44423188[source]
I'm not convinced of that if they thought that s = O(t/log(t)), where s = space (bytes or similar) and t = time (clocks, steps, etc.). I dunno about theoretical limits, but in practice there's at least 1-2 orders of magnitude to be exchanged between space and processing time.
replies(1): >>44424669 #
5. mort96 ◴[] No.44423268[source]
Really? But the whole idea behind the space/time trade-off is that in a bunch of cases, if you want to compute something in less memory, you need to solve it in more steps; or if you want to solve something in fewer steps, you need to use more memory.

This seems to wildly contradict the idea that the amount of memory needed is roughly proportional to the number of steps.
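A minimal sketch of the kind of per-problem trade-off being described, using a toy "which values repeat?" task (a hypothetical example, not one from the thread): the same answer computed once with extra memory and few steps, and once with almost no extra memory and many more comparisons.

```python
# Hypothetical illustration of a classic space/time trade-off:
# answer "which values repeat?" either with extra memory and fast
# lookups, or with almost no extra memory and repeated rescanning.

def duplicates_fast(values):
    """More memory, fewer steps: a set gives constant-time lookups."""
    seen, dups = set(), []
    for v in values:
        if v in seen and v not in dups:  # membership check paid for in space
            dups.append(v)
        seen.add(v)
    return dups

def duplicates_small(values):
    """Less extra memory, more steps: rescan the prefix each time."""
    dups = []
    for i in range(len(values)):
        # Recompute "have I seen this?" instead of remembering it.
        seen_before = any(values[j] == values[i] for j in range(i))
        if seen_before and values[i] not in dups:
            dups.append(values[i])
    return dups

if __name__ == "__main__":
    data = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
    assert duplicates_fast(data) == duplicates_small(data) == [1, 5, 3]
```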

replies(2): >>44424665 #>>44425736 #
6. eru ◴[] No.44424665{3}[source]
The number of steps is an (over)estimate of the amount of memory needed.

You can write some trivial programs that need this much memory. But most of them can be optimised to use less space (and still compute the same answer).

The big question was: can you always optimise the space usage down from the ridiculous maximum and to what extent?
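A toy version of that point (my example, not eru's): a computation whose obvious implementation keeps one value per step, next to an equivalent one that keeps only what it actually needs.

```python
# Toy example: the naive form uses memory proportional to its number
# of steps; an equivalent form computes the same answer in constant space.

def fib_all_steps(n):
    """Keeps every intermediate value: space grows with the step count."""
    vals = [0, 1]
    for _ in range(n - 1):
        vals.append(vals[-1] + vals[-2])  # one cell of memory per step
    return vals[n]

def fib_constant_space(n):
    """Same answer, but only two values are ever kept around."""
    a, b = 0, 1
    for _ in range(n - 1):
        a, b = b, a + b
    return b

if __name__ == "__main__":
    assert fib_all_steps(30) == fib_constant_space(30) == 832040
```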

replies(1): >>44424702 #
7. eru ◴[] No.44424669{3}[source]
That's in practice, for practical problems. But they were interested in the worst case.
8. mort96 ◴[] No.44424702{4}[source]
But plenty of programs can only be optimized to take less space by making them run slower (i.e. take more steps)...
replies(2): >>44424731 #>>44425039 #
9. eru ◴[] No.44424731{5}[source]
Yes. But even with infinite time there's a limit to how much you can compress the space usage. And that's an interesting property to study.

In addition you can look at how much extra time you actually need: infinite time is a vast overestimate. The new proof gives much more reasonable time bounds.

10. Ar-Curunir ◴[] No.44425039{5}[source]
That is also what is happening here. When you reduce space to roughly sqrt(t), you increase the time complexity. The interesting part of the result is that you can make this tradeoff for _any_ problem.
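One concrete (and much weaker) way to see space being bought back with recomputation is checkpointing. This is only an illustration of the trade-off, not the technique in the new result (which builds on a space-efficient Tree Evaluation algorithm); the step function below is a made-up stand-in for one step of some deterministic computation.

```python
import math

# Illustration of trading recomputation time for memory: "run" a
# deterministic computation for t steps, but instead of storing all
# t intermediate states, store only ~sqrt(t) checkpoints and
# recompute any other state on demand.

def step(state):
    """Stand-in for one deterministic step of some computation."""
    return (state * 6364136223846793005 + 1442695040888963407) % (2**64)

def run_with_checkpoints(initial, t):
    """Run t steps, keeping only about sqrt(t) checkpointed states."""
    stride = max(1, math.isqrt(t))
    checkpoints = {0: initial}
    state = initial
    for i in range(1, t + 1):
        state = step(state)
        if i % stride == 0:
            checkpoints[i] = state
    return checkpoints, stride

def state_at(checkpoints, stride, i):
    """Recover the state after step i by recomputing from the nearest
    earlier checkpoint: at most `stride` extra steps per query."""
    base = (i // stride) * stride
    state = checkpoints[base]
    for _ in range(i - base):
        state = step(state)
    return state

if __name__ == "__main__":
    t = 10_000
    cps, stride = run_with_checkpoints(12345, t)
    # Cross-check a few positions against a full, memory-hungry rerun.
    history, s = [12345], 12345
    for _ in range(t):
        s = step(s)
        history.append(s)
    for i in (0, 1, 99, 5000, t):
        assert state_at(cps, stride, i) == history[i]
```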
11. svat ◴[] No.44425736{3}[source]
> in a bunch of cases

That's precisely the issue: of course for many problems there are obvious ways to trade off space and time. Smart people who have spent their lives thinking about this are aware of this. :-) The issue here is: can this always be done? For an arbitrary computation that takes t time, can it always be simulated in much less space (using more time of course), no matter what the computation is? Lots of people have tried, and the best result until now was t/log t in the 1970s.

Edit: A comment on a previous discussion of this result by the author of the Quanta article, which substantiates and links to researchers' opinion that it's the universality (works for every problem) that's the surprising part: https://news.ycombinator.com/item?id=44075191