
190 points baruchel | 1 comment | source
zerof1l ◴[] No.44421424[source]
Here's the gist:

For nearly 50 years, theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory: 100 steps → roughly 100 bits. More precisely, t/log(t) bits.

Ryan Williams proved that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be simulated using on the order of 10 bits.
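
For a sense of scale, here's a rough back-of-the-envelope comparison in Python (illustrative only: both bounds hide constant factors, and as I understand it the paper's actual bound is more like sqrt(t log t), with sqrt(t) as the headline simplification):

    import math

    # Old bound: time t was believed to need about t/log(t) space.
    # New bound (Williams): about sqrt(t) space suffices.
    for t in (100, 10**6, 10**12):
        old = t / math.log2(t)
        new = math.sqrt(t)
        print(f"t = {t:>16,}: t/log2(t) ~ {old:>14,.0f} bits, sqrt(t) ~ {new:>10,.0f} bits")

The gap widens quickly: at t = 10^12 the old bound allows tens of billions of bits, while sqrt(t) is only a million.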

replies(7): >>44422352 #>>44422406 #>>44422458 #>>44422855 #>>44423750 #>>44424342 #>>44425220 #
zombot ◴[] No.44422352[source]
> log(t)

log to what base? 2 or e or 10 or...

Why do programmers have to be so sloppy?

replies(5): >>44422370 #>>44422485 #>>44422742 #>>44422905 #>>44422929 #
anonymous_sorry ◴[] No.44422905[source]
It's not sloppiness, it's economy.

You can convert between log bases by multiplying by a constant factor, since log_b(t) = log_2(t) / log_2(b). But any real-world program also carries constant factors of its own, depending on the specific work it does and the hardware it runs on, so it is usually pointless to track constant factors when theorising about computation in general.
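
To make that concrete, a quick Python sanity check of the change-of-base identity (the value of t and the base choices here are arbitrary, just for illustration):

    import math

    t = 1_000_000
    log2_t = math.log2(t)
    # Change of base: log_b(t) = log2(t) / log2(b), so switching bases
    # only multiplies by a fixed constant -- invisible inside big-O.
    for b in (2, math.e, 10):
        assert math.isclose(math.log(t, b), log2_t / math.log2(b))
        print(f"log base {b:g}: {math.log(t, b):10.4f}  (= log2(t) / {math.log2(b):.4f})")

Whatever base you pick, every entry is the base-2 value scaled by the same constant, which is why complexity theorists write log(t) without specifying one.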