190 points | baruchel | 2 comments
zerof1l:
Here's the gist:

For nearly 50 years, theorists believed that a problem solvable in t steps should also need roughly t bits of memory (more precisely, about t/log(t)): a 100-step computation would need on the order of 100 bits.

Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
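
A rough numerical sketch of the two bounds (purely illustrative: I'm using base-2 logs and ignoring all constant factors, which the asymptotic statements hide anyway):

```python
import math

# Compare the old space bound (~ t / log t) with the new one (~ sqrt(t))
# for a few illustrative step counts t. Constants are dropped, so the
# absolute numbers only show how the two bounds grow, not exact bit counts.
for t in (100, 10_000, 1_000_000):
    old_bound = t / math.log2(t)   # the ~t/log(t) bound believed for decades
    new_bound = math.sqrt(t)       # Williams-style bound, in the comment's sqrt(t) form
    print(f"t = {t:>9}: t/log t ~ {old_bound:10.1f} bits, sqrt(t) ~ {new_bound:8.1f} bits")
```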

zombot:
> log(t)

log to what base? 2 or e or 10 or...

Why do programmers have to be so sloppy?
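
For the asymptotic claim above it ends up not mattering: switching bases only rescales by a constant factor, which the O-notation absorbs. A quick check (illustrative only):

```python
import math

# log_a(t) = log_b(t) / log_b(a): changing the base multiplies by a constant,
# which is why the base is usually left unspecified inside O(...) bounds.
for t in (100, 10_000, 1_000_000):
    ratio = math.log2(t) / math.log(t)   # base 2 vs natural log
    print(f"t = {t:>9}: log2(t)/ln(t) = {ratio:.6f}")   # always 1/ln(2), about 1.4427
```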

eviks:
Another reason is that the notation for the base-e logarithm is ln, not log.
tempodox:
If only! The math libs I know use the name `log` for the base-e logarithm, not `ln`.
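
This is easy to verify; a minimal check in Python, whose math module follows the same convention as C's <math.h>:

```python
import math

# The plain `log` is the natural (base-e) logarithm; other bases get their
# own function names or an explicit base argument.
print(math.log(math.e))   # 1.0  -> `log` is base e
print(math.log2(8))       # 3.0  -> explicit base-2 function
print(math.log10(1000))   # 3.0  -> explicit base-10 function
print(math.log(8, 2))     # ~3.0 -> optional base argument, computed as log(8)/log(2)
```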
eviks:
Hm, you're right, this unfortunately proves the original point...