
190 points | baruchel
zerof1l No.44421424
Here's the gist:

For nearly 50 years, theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory: 100 steps ≈ 100 bits. To be exact, t/log(t).

Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
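
To make the gap concrete, here is a minimal back-of-the-envelope sketch (my own illustration, not from the paper): the log base is chosen arbitrarily, and both bounds hide constant and polylogarithmic factors.

```python
import math

# Rough comparison of the two space bounds for a t-step computation.
# Constants and log bases are suppressed, as in the big-O statements.
for t in [100, 10_000, 1_000_000]:
    old_bound = t / math.log2(t)   # classical t/log t bound
    new_bound = math.sqrt(t)       # Williams' bound, roughly sqrt(t)
    print(f"t = {t:>9,}: t/log t ≈ {old_bound:,.0f}   sqrt(t) ≈ {new_bound:,.0f}")
```

For t = 1,000,000 the old bound is on the order of 50,000 bits, while the new one is about 1,000.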

zombot No.44422352
> log(t)

log to what base? 2 or e or 10 or...?

Why do programmers have to be so sloppy?

Tarq0n No.44422485
This is very common. Log without further specification can usually be assumed to be the natural log (base e).
mort96 No.44423290
In this case, and in many others, log without further specification is meant to be understood simply as "it is logarithmic". With big O, we don't differentiate between log bases, just as we don't differentiate between scale factors; converting from one log base to another is just multiplying by a constant.
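
A quick sketch of that change-of-base point (the values x and b below are arbitrary, chosen only for the example): switching bases just rescales by the constant 1/ln(b), which vanishes inside big-O notation.

```python
import math

x, b = 1_000_000.0, 2

# Change of base: log_b(x) = ln(x) / ln(b).
# Switching bases only multiplies by the constant 1/ln(b),
# so O(log_2 n), O(ln n), and O(log_10 n) are the same class.
print(math.log(x, b))                    # log base 2 of x
print(math.log(x) * (1 / math.log(b)))   # natural log rescaled by a constant; same value
```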