
105 points | mathgenius | 4 comments
antics No.43624766
This is only tangentially related, but one thing I am still kind of hopeful about is that analytic combinatorics will free us from the tyranny of asymptotic analysis.

I suppose this is kind of a hot take, and in some ways less relevant than it was (say) 10 years ago, but I believe the theoretical CS community has spent a lot of time over the years analyzing the performance of various algorithms while kind of pointedly not inventing algorithms that accomplish some result at all, even when that result is only known to be computable in a pathologically inefficient way. This means a lot of breakthroughs that could have been TCS breakthroughs have instead been made, usually in worse form, by other communities.

I am not the only one who thinks so, here[1] is an obscure talk where Michael Mitzenmacher (a well-known CS theory prof at Harvard) makes more or less this specific complaint.

One of the nice things about Flajolet/Sedgewick[2] (mentioned in this post) is that it gives a nice set of combinators for (at least it seems to me, in theory) precisely enumerating how many "steps" an algorithm would take given some input. I optimistically imagine a world where we replace the relatively non-specific asymptotic bounds of an algorithm with a very specific quantification of the work an algorithm will take. And that this in turn would free TCS to focus on things that are more germane to the field as a whole.
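
To give a flavor of what I mean, here is a toy sketch (SymPy used purely as an illustration here, not an example from the book itself): the symbolic-method construction "a binary tree is either empty or a root with two subtrees" translates mechanically into the functional equation B(z) = 1 + z*B(z)^2, and extracting coefficients from its solution gives exact counts rather than asymptotic bounds.

    import sympy as sp

    z = sp.symbols('z')
    # B(z) = 1 + z*B(z)^2 solves to the Catalan generating function.
    B = (1 - sp.sqrt(1 - 4 * z)) / (2 * z)
    # Coefficient of z^n = number of binary trees with n internal nodes.
    print(sp.series(B, z, 0, 8))  # 1 + z + 2*z**2 + 5*z**3 + 14*z**4 + ...

The same machinery extends to tracking cost parameters of algorithms, which is what makes exact "step counts" at least conceivable.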

With that all said, I'm open to the argument that TCS has fundamentally changed and this is no longer such a big issue. But perusing the various TCS conferences, I get the sense it hasn't.

[1]: https://www.youtube.com/watch?v=p3anjSAnKBo

[2]: https://algo.inria.fr/flajolet/Publications/book.pdf

1. openasocket No.43625382
> One of the nice things about Flajolet/Sedgewick[2] (mentioned in this post) is that it gives a nice set of combinators for (at least it seems to me, in theory) precisely enumerating how many "steps" an algorithm would take given some input. I optimistically imagine a world where we replace the relatively non-specific asymptotic bounds of an algorithm with a very specific quantification of the work an algorithm will take. And that this in turn would free TCS to focus on things that are more germane to the field as a whole.

Check out The Art of Computer Programming by Knuth. He writes all of the algorithms in a hypothetical machine code called MIX (MMIX in later editions), and he does go through algorithmic analysis to the degree you describe, showing exactly how many cycles a particular routine takes to complete as a function of different variables. This is actually fairly complex even for simple programs. Consider the simple case of computing the maximum of a list:

    let current_max := 0   // assumes the values are nonnegative; otherwise seed with the first element
    for x in array_of_numbers {
        if x > current_max {
            current_max := x
        }
    }
    return current_max

Anyone can tell you this runs in O(n) time. But exactly how many cycles would it take? Even ignoring architectural details, you would need to know how often the condition "x > current_max" is true. If the list is in ascending order, it is true on every iteration; if it is in descending order, it is true only once. The difference in cycle count between those two cases is significant. I believe Knuth works through this example and shows that, for a randomly ordered array, the condition is true around log_2(n) times on average.
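
A quick way to see this input-order dependence is to instrument the loop and count how often the branch is taken (a minimal Python sketch; the helper name count_updates is just for illustration):

    import random

    def count_updates(array_of_numbers):
        """Return how many times the 'x > current_max' branch is taken."""
        current_max = 0  # same nonnegative-values assumption as above
        updates = 0
        for x in array_of_numbers:
            if x > current_max:
                current_max = x
                updates += 1
        return updates

    n = 1000
    print(count_updates(list(range(1, n + 1))))   # ascending: branch taken every time
    print(count_updates(list(range(n, 0, -1))))   # descending: branch taken once
    shuffled = list(range(1, n + 1))
    random.shuffle(shuffled)
    print(count_updates(shuffled))                # random order: far fewer times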
2. antics No.43627140
I actually have read a chunk of the relevant TAOCP, but didn't mention it because the comment was already kind of long. I remarked to a friend the other day that it's kind of funny how we've come full circle on asymptotic analysis: now that we're counting FLOPs, it looks more than a little like what Knuth does here. :)
3. pfdietz No.43628868
Natural log of n, actually. That's because the expected number of times a new max is found is H_n, the nth harmonic number, where n is the size of the array. And H_n = ln n + gamma + o(1).
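
To spell out why (a sketch of the standard argument, plus a quick numerical sanity check in Python): in a uniformly random permutation of distinct values, the i-th element is the largest of the first i with probability 1/i, so the expected number of "new max" events is sum_{i=1}^{n} 1/i = H_n. Comparing H_n against ln n + gamma and log_2 n:

    import math

    def harmonic(n):
        return sum(1.0 / i for i in range(1, n + 1))

    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    for n in (10, 1000, 100000):
        print(n, harmonic(n), math.log(n) + gamma, math.log2(n))
    # H_n tracks ln(n) + gamma closely; log2(n) overshoots more and more as n grows.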
4. ogogmad No.43663155
I think this would need a CAS (computer algebra system, like Mathematica) to calculate the runtime of some code relative to different choices of computer architecture. With that, it looks practical, and it could even be added to IDEs. Without the help of a CAS, few people would attempt this, because it would be too tedious to do by hand.
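
As a rough sketch of what that might look like (SymPy here rather than Mathematica; the cycle-cost symbols are made-up placeholders, not real architecture numbers), you could keep the cost expression symbolic and only substitute per-architecture constants at the end:

    import sympy as sp

    n = sp.symbols('n', positive=True, integer=True)
    c_cmp, c_assign, c_loop = sp.symbols('c_cmp c_assign c_loop', positive=True)

    # Expected cost of the max-finding loop on a random permutation:
    # n comparisons, H_n expected assignments (see the thread above), n loop steps.
    expected_updates = sp.harmonic(n)
    cost = n * c_cmp + expected_updates * c_assign + n * c_loop
    print(cost)

    # Substitute one hypothetical architecture's cycle costs and a concrete n.
    print(cost.subs({c_cmp: 1, c_assign: 2, c_loop: 3, n: 1000}).evalf())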