
688 points samwho | 9 comments
fracus ◴[] No.45018314[source]
Having gone through electrical engineering, I always felt that Big O notation was never properly explained, unless I happened to miss it every time it was introduced. It always felt like it was just hand-waved as something we should already know. I'm curious what level of math or computer science is usually responsible for introducing it.
replies(7): >>45018615 #>>45018641 #>>45019080 #>>45020045 #>>45021112 #>>45027527 #>>45029029 #
1. leni536 ◴[] No.45018641[source]
A function f(x) is said to be O(g(x)) if f(x)/g(x) is bounded, that is, there is some C so that for every x, f(x)/g(x) < C.

In computer science, f(x) is often some complexity function, like the number of times some specific operation is performed when running an algorithm to completion.
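
To make that concrete, here's a throwaway check (my own example, not from the parent) that f(n) = 3n^2 + 5n is O(n^2): the ratio f(n)/n^2 stays below a fixed constant C (here C = 9) for every n >= 1.

    # Hypothetical example: f(n) = 3n^2 + 5n is O(n^2), since f(n)/n^2 = 3 + 5/n
    # never exceeds C = 9 for n >= 1 (it actually approaches 3 as n grows).
    def f(n):
        return 3 * n**2 + 5 * n

    def g(n):
        return n**2

    assert all(f(n) / g(n) < 9 for n in range(1, 10_000))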

replies(2): >>45020166 #>>45020570 #
2. mvdtnz ◴[] No.45020166[source]
What an asshole way of explaining it. For practical purposes (don't @ me if this is technically wrong, I don't give a shit) it's a description of how the cost of an algorithm grows with the size of the input. For example if the cost doubles every time the input increments then it's O(n^2). If the cost increases in step with the input size then it's O(n). Simples.
replies(4): >>45020348 #>>45020521 #>>45021650 #>>45023105 #
3. lern_too_spel ◴[] No.45020348[source]
If the cost doubles for each increment in the input size, it's O(2^n).
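
A rough way to see the difference (toy cost models, not tied to any particular algorithm): for O(n^2), doubling n multiplies the cost by about 4; for O(2^n), adding 1 to n doubles it.

    # Illustrative cost models only
    def quadratic(n):
        return n**2        # doubling n roughly quadruples the cost

    def exponential(n):
        return 2**n        # each +1 to n doubles the cost

    for n in (10, 20, 40):
        print(n, quadratic(n), exponential(n))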
replies(1): >>45020563 #
4. tauroid ◴[] No.45020521[source]
Okay, unless you're doing numerical analysis, where it's used for the growth rate of the error term, which has nothing to do with cost or algorithms.
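
For instance (a standard textbook example, not anything specific to the parent): the central-difference approximation of a derivative has an error term that is O(h^2) in the step size h, so halving h should shrink the error by roughly a factor of 4.

    import math

    # Error of the central difference (f(x+h) - f(x-h)) / (2h) for f = sin at x = 1;
    # the true derivative is cos(1). The error should scale like h^2.
    x = 1.0
    for h in (1e-1, 5e-2, 2.5e-2):
        approx = (math.sin(x + h) - math.sin(x - h)) / (2 * h)
        print(h, abs(approx - math.cos(x)))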
5. mvdtnz ◴[] No.45020563{3}[source]
Right you are, my bad.
6. blt ◴[] No.45020570[source]
This needs to be refined: f(x) is O(g(x)) if there exists some X >= 0 such that f(x)/g(x) is bounded for all x > X.

Otherwise, we cannot say that 1 is O(x), for example.
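
A quick sanity check of that example (my own sketch): near x = 0 the ratio 1/x is unbounded, so no single C works over all x > 0, but past the cutoff X = 1 it stays below, say, C = 2.

    # 1/x blows up as x approaches 0 ...
    print(max(1 / x for x in (10**-k for k in range(1, 10))))  # ~1e9

    # ... but is bounded once we only require x > X = 1
    assert all(1 / x < 2 for x in [1 + k / 100 for k in range(1, 10_000)])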

7. badosu ◴[] No.45021650[source]
I don't have a computer science education, but I had a math one. I found it a great explanation, but I understand why others would feel it's not.

It definitely is not an "asshole" way of explaining it though.

8. xigoi ◴[] No.45023105[source]
This is not only “technically” wrong, but completely wrong. O-notation has absolutely nothing to do with algorithms.
replies(1): >>45023329 #
9. writebetterc ◴[] No.45023329{3}[source]
Let's not overcorrect; of course O-notation has something to do with algorithms for the working programmer.