
688 points samwho | 2 comments
fracus No.45018314
Having gone through electrical engineering, I always felt that Big O notation was never properly explained, unless I happened to miss it every time it was introduced. It always felt like it was hand-waved as something we should already know. I'm curious which math or computer science course is usually responsible for introducing it.
replies(7): >>45018615 #>>45018641 #>>45019080 #>>45020045 #>>45021112 #>>45027527 #>>45029029 #
bongodongobob No.45019080
Comp sci 101. Learned it my first year. There really isn't anything to it. It just describes how the number of operations grows as the size of your algorithm's input increases. It looks scary, but it's very simple and straightforward.
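
To make that concrete, here's a minimal sketch in Python (the function names are made up for illustration, not from any particular course or library) that counts operations for a linear scan versus a pairwise comparison as the input grows:

    # Count "basic operations" to see how work grows with input size n.

    def linear_scan(items):
        """O(n): touches each element once."""
        ops = 0
        for _ in items:
            ops += 1
        return ops

    def pairwise_compare(items):
        """O(n^2): examines every pair of elements."""
        ops = 0
        for _ in items:
            for _ in items:
                ops += 1
        return ops

    for n in (10, 100, 1000):
        data = list(range(n))
        print(f"n={n:>4}: linear={linear_scan(data):>7} ops, "
              f"pairwise={pairwise_compare(data):>9} ops")

Running it shows the linear count growing in step with n while the pairwise count grows with n squared, which is exactly what the notation is capturing.
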
replies(2): >>45020265 #>>45023496 #
1. nwatson No.45020265
It can get a lot more complicated, though. Sometimes an algorithm's complexity has more parameters than just `n`. E.g., the big-O might involve `n`, `K`, and `ρ`, and its terms might combine those parameters.

In such cases, one might also know that, for a specific application of the algorithm, `ρ` is bounded, and so the big-O is dominated by `n` and `K`.
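
For example, here's a hypothetical sketch in Python; the k-means-style loop and the log(1/ρ) pass count are assumptions for illustration, not anything claimed above:

    import math

    # Hypothetical: cost depends on n (points), K (centers), and rho
    # (a convergence tolerance), not just n.

    def assignment_pass_ops(n, K):
        # One pass compares each of n points with each of K centers: O(n * K).
        return n * K

    def total_ops(n, K, rho):
        # Assume (for illustration only) about log(1/rho) passes to converge,
        # giving O(n * K * log(1/rho)) overall.
        passes = math.ceil(math.log(1 / rho))
        return assignment_pass_ops(n, K) * passes

    # If rho is known to be bounded for your application, the log(1/rho)
    # factor is a constant and the big-O is dominated by n and K.
    print(total_ops(1000, 10, 1e-3))  # 70000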

replies(1): >>45021271 #
2. bongodongobob No.45021271
Yeah, that's fair. Calc 1 should be enough to understand that.