
688 points samwho | 4 comments
dawnofdusk ◴[] No.45017382[source]
Whenever I read content like this about Big O notation I can't help but think the real solution is that computer science education should take calculus more seriously, and students/learners should not dismiss calculus as "useless" in favor of discrete math or other things that are more obviously CS related. For example, the word "asymptotic" is not used at all in this blog post. I have always thought that education, as opposed to mere communication, is not about avoiding jargon but explaining it.
replies(3): >>45017452 #>>45017670 #>>45017900 #
samwho ◴[] No.45017452[source]
Part of the problem is that a lot of people who come across big O notation have no need, interest, or time to learn calculus. I think it's reasonable for that to be the case, too.
replies(3): >>45017484 #>>45017558 #>>45018226 #
ndriscoll ◴[] No.45017558[source]
The thing is, this is like saying lots of mechanical engineers have no need, interest, or time to learn derivatives; they just want to get on with "forces" and "momentum" and cool stuff like "resonance". Saying you have no interest in learning limits and asymptotes, but want to know what people are talking about when they mention asymptotic analysis, doesn't make sense.

If you want to know what calculus-y words mean, you're going to need to learn calculus. People use calculus-y words to quickly convey things professionally. That's why it's a "topic" for you to learn. The thing under discussion is a limit.
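
For concreteness, the standard definition goes something like this (sketched in LaTeX notation; the limsup form assumes g is eventually positive):

    f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0
                   \iff \limsup_{n \to \infty} \frac{f(n)}{g(n)} < \infty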

replies(1): >>45017629 #
1. samwho ◴[] No.45017629[source]
I replied to this effect to someone else in this thread, but I think it's reasonable for people to want an idea of what big O is for (in software engineering!) without having a grounding in calculus. The notation is regularly used, and useful, without one.
replies(2): >>45018062 #>>45018069 #
2. dawnofdusk ◴[] No.45018062[source]
It's reasonable, but essentially every "common misconception about Big O" exists because people didn't have the necessary notions from calculus. For example, the fact that an O(x^2) algorithm can be faster in practice than an O(x) one, due to the size of constants and subdominant terms, is confusing only if you never properly learned what asymptotic behavior is.
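
A quick sketch in Python makes this concrete (the cost models and constants are made up purely for illustration):

    # Hypothetical cost models: an O(n) algorithm with a large per-element
    # constant vs an O(n^2) algorithm with a tiny per-pair constant.
    def linear_cost(n):      # ~1000 "operations" per element
        return 1000 * n

    def quadratic_cost(n):   # ~1 "operation" per pair of elements
        return n * n

    for n in (10, 100, 500, 2_000, 100_000):
        winner = "quadratic" if quadratic_cost(n) < linear_cost(n) else "linear"
        print(f"n={n:>7}: linear={linear_cost(n):>14,} quadratic={quadratic_cost(n):>14,} -> {winner} is cheaper")

Below n = 1000 the "worse" O(n^2) algorithm does less work under this model; the asymptotic statement only tells you what happens once n is large enough.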

The practical question is whether you think it's ok to continue propagating a crude and misunderstanding-prone idea of Big O. My stance is that we shouldn't: engineers are not business people or clients; they should understand what's happening, not rely on misleading cartoon pictures of it. I do not think you need a full-year collegiate course in calculus to get this understanding, but you certainly cannot get it if the calculus behind the idea is fully obscured (as this post and countless other blog-post explainers do).

3. ndriscoll ◴[] No.45018069[source]
Given the various ways people in this thread have pointed out you lack fluency with the notation, why do you think it reasonable for people to want to learn it without learning the concepts it's describing?
replies(1): >>45018517 #
4. samwho ◴[] No.45018517[source]
I’m not sure that’s quite my position. Happy to concede that I lack fluency, and I appreciate your time and the time others have given to help me understand.