
688 points by samwho | 1 comment
dawnofdusk No.45017382
Whenever I read content like this about Big O notation, I can't help but think the real solution is that computer science education should take calculus more seriously, and that students and learners should not dismiss calculus as "useless" in favor of discrete math or other things that are more obviously CS-related. For example, the word "asymptotic" is not used at all in this blog post. I have always thought that education, as opposed to mere communication, is not about avoiding jargon but about explaining it.
replies(3): >>45017452 #>>45017670 #>>45017900 #
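(A quick numeric sketch of the "asymptotic" idea under discussion, using a made-up example function: for f(n) = 3n² + 100n, the ratio f(n)/n² settles toward the constant 3 as n grows, which is exactly why the lower-order 100n term is dropped and f is said to be O(n²).)

```python
# Hypothetical example: asymptotically, the n^2 term dominates, so
# f(n) = 3n^2 + 100n is O(n^2). Watch the ratio f(n)/n^2 approach 3.
def f(n):
    return 3 * n**2 + 100 * n

for n in [10, 1_000, 100_000]:
    print(n, f(n) / n**2)
```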
samwho No.45017452
Part of the problem is that a lot of people who come across big O notation have no need, interest, or time to learn calculus. I think it's reasonable for that to be the case, too.
replies(3): >>45017484 #>>45017558 #>>45018226 #
growthwtf No.45017484
I'm not the original commenter, but that makes a lot of sense! I had personally assumed there was a huge overlap.
replies(1): >>45017515 #
samwho No.45017515
I think it's pretty common for folks to enter the software field without a CS degree, start building apps, and see big O notation without understanding what it is. These people have jobs and deadlines; they want to ship features that make people's lives easier. I'd bet many of those people don't care much about calculus, but a quick intro to what all this big O business is about could help them.
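(The kind of calculus-free intro described here can be sketched by simply counting steps. This hypothetical example compares linear search, which is O(n), with binary search on sorted data, which is O(log n); the function names and step counters are illustrative, not from the original post.)

```python
# Counting comparisons makes big O concrete without any calculus.
def linear_search(xs, target):
    """Scan left to right: up to n comparisons, so O(n)."""
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            break
    return steps

def binary_search(xs, target):
    """Halve a sorted range each time: about log2(n) comparisons, so O(log n)."""
    steps, lo, hi = 0, 0, len(xs) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            break
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

xs = list(range(1_000_000))
print(linear_search(xs, 999_999))  # 1,000,000 comparisons
print(binary_search(xs, 999_999))  # at most ~20 comparisons
```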