688 points by samwho | 17 comments
1. dawnofdusk ◴[] No.45017382[source]
Whenever I read content like this about Big O notation, I can't help but think the real solution is that computer science education should take calculus more seriously, and that students/learners should not dismiss calculus as "useless" in favor of discrete math or other things that are more obviously CS-related. For example, the word "asymptotic" is not used at all in this blog post. I have always thought that education, as opposed to mere communication, is not about avoiding jargon but explaining it.
replies(3): >>45017452 #>>45017670 #>>45017900 #
2. samwho ◴[] No.45017452[source]
Part of the problem is that a lot of people that come across big O notation have no need, interest, or time to learn calculus. I think it's reasonable for that to be the case, too.
replies(3): >>45017484 #>>45017558 #>>45018226 #
3. growthwtf ◴[] No.45017484[source]
I'm not the original commenter, but that makes a lot of sense! I had assumed there was a huge overlap, personally.
replies(1): >>45017515 #
4. samwho ◴[] No.45017515{3}[source]
I think it's pretty common for folks to enter the software field without a CS degree, start building apps, and see big O notation without understanding what it is. These people have jobs and deadlines; they want to ship features that make people's lives easier. I'd bet many of those people don't care so much about calculus, but a quick intro to what all this big O nonsense is about could help them.
5. ndriscoll ◴[] No.45017558[source]
The thing is, this is like saying lots of mechanical engineers have no need, interest, or time to learn derivatives; they just want to get on with "forces" and "momentum" and cool stuff like "resonance". Saying you have no interest in learning limits and asymptotes but you want to know what people are talking about when they mention asymptotic analysis doesn't make sense.

If you want to know what calculus-y words mean, you're going to need to learn calculus. People use calculus-y words to quickly convey things professionally. That's why it's a "topic" for you to learn. The thing under discussion is a limit.
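To make that concrete, the usual textbook formulation (assuming g(n) > 0 for large n) is:

    f(n) = O(g(n))  iff  limsup_{n→∞} |f(n)| / g(n) < ∞

i.e. the ratio f/g stays bounded in the limit. That's a statement about a limit, nothing more.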

replies(1): >>45017629 #
6. samwho ◴[] No.45017629{3}[source]
I replied to this effect to someone else in this thread, but I think it's reasonable for people to want an idea of what big O is for (in software engineering!) without having a grounding in calculus. The notation is useful, and regularly used, without one.
replies(2): >>45018062 #>>45018069 #
7. lenkite ◴[] No.45017670[source]
Apparently Big-O notation was invented by Paul Bachmann in 1894. It's not a johnny-come-lately.
8. crystal_revenge ◴[] No.45017900[source]
> students/learners should not dismiss calculus as "useless"

This seems to be quite a bit of a strawman to me.

ML is such a major part of the field today, and it requires at a minimum a fairly strong foundation in calculus, linear algebra, and probability theory, so I earnestly don't believe there are that many CS students who view calculus as "useless". I mean, anyone who has written some PyTorch has used calculus in a practical setting.
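A minimal sketch of that point (assuming PyTorch is installed; the numbers are just for illustration): every backward() call is a derivative computation.

    import torch

    x = torch.tensor(2.0, requires_grad=True)
    y = x ** 3      # y = x^3
    y.backward()    # autograd computes dy/dx at x = 2
    print(x.grad)   # tensor(12.), i.e. 3 * x^2 evaluated at x = 2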

Now in pure software engineering you will find a lot of people who don't know calculus, but even then I'm not sure any would decry it as useless; they would just admit they're scared to learn it.

If anything, I'm a bit more horrified at how rapidly people's understanding of things like the Theory of Computation seems to be vanishing. I've been shocked by how many CS grads I've talked to who don't really understand the relationship between regular languages and context-free grammars, don't understand what 'non-determinism' means in the context of computability, etc.

replies(2): >>45018105 #>>45018325 #
9. dawnofdusk ◴[] No.45018062{4}[source]
It's reasonable, but essentially every "common misconception about Big O" exists because people never learned the necessary notions from calculus. For example, the fact that an O(x^2) algorithm can be faster in practice than an O(x) one, due to the size of constants and subdominant terms, is confusing only if you never properly learned what asymptotic behavior is.
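A minimal sketch of that point, with made-up constant factors rather than any real algorithm's costs:

    # Hypothetical cost models; the constants are invented for illustration.
    def linear_cost(n):     # O(n), but with a large constant factor
        return 1000 * n

    def quadratic_cost(n):  # O(n^2), with a constant factor of 1
        return n * n

    for n in (10, 100, 1_000, 10_000):
        print(n, linear_cost(n), quadratic_cost(n))
    # The quadratic cost is lower until n reaches 1000, where the curves cross;
    # big O only describes which one wins as n grows without bound.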

The practical question is whether you think it's OK to continue propagating a rather crude and misunderstanding-prone idea of Big O. My stance is that we shouldn't: engineers are not businesspeople or clients; they should understand what's happening, not rely on misleading cartoon pictures of it. I do not think you need a full-year collegiate course in calculus to get this understanding, but you certainly cannot get it if you fully obscure the calculus behind the idea (as this post and countless other blog-post explainers do).

10. ndriscoll ◴[] No.45018069{4}[source]
Given the various ways people in this thread have pointed out you lack fluency with the notation, why do you think it reasonable for people to want to learn it without learning the concepts it's describing?
replies(1): >>45018517 #
11. dawnofdusk ◴[] No.45018105[source]
>This seems to be quite a bit of a strawman to me.

Not really. If you ever listen to CS undergrads or people in non-traditional schooling (bootcamps, etc.) talk about software engineering, this opinion is essentially ubiquitous. People interested in ML are less likely to hold this exact opinion, but they will hold qualitatively identical ones ("do you really need multivariable calculus/linear algebra to do ML?"). It is precisely because people (primarily Americans) are scared to learn mathematics that they rationalize away this fear by saying the necessary mathematics must not be essential, and indeed it is true that many people get away without knowing it.

replies(1): >>45019028 #
12. oooyay ◴[] No.45018226[source]
I find myself in the odd position of disagreeing with you and the person you responded to.

First, software engineering doesn't consist solely of Computer Science majors. We have a lot of people from accounting or physics, or people with no degree at all. Teaching this concept in CS programs fixes very little.

Second, and complementary to the first, asymptotic behavior is derivative of the lessons you learn in calculus. You can't really understand it beyond a facade unless you have a rudimentary understanding of calculus. If you want to put this theory to the test, ask someone with a functional understanding of Big-O to write asymptotic notation for a moderately complex function.
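As a sketch of what that test looks like, here's a made-up cost function, with sympy doing the limit:

    from sympy import symbols, log, limit, oo

    n = symbols('n', positive=True)
    f = 3*n**2 + 50*n*log(n) + 10**6   # a made-up, moderately complex cost

    # f(n)/n^2 tends to a finite nonzero constant, so f(n) = Theta(n^2):
    print(limit(f / n**2, n, oo))      # prints 3

Picking out the dominant term is literally a limit computation, which is the calculus connection.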

I don't have a degree, and in order to really understand asymptotics (Big-O as well as the other notations) I read a primer on calculus. It doesn't take a ton of knowledge or reading, but a decent background is what will get you there. I do think we need much better continuing education in software, going beyond O'Reilly-style technical books, and that could fill this gap.

13. marcosdumay ◴[] No.45018325[source]
Most people from almost every specialty dismiss calculus as useless, don't even remember linear algebra is a thing, and either never learn statistics or forget everything about it. (But the reaction to probability varies a lot.)

I have no idea what's up with those disciplines, but it's an almost universal reaction to them. Unless people are very clearly and directly using them all the time, they just get dismissed.

replies(1): >>45018560 #
14. samwho ◴[] No.45018517{5}[source]
I’m not sure that’s quite my position. Happy to cede that I lack fluency, and I appreciate your time and the time others have given to help me understand.
15. samwho ◴[] No.45018560{3}[source]
Can’t imagine what it might be.
replies(1): >>45029360 #
16. the_af ◴[] No.45019028{3}[source]
> non-traditional schooling (bootcamps, etc.)

It's probably unfair to qualify those as "computer science education" though...

17. marcosdumay ◴[] No.45029360{4}[source]
If that's sarcastic, do you have any idea why probability is different?