
346 points swatson741 | 1 comments
joshdavham ◴[] No.45788432[source]
Given that we're now in the year 2025 and AI has become ubiquitous, I'd be curious to estimate what percentage of developers now actually understand backprop.

It's a bit snarky of me, but whenever I see some web developer or product person with a strong opinion about AI and its future, I like to ask "but can you at least tell me how gradient descent works?"
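(For anyone who'd fail that quiz: gradient descent just means repeatedly stepping a parameter against its derivative until the loss stops shrinking. A minimal illustrative sketch, with a made-up function and learning rate:)

```python
# Minimal gradient descent: minimize f(x) = (x - 3)^2.
# The derivative f'(x) = 2 * (x - 3) points uphill,
# so we repeatedly step a small amount the other way.

def grad(x):
    return 2 * (x - 3)

x = 0.0    # arbitrary starting guess
lr = 0.1   # learning rate (step size), chosen for illustration
for _ in range(100):
    x -= lr * grad(x)

print(round(x, 4))  # has converged to the minimum at x = 3
```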

I'd like to see a future where more developers have a basic understanding of ML even if they never go on to do much of it. I think we would all benefit from being a bit more ML-literate.

replies(5): >>45788456 #>>45788499 #>>45788793 #>>45788867 #>>45798902 #
augment_me ◴[] No.45788867[source]
Impossible requirement. The whole point of abstractions is to let us get more done without understanding everything. We don't write raw assembly, you don't make fire by rubbing sticks, you don't go hunting for food in the woods, etc.

There is no need for the knowledge you propose in a world where this is a solved problem; you will achieve more of your goals by using higher-level tools.

replies(1): >>45789328 #
1. joshdavham ◴[] No.45789328[source]
I get your point, and it certainly applies to most modern computing, where each new layer of abstraction becomes so solid and reliable that devs can usually afford to just build on top of it without worrying about how it works. I don’t believe this applies to modern AI/ML, however. The chain rule, gradient descent, and basic statistics are IMO not yet abstracted away as solidly as other layers in computing. We can’t afford to not know these things. (At least not yet!)
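(To make the "chain rule" piece concrete: backprop is just the chain rule applied mechanically through a computation. A tiny hand-worked sketch, using a made-up one-parameter model, with a finite-difference check:)

```python
# Backprop in miniature: loss = (w*x + b - t)^2.
# Compute d(loss)/dw by the chain rule, then verify it
# against a numerical finite-difference estimate.

def loss(w, x, b, t):
    return (w * x + b - t) ** 2

w, x, b, t = 0.5, 2.0, 0.1, 1.0  # arbitrary illustrative values

# Forward pass
y = w * x + b            # prediction: 1.1
L = (y - t) ** 2         # loss: 0.01

# Backward pass (chain rule): dL/dw = dL/dy * dy/dw
dL_dy = 2 * (y - t)      # 0.2
dy_dw = x                # 2.0
dL_dw = dL_dy * dy_dw    # 0.4

# Numerical sanity check
eps = 1e-6
num = (loss(w + eps, x, b, t) - loss(w - eps, x, b, t)) / (2 * eps)
print(dL_dw, round(num, 4))  # the two gradients agree
```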