
302 points | sebg | 1 comment | source
cgadski ◴[] No.45051795[source]
> This blog post has explored the most critical equations in machine learning, from foundational probability and linear algebra to advanced concepts like diffusion and attention. With theoretical explanations, practical implementations, and visualizations, you now have a comprehensive resource to understand and apply ML math. Point anyone asking about core ML math here—they’ll learn 95% of what they need in one place!

It makes me sad to see LLM slop on the front page.

replies(2): >>45051866 #>>45057251 #
maerch ◴[] No.45051866[source]
Apart from the “—”, what else gives it away? Just asking from a non-native perspective.
replies(6): >>45051895 #>>45052056 #>>45052094 #>>45052168 #>>45055799 #>>45061696 #
cgadski ◴[] No.45052094[source]
It's not really about the language. If someone doesn't speak English well and wants to use a model to translate it, that's cool. What I'm picking up on is the dishonesty and vapidness. The article _doesn't_ explore linear algebra, it _doesn't_ have visualizations, it's _not_ a comprehensive resource, and reading this won't teach you anything beyond keywords and formulas.

What makes me angry about LLM slop is imagining how this looks to a student learning this stuff. Putting a post like this on your personal blog is implicitly saying: as long as you know some "equations" and remember the keywords, a language model can do the rest of the thinking for you! It's encouraging people to forgo learning.