
302 points | sebg | 1 comment | source
cgadski ◴[] No.45051795[source]
> This blog post has explored the most critical equations in machine learning, from foundational probability and linear algebra to advanced concepts like diffusion and attention. With theoretical explanations, practical implementations, and visualizations, you now have a comprehensive resource to understand and apply ML math. Point anyone asking about core ML math here—they’ll learn 95% of what they need in one place!

It makes me sad to see LLM slop on the front page.

replies(2): >>45051866 #>>45057251 #
maerch ◴[] No.45051866[source]
Apart from the “—“, what else gives it away? Just asking from a non-native perspective.
replies(6): >>45051895 #>>45052056 #>>45052094 #>>45052168 #>>45055799 #>>45061696 #
Romario77 ◴[] No.45052168[source]
It's just too bombastic for what it is: listing some equations with brief explanations and implementations.

If you don't already know these things on some level, the post doesn't give you much (far from 95%); it's a brief reference for some of the formulas used in machine learning/AI.

replies(1): >>45055367 #
random3 ◴[] No.45055367[source]
Slop brings back memories of literature teachers red-marking my "bombastic" terms in primary school essays.