
302 points sebg | 2 comments
cgadski No.45051795
> This blog post has explored the most critical equations in machine learning, from foundational probability and linear algebra to advanced concepts like diffusion and attention. With theoretical explanations, practical implementations, and visualizations, you now have a comprehensive resource to understand and apply ML math. Point anyone asking about core ML math here—they’ll learn 95% of what they need in one place!

It makes me sad to see LLM slop on the front page.

replies(2): >>45051866 #>>45057251 #
maerch No.45051866
Apart from the "—", what else gives it away? Just asking from a non-native speaker's perspective.
replies(6): >>45051895 #>>45052056 #>>45052094 #>>45052168 #>>45055799 #>>45061696 #
1. kace91 No.45051895
Not OP, but the giveaway is the final summary: it reads like the AI telling the user that the post they asked it to write is now complete.
replies(1): >>45061704 #
2. gandalfgreybeer No.45061704
I had stopped reading the post before that point and went back to check. It's so blatant, especially when it mentions visualizations.

> With theoretical explanations, practical implementations, and visualizations, you now have a comprehensive resource to understand and apply ML math. Point anyone asking about core ML math here—they’ll learn 95% of what they need in one place!