
Bayesian Statistics: The three cultures

(statmodeling.stat.columbia.edu)
309 points by luu | 2 comments
brcmthrowaway
Where does Deep Learning come in?
thegginthesky
Most models are derived from Machine Learning principles that mix classic probability theory, Frequentist and Bayesian statistics, and a lot of Computer Science fundamentals. But there have been advances in Bayesian Inference and Bayesian Deep Learning; you should check out frameworks like Pyro (built on top of PyTorch).
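For a flavor of what Pyro looks like, here is a minimal sketch (a toy example of my own, with made-up data and hyperparameters): a Bayesian linear regression fit with stochastic variational inference.

    import torch
    import pyro
    import pyro.distributions as dist
    from pyro.infer import SVI, Trace_ELBO
    from pyro.infer.autoguide import AutoNormal

    # Toy data: y = 2x + noise (made up for illustration)
    x = torch.linspace(0., 1., 50)
    y = 2.0 * x + 0.1 * torch.randn(50)

    def model(x, y=None):
        # Priors turn the regression weights into random variables
        w = pyro.sample("w", dist.Normal(0., 1.))
        b = pyro.sample("b", dist.Normal(0., 1.))
        sigma = pyro.sample("sigma", dist.HalfNormal(1.))
        with pyro.plate("data", x.shape[0]):
            pyro.sample("obs", dist.Normal(w * x + b, sigma), obs=y)

    # Fit an approximate posterior over w, b, sigma instead of a point estimate
    guide = AutoNormal(model)
    svi = SVI(model, guide, pyro.optim.Adam({"lr": 0.02}), loss=Trace_ELBO())
    for _ in range(1000):
        svi.step(x, y)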

Edit: corrected my sentence, but see 0xdde's reply for better info.

0xdde
I could be wrong, but my sense is that ML has leaned Bayesian for a very long time. For example, even Bishop's widely used book from 2006 [1] is Bayesian. Not sure how Bayesian his new deep learning book is.

[1] https://www.microsoft.com/en-us/research/publication/pattern...

thegginthesky
I stand corrected! My impression was that many methods used in ML, such as Support Vector Machines, Decision Trees, Random Forests, Boosting, Bagging, and so on, have deep roots in Frequentist methods, although current CS implementations lean heavily on optimization techniques such as Gradient Descent; see the toy sketch below.
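As an illustration of that optimization-first style (made-up data, arbitrary step size), plain gradient descent on a least-squares objective looks like this:

    import torch

    # Made-up data: y = 2x + noise
    x = torch.linspace(0., 1., 50)
    y = 2.0 * x + 0.1 * torch.randn(50)

    w = torch.zeros(1, requires_grad=True)
    for _ in range(200):
        loss = ((w * x - y) ** 2).mean()  # empirical risk: a point estimate, no posterior
        loss.backward()
        with torch.no_grad():
            w -= 0.5 * w.grad  # fixed step size, chosen arbitrarily
            w.grad.zero_()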

Giving Bishop's book a cursory look, I see that I was wrong: there are deep roots in Bayesian Inference as well.

On another note, I find it very interesting that there isn't a bigger emphasis on using the correct distributions in ML models, as the methods are much more concerned with optimizing objective functions.
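One way to see the connection (a toy sketch of my own, not from the thread): the common objectives already encode distributional assumptions, since minimizing them is maximum likelihood under a particular noise model.

    import torch

    # Up to additive constants, the negative log-likelihood of y under a
    # Gaussian noise model with fixed scale is the mean squared error...
    def gaussian_nll(pred, y):
        return 0.5 * ((pred - y) ** 2).mean()

    # ...while a Laplace noise model gives the mean absolute error.
    # Choosing the objective is implicitly choosing the distribution.
    def laplace_nll(pred, y):
        return (pred - y).abs().mean()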