No, got a link?
Searching for them did bring me to an interesting discussion:
https://www.lesswrong.com/posts/YSFRazdoWXKHgNakz/link-the-b... (2015)
and then to:
https://bayes.wustl.edu/etj/articles/second.law.pdf (1998)
which confirms my suspicions, but also sheds light on how old the confusion is!
There are a bunch of assumptions that are easy to make (because they are almost always true), but very hard to get rid of when they aren't:
- that entropy is objective/ontological rather than subjective/epistemic
- that entropy is equivalent to disorder
- that temperature can always be defined
- that entropy is extensive
- (I think there was at least one more, but I had to do something else in between and don't remember it now)
- oh yeah, maybe it was that there's a difference between a distribution and a macrostate? (not sure about that one myself)
Now, I don't know what the Bayesian framework can bring to the table here (not being sufficiently familiar with it down to the nuts and bolts of calculations), but if it can prevent us (and future students) from making these mistakes over and over and over again, it would be real progress.
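For what "nuts and bolts" means concretely: the core MaxEnt calculation (this is my own toy sketch, not anything from Jaynes's paper) is to pick, among all distributions consistent with what you actually know, the one with maximum entropy. With a finite set of made-up "energy" values and a single mean constraint, the answer is the Boltzmann form p_i ∝ exp(-λ v_i), and λ can be found by simple bisection:

```python
import math

# Toy maximum-entropy calculation: among all distributions over a finite
# set of "energy" values with a given mean, find the one with maximum
# entropy. The solution has the Boltzmann form p_i ∝ exp(-lam * v_i),
# with the multiplier lam fixed by the mean constraint.

def maxent_dist(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """MaxEnt distribution over `values` subject to sum(p_i * v_i) = target_mean."""
    def mean_for(lam):
        # Shift exponents by their max to avoid overflow; the shift cancels
        # in the normalized mean.
        ex = [-lam * v for v in values]
        m = max(ex)
        w = [math.exp(e - m) for e in ex]
        z = sum(w)
        return sum(wi * v for wi, v in zip(w, values)) / z

    # mean_for is monotonically decreasing in lam, so bisect for the
    # multiplier that reproduces the known mean.
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0

    ex = [-lam * v for v in values]
    m = max(ex)
    w = [math.exp(e - m) for e in ex]
    z = sum(w)
    return [wi / z for wi in w]

def entropy(p):
    """Shannon/Gibbs entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)
```

With the mean pinned at the midpoint of the values, this returns the uniform distribution; pushing the constrained mean lower (i.e. knowing more) lowers the entropy, which is exactly the epistemic reading: the entropy measures what *we* don't know given the constraints, not a property of the system by itself.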