
What Is Entropy?

(jasonfantl.com)
287 points | jfantl | 1 comment
1. asdf_snar No.43689936
I throw these quotes by Y. Oono into the mix because they provide viewpoints that are in some tension with those who take the -\sum_x p(x) log p(x) definition of entropy as fundamental.

> Boltzmann’s argument summarized in Exercise of 2.4.11 just derives Shannon’s formula and uses it. A major lesson is that before we use the Shannon formula important physics is over.

> There are folklores in statistical mechanics. For example, in many textbooks ergodic theory and the mechanical foundation of statistical mechanics are discussed even though detailed mathematical explanations may be missing. We must clearly recognize such topics are almost irrelevant to statistical mechanics. We are also brainwashed that statistical mechanics furnishes the foundation of thermodynamics, but we must clearly recognize that without thermodynamics statistical mechanics cannot be formulated. It is a naive idea that microscopic theories are always more fundamental than macroscopic phenomenology.

sources:
http://www.yoono.org/download/inst.pdf
http://www.yoono.org/download/smhypers12.pdf
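For concreteness, the Shannon formula mentioned above is straightforward to compute from a probability distribution. A minimal Python sketch (function name and unit choice are illustrative, not from the sources):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum_x p(x) * log p(x), in nats.

    Zero-probability outcomes are skipped, using the convention
    that 0 * log 0 = 0 (the limit as p -> 0).
    """
    return -sum(px * math.log(px) for px in p if px > 0)

# The uniform distribution over n outcomes maximizes entropy at log(n).
uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(uniform))  # log(4), about 1.386 nats

# Any deterministic outcome has zero entropy.
print(shannon_entropy([1.0, 0.0, 0.0]))  # 0.0
```

Note this computes a number from a given p(x); Oono's point is that the physics lies in justifying which distribution and which macroscopic setup make that number meaningful, not in the formula itself.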