
What Is Entropy?

(jasonfantl.com)
287 points by jfantl | 1 comment
1. vitus No.43686995
The problem with this explanation (and with many others) is that it misses why we should care about "disorder" or "uncertainty", whether in information theory or statistical mechanics. Yes, we have the arrow of time argument (second law of thermodynamics, etc), and entropy breaks time-symmetry. So what?

The article hints very briefly at this with its discussion of an unequally-weighted die, and how by encoding the most common outcome with a single bit, you can achieve some compression. That's a start, and we've now rediscovered the idea behind Huffman coding. What information theory tells us is that if you encode a sequence of two dice rolls as a single block, you can use even fewer bits per roll on average, and so on; as you take the block length to infinity, the average number of bits per roll approaches the entropy of the source. (This is Shannon's source coding theorem, and while entropy plays a far greater role in information theory, this is at least a starting point.)
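Here's a minimal sketch of that convergence (my own illustration, not from the article; the die weights are made up). It builds Huffman codes over blocks of 1, 2, and 3 rolls and compares the average bits per roll against the entropy:

    # Empirical check: Huffman coding over longer blocks approaches
    # the source entropy. Die probabilities are illustrative only.
    import heapq
    import itertools
    import math

    def entropy(probs):
        """Shannon entropy in bits per symbol."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def huffman_lengths(probs):
        """Code length Huffman's algorithm assigns to each symbol."""
        # Heap items: (probability, unique tiebreaker, symbol indices)
        heap = [(p, i, [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        tiebreak = len(probs)
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for s in s1 + s2:   # each merge adds one bit to these symbols
                lengths[s] += 1
            heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
            tiebreak += 1
        return lengths

    die = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]   # unequally weighted die
    print(f"entropy: {entropy(die):.4f} bits/roll")

    for block in (1, 2, 3):
        # Product distribution over all length-`block` roll sequences.
        seqs = [math.prod(c) for c in itertools.product(die, repeat=block)]
        lens = huffman_lengths(seqs)
        avg = sum(p * l for p, l in zip(seqs, lens)) / block
        print(f"block length {block}: {avg:.4f} bits/roll")

Run it and the per-roll average creeps down toward the entropy bound as the block length grows, which is the source coding theorem showing up numerically.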

There's something magical about statistical mechanics: various quantities (e.g. energy, temperature, pressure) emerge as partial derivatives of this "partition function", and they turn out to be the same quantities we've known all along (up to a scaling factor -- in my stat mech class, I recall using k_B * T for temperature, so that everything was expressed in units of energy).
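As a concrete example (my own sketch, not from the comment): for a two-level system with energies 0 and 1, in units where k_B = 1, the average energy <E> = -d(ln Z)/d(beta) with Z = sum_i exp(-beta * E_i) and beta = 1/T. The code below checks the derivative identity numerically against the direct Boltzmann average:

    import math

    energies = [0.0, 1.0]   # two-level system, eps = 1 in these units

    def log_Z(beta):
        return math.log(sum(math.exp(-beta * E) for E in energies))

    def avg_energy_direct(beta):
        Z = math.exp(log_Z(beta))
        return sum(E * math.exp(-beta * E) for E in energies) / Z

    beta, h = 2.0, 1e-6
    # Central finite difference for -d(ln Z)/d(beta)
    numeric = -(log_Z(beta + h) - log_Z(beta - h)) / (2 * h)
    print(numeric, avg_energy_direct(beta))   # the two agree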

https://en.wikipedia.org/wiki/Partition_function_(statistica...

https://en.wikipedia.org/wiki/Fundamental_thermodynamic_rela...

If you're dealing with a sea of electrons, you might apply the Pauli exclusion principle to derive the Fermi-Dirac statistics that underpin all of semiconductor physics; if instead you're dealing with photons, which can occupy the same energy state, the same statistical principles lead to Bose-Einstein statistics.
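For concreteness, here's what those two occupation-number formulas look like as code (my own sketch; the energy, chemical potential, and temperature values are illustrative, with k_B = 1 so T is in energy units):

    import math

    def fermi_dirac(E, mu, T):
        """Mean occupation <n> = 1 / (exp((E - mu)/T) + 1); never
        exceeds 1, reflecting the Pauli exclusion principle."""
        return 1.0 / (math.exp((E - mu) / T) + 1.0)

    def bose_einstein(E, mu, T):
        """Mean occupation <n> = 1 / (exp((E - mu)/T) - 1); valid for
        E > mu, and grows without bound as E approaches mu."""
        return 1.0 / (math.exp((E - mu) / T) - 1.0)

    for E in (0.5, 1.0, 2.0):
        print(E, fermi_dirac(E, mu=1.0, T=0.3),
                 bose_einstein(E, mu=0.0, T=0.3))

The only difference is the sign in the denominator, but it's exactly what caps fermion occupation at one per state while letting bosons pile up.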

Statistical mechanics is ultimately about taking certain assumptions about how particles interact, scaling up to numbers of particles far beyond our ability to model individually, and applying statistical approximations to capture the average behavior of the ensemble. The various forms of entropy are building blocks to that end.