
What Is Entropy? (jasonfantl.com)
287 points by jfantl | 1 comment
FilosofumRex No.43690280
Boltzmann and Gibbs turn in their graves every time some information theorist mutilates their beloved entropy. Shannon and von Neumann were hacking out a new theory of communication, not doing real physics, and they never meant to equate thermodynamic concepts with encoding techniques. But alas, dissertations are now written on it.

Entropy can't be a measure of uncertainty, because all the uncertainty is already in the probability distribution p(x); multiplying it by its own logarithm and summing doesn't tell us anything new. If it did, it would violate quantum principles, including the Bell inequality and the Heisenberg uncertainty principle.
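
For concreteness, the computation in question, as a minimal Python sketch (the distributions are made up for illustration):

    import math

    def shannon_entropy(p):
        # H = -sum_x p(x) * log2(p(x)), in bits; terms with p(x) = 0 contribute 0
        return -sum(px * math.log2(px) for px in p if px > 0)

    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 outcomes: 2.0 bits
    print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # sharply peaked: ~0.24 bits

It prints one number per distribution: maximal for the uniform case, near zero when one outcome is nearly certain.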

The article never mentions the simplest and most basic definition of entropy, i.e. its units (kJ/K), nor the third law of thermodynamics, which is the basis for its measurement.
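
For reference, the standard textbook definitions being alluded to (my summary, not from the article), in LaTeX:

    % Clausius definition: entropy change along a reversible path
    dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad [S] = \mathrm{J/K}
    % Third law: S \to 0 as T \to 0 for a perfect crystal, which fixes the
    % integration constant and makes absolute entropies measurable:
    S(T) = \int_0^T \frac{C(T')}{T'}\, dT'

where C is the heat capacity along the path (assuming a simple heating process).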

“Every physicist knows what entropy is. Not one can write it down in words.” (Clifford Truesdell)

kgwgk No.43690648
> Entropy can't be a measure of uncertainty

Gibbs’ entropy is derived from “the probability that an unspecified system of the ensemble (i.e. one of which we only know that it belongs to the ensemble) will lie within the given limits” in phase space. That is the “coefficient of probability” of the phase; its logarithm is the “index of probability” of the phase; and the entropy is minus the average of that index.
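
Spelled out in modern notation (my paraphrase of Gibbs’ definitions, not a quote):

    % P is the "coefficient of probability" of the phase (q, p):
    \Pr\{\text{system in } dq\, dp\} = P(q, p)\, dq\, dp
    % eta is the "index of probability":
    \eta = \ln P
    % and the entropy is minus its ensemble average:
    S = -\overline{\eta} = -\int P \ln P \, dq\, dp

which is exactly the Shannon form, up to Boltzmann's constant.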

Of course the probability distribution corresponds to the uncertainty. That’s why the entropy is defined from the probability distribution.

Your claim sounds like saying that the area of a polygon cannot be a measure of its extension because the extension is given by the shape and calculating the area doesn’t tell us anything new.