
What Is Entropy?

(jasonfantl.com)
287 points | jfantl | 1 comment
FilosofumRex No.43690280
Boltzmann and Gibbs turn in their graves every time some information theorist mutilates their beloved entropy. Shanon & Von Neumann were hacking a new theory of communication, not doing real physics, and never meant to equate thermodynamic concepts to encoding techniques. But alas, dissertations are now written on it.

Entropy can't be a measure of uncertainty, because all the uncertainty is already in the probability distribution p(x): multiplying it by its own logarithm and summing doesn't tell us anything new. If it did, it would violate quantum principles, including the Bell inequality and the Heisenberg uncertainty principle.
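
(For reference, the sum in question is Shannon's entropy of a discrete distribution p(x), in LaTeX:

  H(X) = -\sum_{x} p(x) \log p(x)

so the claim above is that H adds nothing beyond what p(x) already encodes.)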

The article never mentions the simplest and most basic characterization of entropy, i.e. its units (kJ/K), nor the third law of thermodynamics, which is the basis for its measurement.
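
(For concreteness, those units follow from Boltzmann's definition,

  S = k_B \ln W,

where W counts microstates and k_B ≈ 1.380649 × 10⁻²³ J/K; the third law then supplies the reference point, S → 0 as T → 0 for a perfect crystal.)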

“Every physicist knows what entropy is. Not one can write it down in words.” Clifford Truesdell

replies(2): >>43690413 >>43690648
kgwgk No.43690413
> Shanon & Von Neumann were hacking a new theory of communication, not doing real physics

Maybe I’m misunderstanding the reference to von Neumann, but his work on entropy was about physics, not about communication.

replies(2): >>43691139 >>43698951
nanna No.43691139
Think the parent has confused Von Neumann with Wiener. They've also misspelled Shannon.