
What Is Entropy?

(jasonfantl.com)
287 points | jfantl | source
FilosofumRex ◴[] No.43690280[source]
Boltzmann and Gibbs turn in their graves, every time some information theorist mutilates their beloved entropy. Shanon & Von Neumann were hacking a new theory of communication, not doing real physics and never meant to equate thermodynamic concepts to encoding techniques - but alas now dissertations are written on it.

Entropy can't be a measure of uncertainty, because all the uncertainty is already in the probability distribution p(x) - multiplying it by its own logarithm and summing doesn't tell us anything new. If it did, it would violate quantum principles, including the Bell inequality and the Heisenberg uncertainty principle.
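(For concreteness, the "multiplying by its own logarithm and summing" step is just how Shannon's H collapses a distribution into a single number. A minimal sketch, not from the article:)

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin is maximally uncertain over two outcomes: H = 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A certain outcome carries no uncertainty: H = 0.
print(shannon_entropy([1.0, 0.0]))   # 0.0
```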

The article never mentions the simplest and most basic characterization of entropy, i.e. its units (kJ/K), nor the third law of thermodynamics, which is the basis for its measurement.
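(The units come from Boltzmann's constant: thermodynamic entropy is S = k_B · H, where H is the dimensionless statistical entropy in nats. A quick sketch of the conversion, as an illustration only:)

```python
import math

# Boltzmann's constant (exact in the 2019 SI redefinition), units J/K.
k_B = 1.380649e-23

# One fair coin's worth of uncertainty, expressed in nats:
H_nats = math.log(2)

# The same uncertainty expressed as thermodynamic entropy, in J/K:
S = k_B * H_nats
print(S)  # ≈ 9.57e-24 J/K
```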

“Every physicist knows what entropy is. Not one can write it down in words.” Clifford Truesdell

replies(2): >>43690413 #>>43690648 #
kgwgk ◴[] No.43690413[source]
> Shanon & Von Neumann were hacking a new theory of communication, not doing real physics

Maybe I’m misunderstanding the reference to von Neumann but his work on entropy was about physics, not about communication.

replies(2): >>43691139 #>>43698951 #
FilosofumRex ◴[] No.43698951[source]
More precisely, von Neumann was extending Shannon's information-theoretic entropy to quantum channels, restated as S(ρ) = -Tr(ρ ln ρ) - again showing that information-theoretic entropy reveals nothing more about a system than its density matrix ρ.
replies(1): >>43702056 #
kgwgk ◴[] No.43702056[source]
It’s quite remarkable that in his 1927 paper “The thermodynamics of quantum-mechanical ensembles” von Neumann was extending the mathematical theory of communication that Shannon - who was 11 at the time - would only publish decades later.
replies(1): >>43720874 #
FilosofumRex ◴[] No.43720874{3}[source]
Dude, read your own reference... There is no mention of information or communication theory anywhere in his 1927 paper or 1932 book. Young von Neumann was doing real physics, extending and updating Gibbs's entropy.

OTOH, the older von Neumann was wealthy, hobnobbing with politicians and glitterati, musing about life, biology, economics, and anything else that would amuse his social circles. "Entropy", as he's alleged to have told Shannon, was his ace in the hole for winning arguments.

The formal similarity with Shannon's entropy is superfluous and conveys no new information about any system, quantum or otherwise. But it does make for lots of PhD dissertations, for exactly the reason von Neumann stated.

replies(1): >>43721739 #
kgwgk ◴[] No.43721739{4}[source]
> There is no mention of information or communication theory anywhere in his 1927 paper or 1932 book. Young von Neumann was doing real physics, extending and updating Gibbs's entropy.

We agree then! John von Neumann's work on entropy was about physics, not about communication theory. S(ρ) = -Tr(ρ ln ρ) is physics. If you still claim that he "was extending Shannon's information theoretic entropy to quantum channels" at some point, could you maybe give a reference?

> Formal similarity with Shannon's entropy is superfluous and conveys no new information about any system, quantum or otherwise

What I still don’t understand is your fixation on that.

“Entropy can't be a measure of uncertainty, because all the uncertainty is in the probability distribution p(x)” makes zero sense given that the entropy is a property of the probability distribution. (Any measure of “all the uncertainty” which is “in the probability distribution p(x)” will be a property of p(x). The entropy checks that box so why can’t it be a measure of uncertainty?)

It is a measure of the uncertainty in the probability distribution that describes a physical system in statistical mechanics. It is a measure of the lack of knowledge about the system. For a quantum system, von Neumann’s entropy becomes zero when the density matrix corresponds to a pure state and there is nothing left to know.
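(A small numerical sketch of that last point, using S(ρ) = -Tr(ρ ln ρ) computed from the eigenvalues of ρ; the example states are illustrative, not from the thread:)

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from rho's eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # 0 * ln 0 -> 0 by convention
    # max() guards against -0.0 / tiny negative rounding artifacts.
    return max(0.0, float(-np.sum(evals * np.log(evals))))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # pure state |0><0|
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit

print(von_neumann_entropy(pure))   # 0.0 -- nothing left to know
print(von_neumann_entropy(mixed))  # ≈ 0.693 (ln 2) -- maximal ignorance
```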