
What Is Entropy?

(jasonfantl.com)
287 points jfantl | 8 comments
1. TexanFeller ◴[] No.43686830[source]
I don’t see Sean Carroll’s musings mentioned yet, so repeating my previous comment:

Entropy got a lot more exciting to me after hearing Sean Carroll talk about it. He has a foundational/philosophical bent and likes to point out that there are competing definitions of entropy set on different philosophical foundations, one of them seemingly observer dependent:

- https://youtu.be/x9COqqqsFtc?si=cQkfV5IpLC039Cl5
- https://youtu.be/XJ14ZO-e9NY?si=xi8idD5JmQbT5zxN

Leonard Susskind has lots of great talks and books about quantum information and calculating the entropy of black holes which led to a lot of wild new hypotheses.

Stephen Wolfram gave a long talk about the history of the concept of entropy which was pretty good: https://www.youtube.com/live/ocOHxPs1LQ0?si=zvQNsj_FEGbTX2R3

replies(3): >>43688517 #>>43688804 #>>43688816 #
2. infogulch ◴[] No.43688517[source]
Half a year after that talk Wolfram appeared on a popular podcast [1] to discuss his book on the Second Law of Thermodynamics [2]. That discussion contained the best one-sentence description of entropy I've ever heard:

> Entropy is the logarithm of the number of states that are consistent with what you know about a system.

[1]: Mystery of Entropy FINALLY Solved After 50 Years? (Stephen Wolfram) - Machine Learning Street Talk Podcast - https://www.youtube.com/watch?v=dkpDjd2nHgo

[2]: The Second Law: Resolving the Mystery of the Second Law of Thermodynamics - https://www.amazon.com/Second-Law-Resolving-Mystery-Thermody...
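
To make that one-sentence definition concrete, here's a tiny Python sketch of my own (not from the talk or the book): count the microstates of a system that are consistent with what you know, and take the log.

    import math

    def entropy_bits(num_consistent_states: int) -> float:
        # Entropy = log (here base 2) of the number of microstates
        # consistent with what we know about the system.
        return math.log2(num_consistent_states)

    # System: 100 coin flips. Knowing only "exactly 50 came up heads",
    # every sequence with 50 heads is consistent with our knowledge.
    print(entropy_bits(math.comb(100, 50)))  # ~96.3 bits

    # Knowing the exact sequence, only one microstate is consistent.
    print(entropy_bits(1))                   # 0.0 bits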

replies(1): >>43689758 #
3. ◴[] No.43688804[source]
4. gsf_emergency ◴[] No.43688816[source]
By Jeeves, it's rentropy!!

Sean and Stephen are absolutely thoughtful popularizers, but complexity, not entropy, is what they are truly interested in talking about.

Although it doesn't make complexity less scary, here's something Sean's been working on for more than a decade. The paper seems to be more accessible to the layman than he thinks.

https://arxiv.org/abs/1405.6903
https://scottaaronson.blog/?p=762

[When practitioners say "entropy", they mean RELATIVE ENTROPY, which is another can of worms. Rentropy is the one that is observer dependent: "that's Relative as in Relativity". Entropy by itself is simple; blame von Neumann for making it live rent-free.]

https://en.wikipedia.org/wiki/Relative_entropy
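
For anyone who wants to see the observer dependence as arithmetic, a minimal sketch (my own toy numbers, not from the paper above): relative entropy D(p‖q) measures p against a reference model q, so two observers with different models q assign different values.

    import numpy as np

    def relative_entropy_bits(p, q):
        # D(p || q) = sum_i p_i * log2(p_i / q_i)
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return float(np.sum(p * np.log2(p / q)))

    # A fair coin, judged against two different observers' models:
    print(relative_entropy_bits([0.5, 0.5], [0.9, 0.1]))  # ~0.74 bits
    print(relative_entropy_bits([0.5, 0.5], [0.5, 0.5]))  # 0.0 bits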

@nyrikki below hints (too softly, imho) at this:

>You can also approach the property that people often want to communicate when using the term entropy as effective measure 0 sets, null cover, martingales, Kolmogorov complexity, compressibility, set shattering, etc...
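
One cheap way to poke at the compressibility item in that list (my own toy example, nothing from nyrikki's comment): a general-purpose compressor gives a crude upper bound on entropy, so structured data shrinks and random data doesn't.

    import os
    import zlib

    low_entropy = b"abab" * 2500         # 10,000 bytes of a repeating pattern
    high_entropy = os.urandom(10_000)    # 10,000 bytes of (pseudo)random data

    print(len(zlib.compress(low_entropy)))   # a few dozen bytes
    print(len(zlib.compress(high_entropy)))  # ~10,000 bytes, no real shrinkage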

5. frank20022 ◴[] No.43689758[source]
By that definition, the entropy of a game of chess decreases with time, because as the game goes on there are fewer possible legal states. Did I get that right?
replies(2): >>43690119 #>>43694685 #
6. dist-epoch ◴[] No.43690119{3}[source]
It's about subjective knowledge, not objective knowledge.

So entropy is not determined by the objective number of remaining legal states, but by how many states are consistent with what a given observer knows.

If I know the seed of a PRNG, the entropy of the numbers it generates is zero for me. If I don't know the seed, it has very high entropy.

https://www.quantamagazine.org/what-is-entropy-a-measure-of-...
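
Putting the PRNG point in code (a toy illustration of my own, not from the linked article): the output is a deterministic function of the seed, so an observer who knows the seed has exactly one sequence consistent with their knowledge, while an observer who only knows the seed is, say, some 32-bit value has 2^32.

    import math
    import random

    def prng_output(seed: int) -> bytes:
        # The output is a deterministic function of the seed.
        return random.Random(seed).randbytes(16)

    # Observer A knows the seed: one consistent sequence -> 0 bits.
    entropy_a = math.log2(1)

    # Observer B only knows the seed is some 32-bit value:
    # up to 2**32 consistent sequences -> 32 bits.
    entropy_b = math.log2(2**32)

    print(prng_output(1234).hex(), entropy_a, entropy_b)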

replies(1): >>43692764 #
7. HelloNurse ◴[] No.43692764{4}[source]
Low entropy in chess means future moves can be predicted effectively, which is far less useful than dedicating the equivalent effort to choosing a good next move for ourselves.
8. infogulch ◴[] No.43694685{3}[source]
Sure. Lots of games result in a reduction of game-state entropy as the game progresses. Many card games, for example, could be described as unnecessarily complicated ways to sort a deck. When analyzing games with respect to the Second Law, consider that "the system" is not simply the current game state; it should at least include captured pieces and human choices.
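
As a rough illustration of the deck-sorting point (my numbers, not anything from the parent comments): a shuffle you know nothing about has 52! orderings consistent with your knowledge; once play has forced the deck into a known order, only one remains.

    import math

    shuffled = math.factorial(52)   # any of 52! orderings is consistent
    sorted_deck = 1                 # the final, fully known order

    print(math.log2(shuffled))      # ~225.6 bits
    print(math.log2(sorted_deck))   # 0.0 bits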