What Is Entropy?

(jasonfantl.com)
287 points by jfantl | 2 comments
TexanFeller ◴[] No.43686830[source]
I don’t see Sean Carroll’s musings mentioned yet, so repeating my previous comment:

Entropy got a lot more exciting to me after hearing Sean Carroll talk about it. He has a foundational/philosophical bent and likes to point out that there are competing definitions of entropy set on different philosophical foundations, one of them seemingly observer-dependent:

- https://youtu.be/x9COqqqsFtc?si=cQkfV5IpLC039Cl5

- https://youtu.be/XJ14ZO-e9NY?si=xi8idD5JmQbT5zxN

Leonard Susskind has lots of great talks and books about quantum information and calculating the entropy of black holes which led to a lot of wild new hypotheses.

Stephen Wolfram gave a long talk about the history of the concept of entropy which was pretty good: https://www.youtube.com/live/ocOHxPs1LQ0?si=zvQNsj_FEGbTX2R3

replies(3): >>43688517 #>>43688804 #>>43688816 #
infogulch ◴[] No.43688517[source]
Half a year after that talk Wolfram appeared on a popular podcast [1] to discuss his book on the Second Law of Thermodynamics [2]. That discussion contained the best one-sentence description of entropy I've ever heard:

> Entropy is the logarithm of the number of states that are consistent with what you know about a system.

[1]: Mystery of Entropy FINALLY Solved After 50 Years? (Stephen Wolfram) - Machine Learning Street Talk Podcast - https://www.youtube.com/watch?v=dkpDjd2nHgo

[2]: The Second Law: Resolving the Mystery of the Second Law of Thermodynamics - https://www.amazon.com/Second-Law-Resolving-Mystery-Thermody...
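Wolfram's one-liner can be made concrete with a tiny sketch (my own illustration, not from the book): entropy in bits is just log2 of how many microstates remain consistent with what the observer knows.

```python
import math

def entropy_bits(num_consistent_states: int) -> float:
    # Entropy as the logarithm of the number of states consistent
    # with what you know about the system (log base 2 gives bits).
    return math.log2(num_consistent_states)

# Four fair coins, observer knows nothing: 2**4 = 16 possible states.
print(entropy_bits(16))               # 4.0 bits

# Observer learns exactly two are heads: C(4, 2) = 6 states remain,
# so learning that fact lowered the entropy.
print(entropy_bits(math.comb(4, 2)))  # ~2.585 bits
```

Note the definition is relative to the observer's knowledge, which is exactly the observer-dependence Carroll talks about.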

replies(1): >>43689758 #
frank20022 ◴[] No.43689758[source]
By that definition, the entropy of a game of chess decreases with time, because as the game goes on there are fewer possible legal states. Did I get that right?
replies(2): >>43690119 #>>43694685 #
1. dist-epoch ◴[] No.43690119{3}[source]
It's about subjective knowledge, not objective knowledge.

So entropy is not related to the number of remaining legal states.

If I know the seed of a PRNG, the entropy of the numbers it generates is zero for me. If I don't know the seed, it has very high entropy.

https://www.quantamagazine.org/what-is-entropy-a-measure-of-...
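The PRNG point can be demonstrated in a few lines (my own sketch of the idea, not from the linked article):

```python
import random

def stream(seed: int, n: int = 5) -> list[int]:
    # A pseudorandom byte stream, fully determined by its seed.
    rng = random.Random(seed)
    return [rng.randrange(256) for _ in range(n)]

# Knowing the seed, only one output sequence is consistent with my
# knowledge: log(1) = 0, so the stream has zero entropy *for me*.
assert stream(42) == stream(42)

# Without the seed, billions of internal states are consistent with
# what I know, so the same stream has high entropy from my viewpoint.
```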

replies(1): >>43692764 #
2. HelloNurse ◴[] No.43692764[source]
Low entropy in chess means future moves can be predicted effectively, which is far less useful than dedicating the equivalent effort to choosing a good next move for ourselves.