
What Is Entropy?

(jasonfantl.com)
287 points by jfantl | 1 comment
TexanFeller:
I don’t see Sean Carroll’s musings mentioned yet, so repeating my previous comment:

Entropy got a lot more exciting to me after hearing Sean Carroll talk about it. He has a foundational/philosophical bent and likes to point out that there are competing definitions of entropy, set on different philosophical foundations, one of them seemingly observer dependent:

- https://youtu.be/x9COqqqsFtc?si=cQkfV5IpLC039Cl5
- https://youtu.be/XJ14ZO-e9NY?si=xi8idD5JmQbT5zxN

Leonard Susskind has lots of great talks and books about quantum information and about calculating the entropy of black holes, which led to a lot of wild new hypotheses.

Stephen Wolfram gave a long talk about the history of the concept of entropy which was pretty good: https://www.youtube.com/live/ocOHxPs1LQ0?si=zvQNsj_FEGbTX2R3

gsf_emergency:
By Jeeves, it's rentropy!!

Sean and Stephen are absolutely thoughtful popularizers, but complexity, not entropy, is what they are truly interested in talking about.

Although it doesn't make complexity any less scary, here's something Sean has been working on for more than a decade. The paper seems more accessible to the layman than he thinks:

https://arxiv.org/abs/1405.6903

https://scottaaronson.blog/?p=762
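
For anyone who wants to poke at the idea: the paper's "apparent complexity" is roughly the compressed size of a coarse-grained snapshot of the system. Here's a toy 1-D sketch of that recipe. It's my own simplification, not the authors' 2-D coffee automaton, and the block size and five gray levels are arbitrary choices of mine:

    import random, zlib

    def coarse_grain(cells, block=50):
        # Quantize each block's mean to five gray levels (0-4): the macrostate.
        return bytes(round(4 * sum(cells[i:i+block]) / block)
                     for i in range(0, len(cells), block))

    def apparent_complexity(cells):
        # Compressed size of the coarse-grained state -- roughly the paper's recipe.
        return len(zlib.compress(coarse_grain(cells), 9))

    N = 10_000
    separated = [1] * (N // 2) + [0] * (N // 2)          # cream sitting on coffee
    partly    = [1 if random.random() > i / N else 0     # mixing underway: a broad gradient
                 for i in range(N)]
    mixed     = [random.randint(0, 1) for _ in range(N)] # equilibrium

    # Entropy rises monotonically left to right, but apparent complexity
    # should peak for the partially mixed state:
    print([apparent_complexity(s) for s in (separated, partly, mixed)])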

[When practitioners say "entropy", they mean RELATIVE ENTROPY, which is another can of worms. Rentropy is the one that is observer dependent: "That's Relative as in Relativity". Entropy by itself is simple; blame von Neumann for making it live rent-free.]

https://en.wikipedia.org/wiki/Relative_entropy
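
To make the observer dependence concrete: D(P‖Q) scores the same observed distribution P against a reference model Q that the observer chooses, so changing Q changes the number. A minimal sketch, with toy distributions of my own:

    import math

    def relative_entropy(p, q):
        # D(P || Q) = sum_i p_i * log2(p_i / q_i), in bits.
        # Only defined when q_i > 0 wherever p_i > 0.
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.7, 0.2, 0.1]              # one observed distribution...
    q_uniform  = [1/3, 1/3, 1/3]     # ...judged against an ignorant observer's model
    q_informed = [0.1, 0.2, 0.7]     # ...and against a confidently wrong one

    print(relative_entropy(p, q_uniform))   # ~0.43 bits
    print(relative_entropy(p, q_informed))  # ~1.68 bits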

@nyrikki below hints (too softly, imho) at this:

>You can also approach the property that people often want to communicate when using the term entropy as effective measure 0 sets, null cover, martingales, Kolmogorov complexity, compressibility, set shattering, etc...
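
Of those, compressibility is the easiest to demo: a real compressor's output size is a crude but computable stand-in for Kolmogorov complexity. A quick sketch:

    import os, zlib

    def compressed_ratio(data):
        # Compressed size / original size: a computable (if crude)
        # proxy for the uncomputable Kolmogorov complexity.
        return len(zlib.compress(data, 9)) / len(data)

    low  = b"ab" * 50_000        # highly structured: a two-byte pattern, looped
    high = os.urandom(100_000)   # random: incompressible with near certainty

    print(compressed_ratio(low))   # ~0.002
    print(compressed_ratio(high))  # ~1.0 (zlib can't shorten it at all)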