What Is Entropy?

(jasonfantl.com)
287 points by jfantl | 1 comment
1. marojejian No.43695380
This is the best description of entropy and information I've read: https://arxiv.org/abs/1601.06176

Most of all, it highlights the subjective / relative foundations of these concepts.

Entropy and information only exist relative to a decision about the set of states an observer cares to distinguish.
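
That observer-dependence is easy to see numerically: the same system has different entropy depending on which states you bother to distinguish. A minimal sketch (my own illustration, not from the linked paper; the microstate labels are made up):

```python
import math
from collections import Counter

def shannon_entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical system with four equally likely microstates.
micro = {"a1": 0.25, "a2": 0.25, "b1": 0.25, "b2": 0.25}

# Observer 1 distinguishes all four microstates.
h_fine = shannon_entropy(micro.values())  # 2.0 bits

# Observer 2 only cares about the letter, lumping a1/a2 and b1/b2
# into coarse-grained macrostates "a" and "b".
coarse = Counter()
for state, p in micro.items():
    coarse[state[0]] += p
h_coarse = shannon_entropy(coarse.values())  # 1.0 bit
```

Same physical distribution, but the entropy drops from 2 bits to 1 bit once the observer stops distinguishing half the states.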

It also caused me to change my informal definition of entropy from a negative one ("disorder") to a more positive one ("the number of things I might care to know").

The Second Law now tells me that the number of interesting things I don't know about is always increasing!

This thread inspired me to post it here: https://news.ycombinator.com/item?id=43695358