
What Is Entropy?

(jasonfantl.com)
287 points by jfantl
glial:
One thing that helped me was the realization that, at least as used in the context of information theory, entropy is a property of an individual (typically the person receiving a message) and NOT purely of the system or message itself.

> entropy quantifies uncertainty

This sums it up. Uncertainty is a property of a person, not of a system or message. That uncertainty is a function of both the person's model of the system/message and their prior observations.

You and I may have different entropies about the content of the same message. If we're calculating the entropy of dice rolls (where the outcome is the 'message'), and I know the dice are loaded but you don't, my entropy will be lower than yours.
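
A minimal sketch of that dice example in Python, under assumed numbers: a hypothetical die loaded toward one face. Scoring the same roll under two different models gives two different entropies.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Your model: a fair six-sided die, every face equally likely.
fair_model = [1 / 6] * 6

# My model: I know the die is loaded toward 6 (hypothetical weights).
loaded_model = [0.1, 0.1, 0.1, 0.1, 0.1, 0.5]

print(shannon_entropy(fair_model))    # ~2.585 bits -- your uncertainty
print(shannon_entropy(loaded_model))  # ~2.161 bits -- my uncertainty is lower
```

Same die, same roll, different models, different entropies.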

Geee:
It's both. The system or process has its actual entropy, and the sequence of observations we make has its own entropy. We can say "this sequence of numbers has this entropy," which is slightly different from the entropy of the process that created the numbers. For example, as we make more coin tosses, the entropy of our sequence of observations gets closer and closer to the actual entropy of the coin.
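
A small sketch of that convergence, assuming a hypothetical coin biased 70/30: the empirical entropy of the observed sequence approaches the true entropy of the coin as the number of tosses grows.

```python
import math
import random
from collections import Counter

def empirical_entropy(sequence):
    """Entropy in bits of the empirical (observed-frequency) distribution of the sequence."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical biased coin: heads with probability 0.7.
p = 0.7
true_entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # ~0.8813 bits

random.seed(0)
for n in (10, 100, 10_000):
    tosses = ['H' if random.random() < p else 'T' for _ in range(n)]
    print(f"{n:>6} tosses: empirical {empirical_entropy(tosses):.4f} vs true {true_entropy:.4f}")
```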