
What Is Entropy?

(jasonfantl.com)
287 points | jfantl | 1 comment
glial No.43685469
One thing that helped me was the realization that, at least as used in the context of information theory, entropy is a property of an individual (typically the person receiving a message) and NOT purely of the system or message itself.

> entropy quantifies uncertainty

This sums it up. Uncertainty is the property of a person and not a system/message. That uncertainty is a function of both a person's model of a system/message and their prior observations.

You and I may have different entropies about the content of the same message. If we're calculating the entropy of dice rolls (where the outcome is the 'message'), and I know the dice are loaded but you don't, my entropy will be lower than yours.
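To make this concrete, here's a quick sketch (the loaded-die weights are made up for illustration) of the Shannon entropy each person would assign:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# You believe the die is fair: all six faces equally likely.
fair = [1/6] * 6
# I know it's loaded (hypothetical weights: six comes up half the time).
loaded = [0.1] * 5 + [0.5]

print(entropy(fair))    # ≈ 2.585 bits — your uncertainty
print(entropy(loaded))  # ≈ 2.161 bits — mine is lower
```

Same die, same roll, two different entropies, because the two observers hold different distributions over the outcome.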

ninetyninenine No.43685585
Not true. The uncertainty of the dice rolls is not controlled by you. It is a property of the loaded dice itself.

Here's a better way to put it: if I roll the dice infinitely many times, the uncertainty of the outcome will become evident in the distribution of the outcomes. Whether you or another person is certain or uncertain of this does not change anything.
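A sketch of that frequentist picture (the loaded-die weights are invented for illustration): the empirical distribution of many simulated rolls approaches the die's true probabilities, regardless of what any observer believes:

```python
import random
from collections import Counter

random.seed(0)
faces = [1, 2, 3, 4, 5, 6]
weights = [0.1, 0.1, 0.1, 0.1, 0.1, 0.5]  # hypothetical loaded die

# Many rolls: the relative frequencies converge to the weights.
rolls = random.choices(faces, weights=weights, k=100_000)
counts = Counter(rolls)
empirical = {f: counts[f] / len(rolls) for f in faces}
print(empirical)  # frequencies close to the weights, e.g. six near 0.5
```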

Once you realize this, you'll start to think about the debate in probability between frequentists and Bayesians, and you'll see that entropy is just a consequence of probability. The philosophical debate in probability applies to entropy as well, because they are one and the same.

I think the word "entropy" confuses people into thinking it's some other thing when really it's just probability at work.

quietbritishjim No.43691399
You're right that it reduces to Bayesian vs. frequentist views of probability. But you seem to be taking an adamantly frequentist view yourself.

Imagine you're not interested in whether a dice is weighted (in fact, assume that it is fair in every reasonable sense), but instead you want to know the outcome of a specific roll. What if that roll has already happened, but you haven't seen it? I've cheekily covered up the dice with my hand straight after I rolled it. It's no longer random at all, from at least some philosophical points of view, because its outcome is now 100% determined. If you're only concerned with "the property of the dice itself", are you now only concerned with the property of the roll itself? It's done and dusted. So the entropy of that "random variable" (which only has one outcome, of probability 1) is 0.

This is actually a valid philosophical point of view. But people who act as though the outcome is still random, and allow themselves to use probability theory as if it hadn't been rolled yet, are going to win a lot more games of chance than those who refuse to.

Maybe this all seems like a straw man. Have I argued against anything you actually said in your post? Yes I have: your core disagreement with OP's statement "entropy is a property of an individual". You see, when I covered up the dice with my hand, I did see it. So if you take the Bayesian view of probability and allow yourself to consider that dice roll probabilistically, then you and I really do have different views about the probability distribution of that dice roll and therefore the entropy. If I tell a third person, secretly and honestly, that the dice roll is even then they have yet another view of the entropy of the same dice roll! All at the same time and all perfectly valid.