
What Is Entropy?

(jasonfantl.com)
287 points jfantl | 3 comments
quietbritishjim No.43691377
I like the axiomatic definition of entropy. Here's the introduction from Pattern Recognition and Machine Learning by C. Bishop (2006):

> The amount of information can be viewed as the ‘degree of surprise’ on learning the value of x. If we are told that a highly improbable event has just occurred, we will have received more information than if we were told that some very likely event has just occurred, and if we knew that the event was certain to happen we would receive no information. Our measure of information content will therefore depend on the probability distribution p(x), and we therefore look for a quantity h(x) that is a monotonic function of the probability p(x) and that expresses the information content. The form of h(·) can be found by noting that if we have two events x and y that are unrelated, then the information gain from observing both of them should be the sum of the information gained from each of them separately, so that h(x, y) = h(x) + h(y). Two unrelated events will be statistically independent and so p(x, y) = p(x)p(y). From these two relationships, it is easily shown that h(x) must be given by the logarithm of p(x) and so we have h(x) = − log2 p(x).
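To spell out the step the quote glosses over (a sketch, not Bishop's exact argument): write h(x) = f(p(x)) for some function f of the probability alone. Independence plus additivity then require f(pq) = f(p) + f(q) for all p, q in (0, 1], and under the monotonicity assumption the only solutions of that functional equation are f(p) = −K log p for some constant K > 0. Choosing the base-2 logarithm (i.e. measuring in bits) fixes K and gives h(x) = − log2 p(x).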

This is the definition of information for a single probabilistic event. The definition of entropy of a random variable follows from this by just taking the expectation.
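To make "expected surprise" concrete, here is a minimal sketch in Python (the names surprise and entropy are mine, not from the article):

    import math

    def surprise(p):
        # Information content, in bits, of an event with probability p.
        return -math.log2(p)

    def entropy(dist):
        # Shannon entropy: the expected surprise over a discrete distribution.
        return sum(p * surprise(p) for p in dist if p > 0)

    # A fair coin carries 1 bit per flip; a heavily biased coin carries far less.
    print(entropy([0.5, 0.5]))    # 1.0
    print(entropy([0.99, 0.01]))  # ~0.08

The fair coin maximizes entropy because both outcomes are equally surprising; skewing the distribution makes the typical outcome unsurprising, so the expected information content drops.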

replies(3): >>43691716 #>>43693308 #>>43697126 #
1. tshaddox No.43693308
According to my perhaps naive interpretation of that, the "degree of surprise" would depend on at least three things:

1. the laws of nature (i.e. how accurately the laws of physics permit the state of a system to be measured, and how fully future states are determined by current states)

2. one's present understanding of the laws of nature

3. one's ability to measure the state of a system accurately and compute the predictions in practice

It strikes me as odd to include 2 and 3 in a definition of "entropy."

replies(1): >>43694595 #
2. tmalsburg2 No.43694595
OP is talking about information entropy. Nature isn't relevant there.
replies(1): >>43695968 #
3. tshaddox No.43695968
Surely the laws of nature are still relevant, since they (presumably) set limits on how closely a system can be measured and on which physical interactions can be simulated by computers (and how accurately).