
What Is Entropy?

(jasonfantl.com)
287 points by jfantl | 1 comment
1. dswilkerson No.43688842
Entropy is expected information. That is, given a random variable, if you compute the expected value (the sum of the values weighted by their probabilities) of the information of each outcome (the log base 2 of the reciprocal of the outcome's probability), you get the formula for entropy: H(X) = sum over x of p(x) * log2(1/p(x)).
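A minimal Python sketch of that definition (the function name and the example distributions are illustrative, not from the comment):

    import math

    def entropy(probs):
        # Shannon entropy in bits: the expected information of a distribution.
        # An outcome with probability p carries log2(1/p) bits of information;
        # entropy is the probability-weighted average of that quantity.
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(entropy([0.9, 0.1]))   # biased coin: ~0.47 bits
    print(entropy([0.25] * 4))   # fair 4-sided die: 2.0 bits

Note how the fair coin maximizes entropy for two outcomes: when every outcome is equally likely, each observation carries the most information on average.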

Here it is explained at length: "An Intuitive Explanation of the Information Entropy of a Random Variable, Or: How to Play Twenty Questions": http://danielwilkerson.com/entropy.html