
What Is Entropy?

(jasonfantl.com)
287 points by jfantl | 1 comment
nihakue:
I'm not in any way qualified to have a take here, but I have one anyway:

My understanding is that entropy is a way of quantifying how many different ways a thing could 'actually be' and yet still 'appear to be' how it is. So it is largely a result of an observer's limited ability to perceive / interrogate the 'true' nature of the system in question.

So for example you could observe that a single coin flip is heads, and entropy will help you quantify how many different ways that could have come to pass: e.g. is it a fair coin, a weighted coin, a coin with two heads faces, etc. All these possibilities increase the entropy of the system. An arrangement _not_ counted toward the system's entropy is one where the coin has no heads face and only ever comes up tails.
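To make that concrete, here's a rough sketch (my own toy numbers, not from the article): treat each coin type above as a hypothesis, observe one heads, and compute the Shannon entropy of the posterior over hypotheses. The uniform prior and the 0.7 weighting are made up for illustration.

    import math

    # P(heads | coin type); the prior is a made-up uniform 1/4 each
    likelihood = {"fair": 0.5, "weighted": 0.7, "two-headed": 1.0, "tails-only": 0.0}
    prior = {h: 0.25 for h in likelihood}

    # Bayes' rule: posterior over coin types after observing heads
    unnorm = {h: prior[h] * p for h, p in likelihood.items()}
    total = sum(unnorm.values())
    posterior = {h: w / total for h, w in unnorm.items()}

    # Shannon entropy of the posterior: remaining uncertainty (in bits)
    # about how the coin could 'actually be' given how it 'appears to be'
    H = -sum(p * math.log2(p) for p in posterior.values() if p > 0)
    print(posterior)  # the tails-only coin gets probability 0
    print(H)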

Relatedly, my intuition about the observation that entropy tends to increase is that it's purely a result of more likely things happening more often on average.
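As a toy illustration of that intuition (mine, not the article's): the Ehrenfest urn model. Each step moves one uniformly random particle to the other box, so no single step is biased, yet the count drifts toward half-and-half simply because vastly more microstates look that way.

    import math, random

    N = 1000
    left = N  # low-entropy start: every particle in the left box
    for step in range(1, 10 * N + 1):
        if random.randrange(N) < left:  # picked a particle currently on the left
            left -= 1
        else:
            left += 1
        if step % (2 * N) == 0:
            # Boltzmann-style entropy: log of the number of microstates
            # consistent with the macrostate "this many on the left"
            S = math.log(math.comb(N, left))
            print(f"step {step:6d}  left {left:4d}  entropy {S:7.1f}")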

Would be delighted if anyone wanted to correct either of these intuitions.

fsckboy:
>purely a result of more likely things happening more often on average

according to your wording, no. if you have a perfect six-sided die (or perfect two-sided coin), no single outcome is more likely than any other at any point in time... yet something approximating entropy still emerges after many repeated trials. what's expected to happen is the average thing, even though the average is never the most likely single outcome (for a die, the expected value of 3.5 isn't even a face you can roll).
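a quick sketch of what i mean (my own toy numbers): every face of a fair die is equally likely on each roll, yet the mean of many rolls piles up near 3.5, a value no single roll can produce.

    import random
    from collections import Counter

    trials, rolls = 10_000, 100
    means = Counter(
        round(sum(random.randint(1, 6) for _ in range(rolls)) / rolls, 1)
        for _ in range(trials)
    )
    # crude text histogram: the means concentrate around 3.5
    for m in sorted(means):
        print(f"{m:4.1f} {'#' * (means[m] // 50)}")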

you want to look at how repeated re-convolution of a function with itself always converges on the same gaussian shape, no matter what the shape of the starting function is (as long as it's not some pathological case, such as an impulse function... but even then, consider the convolution of the impulse function with a gaussian).
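a sketch of that convolution claim (assumes numpy): start from a boxy, very non-gaussian function, repeatedly convolve it with itself, and measure how far the normalized result is from a gaussian with the same mean and variance. the deviation shrinks fast (this is the central limit theorem in disguise).

    import numpy as np

    f = np.ones(10)  # flat box: about as non-gaussian as it gets
    f /= f.sum()
    for i in range(5):
        f = np.convolve(f, f)  # re-convolve the function with itself
        f /= f.sum()
        # compare against a gaussian with matching mean and variance
        x = np.arange(len(f))
        mu = (x * f).sum()
        var = ((x - mu) ** 2 * f).sum()
        g = np.exp(-((x - mu) ** 2) / (2 * var))
        g /= g.sum()
        print(f"iteration {i}: max deviation from gaussian = {np.abs(f - g).max():.2e}")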