
What Is Entropy?

(jasonfantl.com)
287 points | jfantl | 1 comment
nihakue No.43686233
I'm not in any way qualified to have a take here, but I have one anyway:

My understanding is that entropy is a way of quantifying how many different ways a thing could 'actually be' and yet still 'appear to be' how it is. So it is largely a result of an observer's limited ability to perceive / interrogate the 'true' nature of the system in question.

So for example you could observe that a single coin flip is heads, and entropy will help you quantify how many different ways that could have come to pass: e.g. is it a fair coin, a weighted coin, a coin with two head faces, etc.? All these possibilities increase the entropy of the system. An arrangement _not_ counted towards the system's entropy is one that is inconsistent with the observation, e.g. a coin with no heads face that only ever comes up tails.
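This "count the ways it could be" idea can be made concrete with Shannon entropy (a sketch I'm adding, not something from the original comment; the four hypothetical coin hypotheses and their probabilities are assumptions for illustration):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution (terms with p == 0 contribute 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally plausible "ways the coin could actually be" given that
# we saw heads: fair, weighted, two-headed, slightly bent (hypothetical).
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits

# Ruling out possibilities (e.g. the tails-only coin is already excluded,
# and now suppose we also rule out two of the others) lowers the entropy:
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit
```

More distinct possibilities consistent with what you observed means higher entropy; learning something that eliminates possibilities lowers it.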

Relatedly, my intuition about the observation that entropy tends to increase is that it's purely a result of more likely outcomes happening more often on average.
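That intuition can be checked by counting microstates (a sketch of my own, not from the comment): for 100 coin flips, the "about half heads" macrostate corresponds to astronomically more exact sequences than the "all heads" macrostate, so a system wandering among microstates at random overwhelmingly ends up in high-entropy macrostates.

```python
from math import comb

n = 100  # number of coin flips (an arbitrary choice for illustration)

# Number of microstates (exact head/tail sequences) per macrostate (head count):
print(comb(n, 50))   # ~1.0e29 sequences with exactly 50 heads
print(comb(n, 100))  # exactly 1 sequence with all 100 heads
```

The ratio between those two counts is about 10^29, which is why "entropy increases" looks like a law even though it's just statistics.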

Would be delighted if anyone wanted to correct either of these intuitions.

1. 867-5309 No.43686706
> 'actually be' and yet still 'appear to be'

esse quam videri ("to be, rather than to seem")