
What Is Entropy?

(jasonfantl.com)
287 points by jfantl
xavivives No.43689208
Over the last few months, I've been developing an unorthodox perspective on entropy [1]. It describes the phenomenon in much more detail, allowing all forms of entropy to be unified, and it defines probability through the same lens.

I define both concepts fundamentally in relation to priors and possibilities:

- Entropy is the relationship between priors and ANY possibility, relative to the entire space of possibilities.

- Probability is the relationship between priors and a SPECIFIC possibility, relative to the entire space of possibilities.

Framing entropy in terms of priors and possibilities shows why it appears differently across disciplines such as statistical mechanics and information theory. Entropy is not merely observer-dependent but prior-dependent, including priors held by no specific observer but embedded in the framework itself. This helps resolve the apparent contradiction between objective and subjective interpretations of entropy.
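
To make the prior-dependence concrete, here is a rough Python sketch (just a toy coin illustration, nothing more): two observers assign different entropies to the same coin because they bring different priors to the same possibility space {heads, tails}.

    from math import log2

    def shannon_entropy(probs):
        # H = -sum_n p_n * log2(p_n), in bits; zero-probability outcomes contribute nothing
        return -sum(p * log2(p) for p in probs if p > 0)

    # Same coin, same possibility space {heads, tails}, two different priors:
    uninformed_prior = [0.5, 0.5]   # no knowledge about the coin
    informed_prior = [0.9, 0.1]     # believes the coin is heavily biased

    print(shannon_entropy(uninformed_prior))  # 1.0 bit: maximal uncertainty
    print(shannon_entropy(informed_prior))    # ~0.47 bits: the prior carries information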

It also defines possibilities as constraints imposed on an otherwise unrestricted reality. This framing unifies how possibility spaces are defined across frameworks.

[1]: https://buttondown.com/themeaninggap/archive/a-unified-persp...

replies(1): >>43689764 #
3abiton No.43689764
I am curious: why does the word "entropy" encompass so many concepts? Wouldn't it have made more sense to just give each concept a different word?
replies(2): >>43690195 #>>43690242 #
1. prof-dr-ir No.43690242
Whenever there is an entropy, it can be defined as

S = - sum_n p_n log( p_n )

where p_n is a probability distribution: for n = 1...W, p_n >= 0 and sum_n p_n = 1. This is always the underlying equation; the only thing that changes is the probability distribution.
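
A quick Python sketch of that point (a toy illustration, not tied to any particular discipline): the same formula, fed different probability distributions, yields the familiar special cases.

    from math import log

    def entropy(probs, base=2):
        # S = -sum_n p_n * log(p_n); zero-probability terms are skipped
        return -sum(p * log(p, base) for p in probs if p > 0)

    # Uniform distribution over W = 8 equally likely microstates: reduces to
    # log(W), the Boltzmann form (statistical mechanics multiplies by k_B and
    # uses the natural log).
    print(entropy([1 / 8] * 8))                # 3.0

    # Non-uniform distribution over message symbols: the Shannon form from
    # information theory. Same formula, different distribution.
    print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75

In statistical mechanics the distribution ranges over microstates, in information theory over messages; only the distribution changes.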