
What Is Entropy?

(jasonfantl.com)
287 points by jfantl | 4 comments
1. xavivives No.43689208
Over the last few months, I've been developing an unorthodox perspective on entropy [1]. It defines the phenomenon in much more detail, allowing for a unification of all forms of entropy. It also defines probability through the same lens.

I define both concepts fundamentally in relation to priors and possibilities:

- Entropy is the relationship between priors and ANY possibility, relative to the entire space of possibilities.

- Probability is the relationship between priors and a SPECIFIC possibility, relative to the entire space of possibilities.

The framing of priors and possibilities shows why entropy appears differently across disciplines like statistical mechanics and information theory. Entropy is not merely observer-dependent but prior-dependent, including priors not held by any specific observer but embedded in the framework itself. This helps resolve the apparent contradiction between objective and subjective interpretations of entropy.
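To make the prior-dependence concrete, here is a minimal Python sketch (my own illustration, not taken from the linked article): the same physical die has a different entropy depending on which possibility space the observer's priors define.

    import math

    def shannon_entropy(p):
        # Shannon entropy in bits; zero-probability terms contribute nothing.
        return sum(-pi * math.log2(pi) for pi in p if pi > 0)

    # Prior A: all six faces are distinguishable and equally likely.
    prior_a = [1/6] * 6

    # Prior B: the observer can only distinguish "low" (1-3) from "high" (4-6),
    # so their possibility space collapses to two outcomes.
    prior_b = [1/2, 1/2]

    print(shannon_entropy(prior_a))  # ~2.585 bits
    print(shannon_entropy(prior_b))  # 1.0 bit

Same die, different priors, different entropy; neither number is "wrong", they answer questions posed over different possibility spaces.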

It also defines possibilities as constraints imposed on an otherwise unrestricted reality. This framing unifies how possibility spaces are defined across frameworks.

[1]: https://buttondown.com/themeaninggap/archive/a-unified-persp...

replies(1): >>43689764
2. 3abiton No.43689764
I'm curious: why does the word "entropy" encompass so many concepts? Wouldn't it have made sense to give each concept its own word?
replies(2): >>43690195 >>43690242
3. namaria No.43690195
Yes. There are different concepts called 'entropy', sometimes merely because their mathematical formulations look very similar.

It means different things in different contexts and an abstract discussion of the term is essentially meaningless.

Even discussions within the context of the second law of thermodynamics are often misleading, because people ignore much of the context in which the statistical framing of the law was formulated. Formal systems and all that... These are not general descriptions of how nature works, but definitions within formal systems that allow for some calculations.

I find Noether's study of symmetries much more illuminating in general than trying to generalize conservation laws as observed within certain formal models.

4. prof-dr-ir No.43690242
Whenever there is an entropy, it can be defined as

S = - sum_n p_n log( p_n )

where the p_n form a probability distribution: p_n >= 0 for n = 1...W, and sum_n p_n = 1. This is always the underlying equation; the only thing that changes is the probability distribution.
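A short Python sketch of that point (the distributions below are just illustrative choices): once a distribution is plugged in, the familiar special cases fall out, e.g. the uniform distribution over W states gives S = log W, Boltzmann's formula with k = 1.

    import math

    def entropy(p, base=math.e):
        # S = - sum_n p_n log(p_n); zero-probability terms contribute nothing.
        return sum(-pi * math.log(pi, base) for pi in p if pi > 0)

    W = 8
    print(entropy([1/W] * W))           # uniform: log(8) ~ 2.079, i.e. S = log W
    print(entropy([0.9, 0.1], base=2))  # biased coin, in bits: ~0.469
    print(entropy([1.0]))               # a certain outcome: 0.0

Thermodynamic entropy uses the distribution over microstates, Shannon entropy the distribution over messages; the formula itself never changes.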