I define both concepts fundamentally in relation to priors and possibilities:
- Entropy is the relationship between priors and ANY possibility, relative to the entire space of possibilities.
- Probability is the relationship between priors and a SPECIFIC possibility, relative to the entire space of possibilities.
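To make the distinction concrete, here is a minimal Python sketch (my own illustration, not from the original; `prior`, `probability`, and `entropy` are names I chose): probability relates the prior to one specific possibility, while entropy aggregates the prior over every possibility in the space.

```python
import math

# A prior over a possibility space: each possibility maps to a weight.
prior = {"heads": 0.5, "tails": 0.5}

def probability(prior, possibility):
    # Relates the prior to ONE specific possibility,
    # normalized against the entire space.
    total = sum(prior.values())
    return prior[possibility] / total

def entropy(prior):
    # Relates the prior to EVERY possibility at once:
    # the expected surprisal over the whole space, in bits.
    total = sum(prior.values())
    return -sum((w / total) * math.log2(w / total)
                for w in prior.values() if w > 0)

print(probability(prior, "heads"))  # 0.5   -> one possibility
print(entropy(prior))               # 1.0 bit -> the whole space
```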
The framing of priors and possibilities shows why entropy takes different forms across disciplines such as statistical mechanics and information theory. Entropy is not merely observer-dependent but prior-dependent, including priors held by no specific observer but embedded in the framework itself. This helps resolve the apparent contradiction between objective and subjective interpretations of entropy.
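As a hedged illustration of prior-dependence (the dice priors are hypothetical, and `entropy()` is reused from the sketch above): the same six-element possibility space yields different entropies under different priors.

```python
# Same possibility space, two different priors: the entropy differs
# even though the set of possibilities is unchanged.
uniform = {f"face{i}": 1.0 for i in range(1, 7)}
loaded  = {f"face{i}": (5.0 if i == 6 else 0.2) for i in range(1, 7)}

print(entropy(uniform))  # log2(6) ≈ 2.585 bits
print(entropy(loaded))   # ≈ 1.04 bits: the prior concentrates the space
```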
This framing also defines possibilities as constraints imposed on an otherwise unrestricted reality, which unifies how possibility spaces are defined across frameworks.
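One way to read "possibilities as constraints" in code (again my own sketch; `constrain` is a hypothetical helper reusing `uniform` and `entropy()` from above): a possibility space is carved out of a larger, unrestricted set by a constraint, and the prior is renormalized over what survives.

```python
# A constraint carves a possibility space out of a larger set;
# entropy() renormalizes the prior over the surviving possibilities.
def constrain(prior, constraint):
    return {p: w for p, w in prior.items() if constraint(p)}

even_faces = constrain(uniform, lambda p: int(p[-1]) % 2 == 0)
print(entropy(even_faces))  # log2(3) ≈ 1.585 bits: a smaller space
```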