This is the best description of entropy and information I've read:
https://arxiv.org/abs/1601.06176
Most of all, it highlights the subjective / relative foundations of these concepts.
Entropy and Information only exist relative to a decision about the set of states an observer cares to distinguish.
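To make that concrete, here's a toy Python sketch of my own (not from the paper): the same fair die has different entropies depending on which states the observer bothers to distinguish.

    import math
    from collections import defaultdict

    def entropy(probs):
        # Shannon entropy in bits of a probability distribution.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Microstates: the six faces of a fair die, each with probability 1/6.
    microstates = {face: 1 / 6 for face in range(1, 7)}

    # Observer A distinguishes every face.
    print(entropy(microstates.values()))  # log2(6) ~= 2.585 bits

    # Observer B only distinguishes odd from even.
    parity = defaultdict(float)
    for face, p in microstates.items():
        parity[face % 2] += p
    print(entropy(parity.values()))  # exactly 1 bit

Same die, same physics, different entropy, purely because the two observers chose different sets of states to resolve.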
It also caused me to change my informal definition of entropy from a negative one ("disorder") to a more positive one ("the number of things I might care to know").
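That positive reading is backed by the standard formula: for N equally likely distinguishable states, entropy is literally the log of their count,

    H = -\sum_i p_i \log p_i, \qquad p_i = \frac{1}{N} \;\Rightarrow\; H = \log N

so exp(H) can be read as the effective number of states the observer could still learn about.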
The Second Law now tells me that the number of interesting things I don't know about is always increasing!
This thread inspired me to post it here: https://news.ycombinator.com/item?id=43695358