"[O]ur knowledge of equilibrium states is the most certain knowledge we can obtain." #readingToday
In arriving at his definition of intrinsic information, Shannon made an important connection between the general notion of information and the idea of order and entropy in the statistical mechanics of Boltzmann. Apart from a dimensional scale factor of the Boltzmann constant, the Shannon intrinsic information is numerically equal to Boltzmann's H function, which Boltzmann showed was equivalent to the thermodynamic entropy (a measure of the energy that becomes unavailable to do work because it is spread around the system in a state of high uncertainty). Boltzmann effectively showed, eighty years before Shannon's theory, that the information needed to describe the state of a physical system is always maximal at equilibrium. An equilibrium is stable because it cannot contain any more information without some being injected from a completely new external source. _For that reason, our knowledge of equilibrium states is the most certain knowledge we can obtain._
_In Search of Certainty_, Mark Burgess, 2013
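
The book states the relationship only in words, so here is a minimal sketch of my own (not from the book) to make it concrete: the Gibbs/Boltzmann entropy S = -k_B · Σ p ln p is Shannon's information measure up to the factor k_B, and the uniform distribution over microstates, the equilibrium case, maximizes it. The function names and example distributions below are my illustration, not Burgess's.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K (CODATA value)

def shannon_entropy(probs, base=2):
    """Shannon's intrinsic information H = -sum(p * log p),
    in units determined by `base` (base 2 gives bits)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def thermodynamic_entropy(probs):
    """Gibbs/Boltzmann entropy S = -k_B * sum(p * ln p), in J/K:
    the same quantity as Shannon's H, rescaled by k_B."""
    return K_B * shannon_entropy(probs, base=math.e)

# Four microstates: a sharply peaked (ordered) distribution versus the
# uniform (equilibrium) one. The uniform case maximizes the entropy,
# i.e. pinning down the system's state takes the most information.
ordered = [0.97, 0.01, 0.01, 0.01]
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(ordered))        # ~0.24 bits
print(shannon_entropy(uniform))        # 2.0 bits, the maximum for 4 states
print(thermodynamic_entropy(uniform))  # k_B * ln(4) ~ 1.91e-23 J/K
```

Running it shows the point of the quote: no rearrangement of the four probabilities can push the entropy above the uniform value, so the equilibrium distribution is the one about which nothing further can be learned from inside the system.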