Tag: Information entropy

Classical Uncertainty Relationship

TL;DR – Classical Hamiltonian mechanics already includes an uncertainty relationship analogous to Heisenberg’s uncertainty principle in quantum mechanics. In previous posts we looked at information entropy, the number of yes/no questions you need to identify an element within a distribution, and at the fact that Hamiltonian dynamics conserves it. Here we will show …

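The post itself is truncated here, but the kind of bound it alludes to can be illustrated with a standard fact: among all phase-space distributions with given spreads $\sigma_x$, $\sigma_p$, the product Gaussian maximizes differential entropy, with $H = \ln(2\pi e\,\sigma_x \sigma_p)$; hence $\sigma_x \sigma_p \ge e^{H}/(2\pi e)$, so a conserved entropy puts a floor under the spread product. A minimal numerical check of the Gaussian formula (the values of `sx` and `sp` are illustrative, not from the post):

```python
import math

# Sketch (not from the original post): numerically integrate
# H = -∫∫ rho ln(rho) dx dp for a product Gaussian in phase space
# and compare to the closed form ln(2*pi*e*sigma_x*sigma_p).

def gaussian(u, sigma):
    """1D normal density with mean 0 and standard deviation sigma."""
    return math.exp(-u * u / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def entropy_2d(sigma_x, sigma_p, n=400, span=8.0):
    """Differential entropy of rho(x,p) = g(x)g(p) via a midpoint Riemann sum
    over the box [-span*sigma, span*sigma] in each variable."""
    hx = 2 * span * sigma_x / n
    hp = 2 * span * sigma_p / n
    H = 0.0
    for i in range(n):
        x = -span * sigma_x + (i + 0.5) * hx
        gx = gaussian(x, sigma_x)
        for j in range(n):
            p = -span * sigma_p + (j + 0.5) * hp
            r = gx * gaussian(p, sigma_p)
            if r > 0:
                H -= r * math.log(r) * hx * hp
    return H

sx, sp = 1.3, 0.7            # illustrative spreads
H_num = entropy_2d(sx, sp)
H_exact = math.log(2 * math.pi * math.e * sx * sp)
print(H_num, H_exact)        # the two values agree closely
```

Because the Gaussian and its derivatives vanish rapidly at the box edges, the midpoint rule here is accurate to far better than the printed digits.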

Hamiltonian mechanics is conservation of information entropy

TL;DR – Hamiltonian systems are those that conserve information entropy during time evolution. The idea is the following: suppose you have a distribution over position and momentum $\rho(x, p)$. Suppose you evolve it in time under Hamiltonian evolution and get a final distribution $\hat{\rho}(x, p)$. That is: you take each little area in phase space …

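The excerpt's picture of "each little area in phase space" can be sketched in a toy model. Since Hamiltonian flow preserves phase-space volume (Liouville's theorem), on a grid of equal cells it acts like a permutation of cells, and a permutation leaves the information entropy of the distribution unchanged. Everything below (the 64-cell grid, the random permutation standing in for one time step) is an illustrative assumption, not the post's own construction:

```python
import math
import random

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

rng = random.Random(0)

# A distribution over 64 equal phase-space cells.
weights = [rng.random() for _ in range(64)]
total = sum(weights)
rho = [w / total for w in weights]

# Volume-preserving evolution on equal cells = a permutation of the cells
# (a random shuffle stands in for one Hamiltonian time step).
perm = list(range(64))
rng.shuffle(perm)
rho_evolved = [rho[perm[i]] for i in range(64)]

print(entropy_bits(rho), entropy_bits(rho_evolved))  # equal up to float rounding
```

A non-volume-preserving map, by contrast, would merge or stretch cells and change the entropy, which is exactly what singles out Hamiltonian evolution in this picture.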

What is information entropy?

TL;DR – Information entropy tells you the average number of bits (i.e. answers to yes/no questions) you need to identify an element drawn from a distribution. Since the 1950s, the connection between information theory and thermodynamics has become increasingly clear. What I believe is still largely unnoticed is that there is a deep link …

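The "yes/no questions" reading of entropy can be made concrete with two small examples (the specific distributions are illustrative): a uniform distribution over 8 outcomes takes exactly 3 questions, halving the candidate set each time, while a skewed distribution takes fewer on average because likely outcomes can be identified with shorter question sequences.

```python
import math

def entropy_bits(p):
    """Average number of yes/no answers needed to identify an element
    drawn from the discrete distribution p (Shannon entropy in bits)."""
    return -sum(q * math.log2(q) for q in p if q > 0)

uniform8 = [1 / 8] * 8                 # 8 equally likely outcomes
skewed = [1 / 2, 1 / 4, 1 / 8, 1 / 8]  # one outcome dominates

print(entropy_bits(uniform8))  # 3.0
print(entropy_bits(skewed))    # 1.75
```

For the skewed case the optimal questions mirror the probabilities: "is it the first outcome?" resolves half the draws with a single answer, giving the 1.75-bit average.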