Title: Entropy and limit theorems

This thesis uses techniques based on Shannon entropy to prove probabilistic limit theorems. Chapter 1 defines entropy and Fisher information and reviews previous work. We reformulate the Central Limit Theorem to state that the entropy of normalized sums of independent, identically distributed real-valued random variables converges to its maximum, attained uniquely by the normal distribution. This mode of convergence, known as convergence in relative entropy, was proved by Barron. Chapter 2 extends Barron's result to non-identically distributed random variables, giving an entropy-theoretic analogue of the Lindeberg-Feller Theorem under similar conditions. In Chapter 3 we prove the corresponding result for random vectors. In Chapter 4 we discuss convergence to non-Gaussian stable distributions; whilst not giving a complete solution to this problem, we provide some suggestions as to how the entropy-theoretic method may apply in this case.

The next two chapters concern random variables taking values in compact groups. Although the situation differs, in that the limit distribution is uniform, the uniform distribution is again a maximum entropy distribution, so some of the same ideas apply.

In Chapter 7 we discuss conditional limit theorems, which relate to the problem of 'equivalence of ensembles' in Statistical Physics. We consider random variables equidistributed on the surfaces of certain types of manifolds, and show that their projections are close to Gibbs densities. Once again the proof involves convergence in relative entropy, establishing continuity of the projection map with respect to the Kullback-Leibler topology. The bound is sharp and provides a necessary and sufficient condition for convergence in relative entropy. If we regard the manifold as a surface of constant energy for a particular Hamiltonian, this implies that the microcanonical and canonical ensembles are close to one another.
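The entropy formulation of the Central Limit Theorem described above can be illustrated numerically. The following sketch (not part of the thesis; the histogram-based entropy estimator and all parameter choices are assumptions for illustration only) estimates the differential entropy of standardized sums of i.i.d. uniform random variables and shows it approaching the Gaussian maximum (1/2)log(2*pi*e):

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy_hist(samples, bins=200):
    # Crude histogram estimate of differential entropy in nats:
    # H ~= -sum_i p_i * log(f_i), with p_i the bin mass and f_i the bin density.
    density, edges = np.histogram(samples, bins=bins, density=True)
    p = density * np.diff(edges)       # probability mass per bin
    nz = p > 0                         # skip empty bins (0 * log 0 = 0)
    return -np.sum(p[nz] * np.log(density[nz]))

# Entropy of N(0,1), the maximum over all unit-variance densities.
gauss_max = 0.5 * np.log(2 * np.pi * np.e)   # approx 1.4189 nats

# Uniform on [-sqrt(3), sqrt(3)] has mean 0 and variance 1.
n_samples = 200_000
x = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(32, n_samples))

# Standardized sums S_n / sqrt(n) keep variance 1, so their entropy
# should increase toward gauss_max as n grows.
for n in (1, 2, 8, 32):
    s = x[:n].sum(axis=0) / np.sqrt(n)
    print(f"n = {n:2d}  entropy estimate = {entropy_hist(s):.4f}"
          f"  (Gaussian max = {gauss_max:.4f})")
```

The uniform summands start with entropy log(2*sqrt(3)), roughly 1.24 nats, and the estimates climb toward 1.4189 as n increases, never exceeding the Gaussian maximum among unit-variance distributions.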
