Title:

Information theory and statistical mechanics

In this dissertation we argue the case for Jaynes' Information Theory approach to the foundations of Statistical Mechanics. In Chapter 1 a brief review of the relevant aspects of classical and quantum mechanics is given. In particular, the ways in which probability theory is applied to mechanics are discussed, with the emphasis on statistical indices, i.e., probability distribution functions (classical case) and density matrices (quantum case). Also, Jaynes' 'Maximum Entropy Principle' is introduced; this principle advocates the use, in Statistical Mechanics, of the statistical index with the maximum microscopic entropy. The microscopic expressions for the classical and quantum entropies are derived, in Chapter 2, from a few basic axioms. The method of proof is identical in both cases, and two new axiomatic characterisation theorems (one classical, one quantum) for the microscopic entropies are proved. Chapter 3 is mostly an exposition of the literature on Jaynes' approach to Statistical Mechanics, or 'Information Thermodynamics' as it is also known. A few of the mathematical gaps are also either filled or examined from a different point of view. Chapter 3 concludes with a discussion of the Second Law of Thermodynamics. In Chapter 4 Jaynes' approach is generalised to nonequilibrium Statistical Mechanics. This new theory is based on the idea of expanding the statistical mean values evaluated using the solution of the Liouville equation in terms of those mean values evaluated using the maximum-entropy statistical index, via a 'mean value perturbation theory' developed by Jaynes. The Laws of Irreversible Thermodynamics are then derived from a microscopic basis. Chapter 5 gives a critical analysis of the Information Theory approach and indicates possible directions for future work on this subject.
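The Maximum Entropy Principle mentioned above can be illustrated with a minimal numerical sketch: over a discrete set of states, the distribution of maximum entropy subject to a fixed mean energy is the Gibbs form p_i ∝ exp(-βE_i), with β chosen so the constraint holds. The energies and target mean below are illustrative choices, not taken from the dissertation.

```python
import math

def maxent_distribution(energies, target_mean, tol=1e-10):
    """Maximum-entropy distribution over discrete states with the
    given energies, constrained to a fixed mean energy.
    The solution has the Gibbs form p_i = exp(-beta*E_i) / Z,
    where beta is the Lagrange multiplier for the mean constraint."""
    def mean_at(beta):
        # Mean energy under the Gibbs distribution at this beta.
        w = [math.exp(-beta * e) for e in energies]
        z = sum(w)
        return sum(wi * e for wi, e in zip(w, energies)) / z

    # mean_at is monotonically decreasing in beta, so solve the
    # constraint mean_at(beta) = target_mean by bisection.
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w], beta

# Illustrative four-level system with a prescribed mean energy.
energies = [0.0, 1.0, 2.0, 3.0]
probs, beta = maxent_distribution(energies, target_mean=1.2)
entropy = -sum(p * math.log(p) for p in probs)
```

Any other distribution satisfying the same mean-energy constraint has strictly lower entropy than the Gibbs distribution returned here, which is the content of the principle in this discrete setting.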
