Title:
|
Hamiltonian sequential Monte Carlo and normalizing constants
|
The present thesis deals with the problems of simulation from a given target distribution
and the estimation of ratios of normalizing constants, i.e. marginal likelihoods
(ML). Both problems can be considerably difficult even for the simplest real-world
statistical setups. We investigate how the combination of Hamiltonian Monte Carlo
(HMC) and Sequential Monte Carlo (SMC) can be used to sample effectively from a
multi-modal target distribution and, at the same time, to estimate ratios of normalizing
constants. We call this novel combination the Hamiltonian SMC (HSMC) algorithm
and show that it achieves improvements over existing Monte Carlo sampling algorithms,
especially when the target distribution is multi-modal and/or has a complicated
covariance structure. An important convergence result is proved for HSMC, as well as
an upper bound on the bias of the estimate of the ratio of MLs. Our investigation of
the continuous-time limit of the HSMC process reveals an interesting connection
between Monte Carlo simulation and physics.
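To make the combination concrete, the sketch below shows a generic tempered SMC sampler
that mutates particles with HMC moves and accumulates an estimate of the ratio of
normalizing constants from the incremental weights. It is a minimal illustration under
assumed choices (a toy bimodal target, a linear temperature ladder, hand-picked tuning
constants), not the thesis's exact HSMC algorithm.

```python
# Minimal sketch (illustrative assumptions, not the thesis's HSMC): tempered SMC
# with HMC mutation moves and an estimate of the log normalizing-constant ratio.
import numpy as np

rng = np.random.default_rng(0)
d, N, n_temps = 2, 500, 30                  # dimension, particles, temperatures
betas = np.linspace(0.0, 1.0, n_temps)      # tempering schedule beta_0 = 0, ..., beta_T = 1

def log_ref(x):                             # reference pi_0 = N(0, 4 I), normalized
    return -np.sum(x**2, axis=-1) / 8 - 0.5 * d * np.log(2 * np.pi * 4)

def log_target(x):                          # unnormalized bimodal target gamma(x)
    a = -0.5 * np.sum((x - 3.0) ** 2, axis=-1)
    b = -0.5 * np.sum((x + 3.0) ** 2, axis=-1)
    return np.logaddexp(a, b)

def log_gamma(x, beta):                     # geometric bridge gamma_beta between pi_0 and gamma
    return (1 - beta) * log_ref(x) + beta * log_target(x)

def grad_log_gamma(x, beta, eps=1e-5):      # finite-difference gradient (keeps the sketch short)
    g = np.zeros_like(x)
    for j in range(d):
        e = np.zeros(d); e[j] = eps
        g[:, j] = (log_gamma(x + e, beta) - log_gamma(x - e, beta)) / (2 * eps)
    return g

def hmc_move(x, beta, step=0.2, n_leap=10):
    """One HMC (leapfrog + Metropolis) move per particle, targeting gamma_beta."""
    p0 = rng.standard_normal(x.shape)
    xq, p = x.copy(), p0.copy()
    p += 0.5 * step * grad_log_gamma(xq, beta)
    for _ in range(n_leap - 1):
        xq += step * p
        p += step * grad_log_gamma(xq, beta)
    xq += step * p
    p += 0.5 * step * grad_log_gamma(xq, beta)
    log_acc = (log_gamma(xq, beta) - log_gamma(x, beta)
               + 0.5 * np.sum(p0**2, axis=1) - 0.5 * np.sum(p**2, axis=1))
    accept = np.log(rng.uniform(size=x.shape[0])) < log_acc
    x[accept] = xq[accept]
    return x

x = 2.0 * rng.standard_normal((N, d))       # draw the initial particles from pi_0
log_ratio = 0.0                             # running estimate of log(Z_T / Z_0)
for b_prev, b in zip(betas[:-1], betas[1:]):
    logw = log_gamma(x, b) - log_gamma(x, b_prev)              # incremental weights
    log_ratio += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
    w = np.exp(logw - logw.max()); w /= w.sum()
    x = x[rng.choice(N, size=N, p=w)]                           # multinomial resampling
    x = hmc_move(x, b)                                          # HMC mutation step
# for this toy target the true value is log(2 * (2*pi)**(d/2)) ~ 2.53
print("estimated log normalizing-constant ratio:", log_ratio)
```

The product of the averaged incremental weights is the usual SMC estimate of the ratio
of normalizing constants; the HMC mutations are what allow particles to travel between
well-separated modes.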
We also address the problem of estimating the uncertainty of the ML estimate for a
hidden Markov model (HMM). We propose a new algorithm (the Pairs algorithm) to
estimate the non-asymptotic second moment of the ML estimate for a general HMM.
We then show that there exists a linear-in-time bound on the relative variance of the
estimate of the second moment of the ML obtained using the Pairs algorithm. In practice,
this theoretical property of the relative variance translates into more reliable estimates
of the second moment of the ML estimate than those produced by the standard approach
of running independent copies of the particle filter, illustrated in the sketch below.
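For reference, the following minimal sketch shows that standard baseline: running
independent copies of a bootstrap particle filter and averaging the squared ML estimates
to approximate the second moment. The linear-Gaussian HMM and all constants are
illustrative assumptions, and the Pairs algorithm itself is not reproduced here.

```python
# Minimal sketch of the baseline: estimate E[Z_hat^2] by running independent
# bootstrap particle filters on a toy linear-Gaussian HMM (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
T, N = 50, 200                               # time steps, particles per filter
phi, sig_x, sig_y = 0.9, 1.0, 1.0            # HMM: x_t = phi x_{t-1} + N(0, sig_x^2), y_t = x_t + N(0, sig_y^2)

# simulate one observation sequence from the HMM
x_true, y = np.zeros(T), np.zeros(T)
for t in range(T):
    x_true[t] = phi * (x_true[t - 1] if t else 0.0) + sig_x * rng.standard_normal()
    y[t] = x_true[t] + sig_y * rng.standard_normal()

def bootstrap_pf_logZ(y, N):
    """Bootstrap particle filter; returns the log of the unbiased ML estimate Z_hat."""
    x = sig_x * rng.standard_normal(N)       # initial particles from the prior
    logZ = 0.0
    for t in range(T):
        if t > 0:
            x = phi * x + sig_x * rng.standard_normal(N)        # propagate
        logw = -0.5 * ((y[t] - x) / sig_y) ** 2 - 0.5 * np.log(2 * np.pi * sig_y**2)
        logZ += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
        w = np.exp(logw - logw.max()); w /= w.sum()
        x = x[rng.choice(N, size=N, p=w)]    # multinomial resampling
    return logZ

M = 100                                      # number of independent filter copies
logZs = np.array([bootstrap_pf_logZ(y, N) for _ in range(M)])
# second moment estimated on the log scale for numerical stability
log_second_moment = 2 * logZs.max() + np.log(np.mean(np.exp(2 * (logZs - logZs.max()))))
print("log of the estimated second moment of Z_hat:", log_second_moment)
```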
We support our investigations with several numerical examples, including Bayesian
inference for a heteroscedastic regression model and inference for a Lotka-Volterra
based HMM.
|