Title: Model misspecification and general Bayesian bootstraps
Author: Lyddon, Simon
ISNI: 0000 0004 8502 9143
Awarding Body: University of Oxford
Current Institution: University of Oxford
Date of Award: 2018
Availability of Full Text: Full text unavailable from EThOS; access via the awarding institution.
This thesis explores how a Bayesian should update their beliefs in the knowledge that any model available to them, however useful, is wrong. A number of connections are made between ideas in the model misspecification, Bayesian nonparametrics and bootstrapping literatures. These connections lead to novel methods for scalable Bayesian inference under model misspecification for a family of general Bayesian parameters.

In Chapter 1 we introduce the core topics of this thesis, in particular the foundations of statistical modelling, model misspecification and general Bayesian updating.

In Chapter 2 we revisit the weighted likelihood bootstrap. This method can be interpreted as sampling exactly from a Bayesian nonparametric posterior distribution, making only weak assumptions about the data-generating mechanism. The loss-likelihood bootstrap is developed as a generalisation of this idea to a wider family of parameters. By making a connection with general Bayesian updating, we provide a means of calibrating the parametric form of the general Bayesian posterior by matching asymptotic posterior information.

In Chapter 3 we present a novel framework for Bayesian nonparametric learning from Bayesian parametric models. A mixture of Dirichlet processes is well suited to incorporating information from a misspecified model into a nonparametric analysis. The properties of this method are examined, and a computationally scalable algorithm for posterior sampling, called the posterior bootstrap, is described. Extensions to the correction of approximate posterior distributions, and to more general parameters of interest, are pursued.

In Chapter 4 we explore two extensions to the work of Chapters 2 and 3. In the first, we apply our nonparametric learning framework to sparse regression problems and investigate the properties of nonparametric learning under a lasso objective function. In the second, we consider how the posterior bootstrap should be extended to parametric statistical models containing random effects. Both extensions rest on the notion of quantifying uncertainty by suitably randomising an objective function.

In Chapter 5 we conclude and discuss future work.
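The weighted likelihood bootstrap revisited in Chapter 2 admits a short illustration. The sketch below is our own, not taken from the thesis: it assumes a Gaussian location model, for which the randomly re-weighted maximum likelihood estimate of the mean has a closed form (the weighted average of the data), so each posterior draw reduces to a single weighted mean under freshly sampled weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: n observations from some unknown generating mechanism.
x = rng.normal(loc=2.0, scale=1.0, size=200)

def weighted_likelihood_bootstrap(x, n_draws=1000, rng=rng):
    """Draw approximate posterior samples for the mean of a Gaussian
    location model by maximising a randomly re-weighted log-likelihood.
    Exponential(1) weights are Dirichlet weights up to normalisation,
    and for the Gaussian mean the weighted MLE is the weighted average."""
    n = len(x)
    draws = np.empty(n_draws)
    for b in range(n_draws):
        w = rng.exponential(scale=1.0, size=n)  # random re-weighting of the data
        draws[b] = np.average(x, weights=w)     # weighted MLE of the mean
    return draws

samples = weighted_likelihood_bootstrap(x)
print(samples.mean(), samples.std())
```

The same loop structure carries over to the loss-likelihood bootstrap by replacing the closed-form weighted mean with a numerical minimiser of a weighted loss; the only change is the objective being re-weighted.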
Supervisor: Holmes, Chris
Sponsor: EPSRC OxWaSP CDT; Engineering and Physical Sciences Research Council
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:
DOI: Not available
Keywords: Statistics