Title: Topics in the probabilistic solution of ordinary differential equations
Author: Teymur, Onur
ISNI: 0000 0004 7659 1636
Awarding Body: Imperial College London
Current Institution: Imperial College London
Date of Award: 2019
This thesis concerns several new developments in the probabilistic solution of ordinary differential equations. Probabilistic numerical methods are distinguished from their classical counterparts by the key property of returning a probability measure as output, rather than simply a point value. When properly calibrated, this measure can be taken to represent, probabilistically, the output uncertainty arising from the application of the numerical procedure. After giving some introductory context, we begin with a concise survey of the still-developing field of probabilistic ODE solvers, highlighting how several different paradigms have developed somewhat in parallel.

One of these paradigms, established by Conrad et al. (2016), defines randomised one-step solvers for initial value problems, whose outputs are empirical measures arising from Monte Carlo repetitions of the algorithm. We extend this approach to multistep solvers of Adams-Bashforth type using a novel Gaussian process construction. The properties of this method are explored and its convergence is rigorously proved.

We continue by defining a class of implicit probabilistic ODE solvers, the first such methods in the literature. Unlike explicit methods, these modified Adams-Moulton algorithms incorporate information from the ODE dynamics beyond the current time-point, and are thereby able to improve the accuracy of the probabilistic model of numerical error. In their full form, they output a non-parametric description of the stepwise error, though we also propose a parametric approximation that aids computation. Once again, we explore the properties of the method and prove its convergence in the small step-size limit. We follow with a discussion of the problem of calibration for these classes of algorithms, and generalise a proposal from Conrad et al. in order to implement it for our methods.
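To make the randomised one-step construction concrete, the following is a minimal sketch in the spirit of Conrad et al. (2016): each deterministic step (here, forward Euler, with an illustrative noise scale of order h^{p+1/2}) is perturbed by Gaussian noise, and Monte Carlo repetitions of the whole solve yield an empirical measure over trajectories. The function names, noise calibration, and test model are assumptions for illustration, not the thesis's own implementation.

```python
import numpy as np

def randomised_euler(f, y0, t0, t1, h, scale=1.0, rng=None):
    """One Monte Carlo draw of a randomised Euler solve: each step is
    perturbed by zero-mean Gaussian noise whose standard deviation
    shrinks with the step size (scale and exponent are illustrative)."""
    rng = np.random.default_rng() if rng is None else rng
    n = int(round((t1 - t0) / h))
    ts = np.linspace(t0, t1, n + 1)
    ys = [np.atleast_1d(np.asarray(y0, dtype=float))]
    for t in ts[:-1]:
        y = ys[-1]
        # Perturbation sd ~ h^{p + 1/2} with p = 1 for Euler.
        xi = rng.normal(0.0, scale * h**1.5, size=y.shape)
        ys.append(y + h * f(t, y) + xi)
    return ts, np.array(ys)

def empirical_measure(f, y0, t0, t1, h, n_draws=200, rng=None):
    """Repeated randomised solves give an empirical measure over
    trajectories, the solver's probabilistic output."""
    rng = np.random.default_rng(0) if rng is None else rng
    return [randomised_euler(f, y0, t0, t1, h, rng=rng)[1]
            for _ in range(n_draws)]
```

Applied to, say, y' = -y with y(0) = 1, the spread of the empirical measure at each time-point quantifies the uncertainty attributable to the discretisation.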
We then apply the new integrators to two test differential equation models, first in the solution of the forward model, then later in the setting of a Bayesian inverse problem. We contrast the effect of using probabilistic integrators instead of classical ones on posterior inference over the model parameters, as well as derived functions of the forward solution. We conclude with a brief discussion on the advantages and shortcomings of the proposed methods, and posit several suggestions for potential future research.
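The inverse-problem setting can be sketched as follows: a probabilistic integrator is placed inside the likelihood, and re-solving with a fresh random draw at every evaluation folds solver uncertainty into the posterior over the model parameter. The scalar test model, observation noise, and Metropolis settings below are illustrative assumptions and do not reproduce the thesis's experiments.

```python
import numpy as np

def randomised_euler_final(theta, h=0.05, scale=1.0, rng=None):
    """Final value of one randomised-Euler draw for y' = -theta*y,
    y(0) = 1 on [0, 1] (an illustrative forward model)."""
    rng = np.random.default_rng() if rng is None else rng
    y, n = 1.0, int(round(1.0 / h))
    for _ in range(n):
        y += h * (-theta * y) + rng.normal(0.0, scale * h**1.5)
    return y

def metropolis(data, sigma_obs=0.05, n_iter=2000, step=0.2, seed=0):
    """Random-walk Metropolis over theta. Each likelihood evaluation
    uses a fresh stochastic solve, so the numerical error of the
    integrator is propagated into the posterior (a pseudo-marginal-style
    sketch, not the thesis's algorithm)."""
    rng = np.random.default_rng(seed)
    theta, chain = 1.0, []

    def loglik(th):
        resid = data - randomised_euler_final(th, rng=rng)
        return -0.5 * (resid / sigma_obs) ** 2

    ll = loglik(theta)
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        ll_prop = loglik(prop) if prop > 0 else -np.inf
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(theta)
    return np.array(chain)
```

Comparing the resulting posterior with one obtained from a classical (deterministic) solver in the likelihood illustrates the contrast studied in the thesis: the probabilistic integrator typically widens the posterior to reflect discretisation error, rather than allowing it to concentrate on a biased value.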
Supervisor: Calderhead, Ben
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral