Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.780689
Title: Neural networks for inference, inference for neural networks
Author: Webb, Stefan Douglas
ISNI: 0000 0004 7966 3308
Awarding Body: University of Oxford
Current Institution: University of Oxford
Date of Award: 2018
Availability of Full Text: Full text unavailable from EThOS; access may be available via the awarding institution.
Abstract:
Bayesian statistics is a powerful framework for modeling the world and reasoning under uncertainty. It provides a principled method for representing prior knowledge and updating that knowledge in the light of new information. Traditional Bayesian statistics, however, has been limited to simple models, chiefly by two factors: the expressiveness and flexibility of the probability distributions used, and the computational cost of performing inference and model learning. In this thesis, we consider how neural networks (NNs) can help with both of these problems. In particular, we look at how NNs can assist in the inference process, and how we can perform inference over flexible NN models.

NNs are helpful for Bayesian inference in generative models. They can produce the flexible variational families required for successful variational inference (VI), and can learn distributed representations of model variables that guide the distributional relationships of the model. Inference, in turn, is useful for quantifying uncertainty in NN discriminative models. For instance, with "Bayesian NNs" one can use inference to learn a distribution over the NN's parameters and hence quantify uncertainty over the model's predictions. This increased flexibility in modeling and inference, however, brings challenges in inference and representation.

We present three pieces of original work in this thesis towards solving these challenges. First, we present an algorithm that constructs the factorization of a variational approximation in an optimal way, improving the fidelity and scalability of VI. Second, we develop a framework for distributed Bayesian learning that is particularly useful for large Bayesian NNs and is less prone to the stale-gradient problem of non-Bayesian approaches. Finally, we consider how Bayesian inference can be applied to NNs in a non-standard context, by reinterpreting the problem of estimating NN robustness as an inference problem.
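To make the abstract's central ideas concrete, here is a minimal sketch of variational inference with a reparameterised Gaussian approximation, written in PyTorch. This is an illustration only, not the thesis's own algorithm; the toy model, initial values, and learning rate are all assumptions chosen for the example.

```python
# Minimal VI sketch (illustrative only). Toy model: x_i ~ Normal(mu, 1)
# with prior mu ~ Normal(0, 1); variational family q(mu) = Normal(m, softplus(s)).
import torch
import torch.nn.functional as F

torch.manual_seed(0)
data = torch.randn(50) + 2.0                      # observations centred near 2

m = torch.tensor(0.0, requires_grad=True)         # variational mean
s = torch.tensor(0.0, requires_grad=True)         # unconstrained scale
opt = torch.optim.Adam([m, s], lr=0.05)

for step in range(2000):
    opt.zero_grad()
    q = torch.distributions.Normal(m, F.softplus(s))
    mu = q.rsample()                              # reparameterised sample: gradients flow
    log_joint = (torch.distributions.Normal(mu, 1.0).log_prob(data).sum()
                 + torch.distributions.Normal(0.0, 1.0).log_prob(mu))
    elbo = log_joint - q.log_prob(mu)             # single-sample ELBO estimate
    (-elbo).backward()                            # maximise ELBO by minimising -ELBO
    opt.step()

print(m.item(), F.softplus(s).item())             # ~ posterior mean and std
```

Because this toy model is conjugate, the exact posterior is Normal(n·x̄/(n+1), 1/√(n+1)), so the fitted q can be checked against it. Similarly, the "Bayesian NN" idea the abstract mentions can be sketched as follows: given some factorised Gaussian posterior over network weights (assumed here rather than learned), predictive uncertainty falls out of Monte Carlo sampling of the weights. All shapes and values below are hypothetical.

```python
# Predictive uncertainty from an assumed factorised Gaussian weight posterior.
import torch

torch.manual_seed(0)
w_mean, w_std = torch.zeros(1, 8), torch.full((1, 8), 0.3)   # assumed posterior
v_mean, v_std = torch.zeros(8, 1), torch.full((8, 1), 0.3)

def sample_net(x):
    """One forward pass of a tiny network with weights drawn from the posterior."""
    w = torch.distributions.Normal(w_mean, w_std).sample()
    v = torch.distributions.Normal(v_mean, v_std).sample()
    return torch.tanh(x @ w) @ v

x = torch.linspace(-2, 2, 5).unsqueeze(1)          # 5 test inputs
preds = torch.stack([sample_net(x) for _ in range(200)])
print(preds.mean(0).squeeze(), preds.std(0).squeeze())  # predictive mean and spread
```

In both sketches the inference machinery is generic; the thesis's contributions concern how to structure and scale such approximations, not these toy examples.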
Supervisor: Mudigonda, Pawan K. ; Teh, Yee Whye
Sponsor: Engineering and Physical Sciences Research Council
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.780689
DOI: Not available
Keywords: Deep learning ; Bayesian statistics ; Machine learning