Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.773925
Title: Bayesian models for scalable machine learning
Author: Perrone, Valerio
ISNI: 0000 0004 7961 1591
Awarding Body: University of Warwick
Current Institution: University of Warwick
Date of Award: 2018
Abstract:
In recent years we have witnessed an unprecedented increase in the volume and complexity of the available data. To reason about the world and address scientific questions, we need to build expressive and scalable models that can leverage this extraordinary amount of information. The Bayesian paradigm is an essential tool to formalize beliefs and quantify uncertainty, and is one of the cornerstones of machine learning. In this work, we develop new models for Bayesian machine learning that are able to tackle a number of challenges posed by large-scale data. We first introduce a novel Bayesian nonparametric latent feature model for dependent data. Latent feature models aim to explain data in terms of a smaller set of hidden components that are responsible for the properties of the objects we observe. However, most latent feature models either fix the total number of features in advance or impose unrealistic exchangeability assumptions on the data. We overcome these challenges by constructing a novel time-evolving Bayesian nonparametric prior that learns the number of features from the data while modeling their time evolution explicitly. In the second part of the thesis, we aim to combine the power of deep neural networks to learn useful representations with the ability of Bayesian learning to provide uncertainty estimates. To this end, we first design robust stochastic gradient algorithms for scalable inference in Bayesian neural network models. We then consider a simpler model consisting of a Bayesian linear regression layer built on top of a deep neural network. We demonstrate the benefits of this approach in a Bayesian optimization setting, scaling to millions of data points and transferring knowledge across tasks. Finally, we establish a deep learning framework for large-scale Bayesian inference and apply it to fully harness raw population genomic data.
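To give a sense of the Bayesian linear regression layer mentioned in the second part of the abstract, below is a minimal illustrative sketch, not the thesis implementation: it uses a fixed random tanh feature map as a stand-in for the learned last-layer features of a deep network, and the toy data, prior precision alpha, and noise precision beta are assumptions made here purely for illustration.

import numpy as np

# Minimal sketch: Bayesian linear regression on top of fixed "deep" features.
# The feature map phi(x) below stands in for the last hidden layer of a
# trained network; in the setting described in the abstract those features
# would come from a deep neural network.

rng = np.random.default_rng(0)

def phi(x, W, b):
    """Random tanh features used as a stand-in for learned network features."""
    return np.tanh(x @ W + b)

# Toy 1-D regression data (illustrative assumption).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

# "Network": a fixed random hidden layer with D units.
D = 50
W = rng.standard_normal((1, D))
b = rng.standard_normal(D)
Phi = phi(X, W, b)                          # (N, D) design matrix of features

# Conjugate Gaussian model on the output weights:
#   w ~ N(0, alpha^{-1} I),   y | x, w ~ N(w^T phi(x), beta^{-1})
alpha, beta = 1.0, 1.0 / 0.1**2             # assumed hyperparameters

# Closed-form posterior over the last-layer weights.
A = alpha * np.eye(D) + beta * Phi.T @ Phi  # posterior precision
Sigma = np.linalg.inv(A)                    # posterior covariance
mu = beta * Sigma @ Phi.T @ y               # posterior mean

# Analytic predictive mean and variance at new inputs.
X_new = np.linspace(-3, 3, 5).reshape(-1, 1)
Phi_new = phi(X_new, W, b)
pred_mean = Phi_new @ mu
pred_var = 1.0 / beta + np.einsum("nd,de,ne->n", Phi_new, Sigma, Phi_new)

print(np.c_[pred_mean, np.sqrt(pred_var)])

Because the posterior over the last-layer weights is available in closed form, predictive means and variances at new inputs are cheap to compute, which is what makes such a layer attractive inside a large-scale Bayesian optimization loop.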
Supervisor: Not available
Sponsor: Engineering and Physical Sciences Research Council
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.773925
DOI: Not available
Keywords: QA Mathematics