Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.770145
Title: Sparse machine learning methods for autonomous decision making
Author: Kuzin, Danil
ISNI:       0000 0004 7651 3523
Awarding Body: University of Sheffield
Current Institution: University of Sheffield
Date of Award: 2018
Abstract:
Sparse regression methods are used to reconstruct compressed signals, which are usually sparse in some basis, and in feature selection problems, where only a few features are meaningful. This thesis reviews existing Bayesian methods for dealing with sparsity, improves them, and proposes new models for these problems. The novel models reduce complexity, allow structure to be modelled, and provide uncertainty distributions in applications such as medicine and computer vision.

The thesis starts by exploring Bayesian sparsity for the problem of compressive background subtraction. Sparsity arises naturally in this problem because the foreground usually occupies only a small part of the video frame. The use of Bayesian compressive sensing improves the solutions in both independent and multi-task scenarios. It also raises the important problem of exploiting the structure of the data, as foreground pixels are usually clustered in groups. The problem of modelling structure in sparse problems is addressed with hierarchical Gaussian processes, a Bayesian way of imposing structure without specifying its exact patterns. Full Bayesian inference based on expectation propagation is provided for both offline and online algorithms. The experiments demonstrate the applicability of these methods to compressed background subtraction and brain activity localisation problems.

The majority of sparse Bayesian methods are computationally intensive. This thesis proposes a novel sparse regression method based on Bayesian neural networks. It makes the prediction operation fast and additionally estimates the uncertainty of predictions, at the cost of a longer training phase. The results are demonstrated in an active learning scenario, where the estimated uncertainty is used for experiment design. Sparse methods are also used within other methods, such as Gaussian processes, that suffer from high computational complexity. The use of active sparse subsets of data improves performance on large datasets. The thesis proposes a method for dealing with this complexity problem under online data updates using Bayesian filtering.
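The compressed-signal reconstruction setting described above can be illustrated with a minimal NumPy sketch. It recovers a sparse signal from a small number of random linear measurements using generic iterative soft-thresholding (ISTA), an assumed stand-in for illustration only; the thesis develops Bayesian sparse models, not this point-estimate solver, and all names and problem sizes below are illustrative choices.

```python
import numpy as np

# Toy compressed-sensing setup: a length-100 signal with 3 nonzeros,
# observed through 40 random linear measurements (sizes are illustrative,
# not taken from the thesis).
rng = np.random.default_rng(0)
n, m, k = 100, 40, 3
x_true = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(1.0, 2.0, size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)    # random measurement matrix
y = A @ x_true                              # compressed observations

# Recover with ISTA: a standard point-estimate solver for the
# l1-regularised least-squares problem min_x 0.5*||Ax - y||^2 + lam*||x||_1.
lam = 0.05                                  # l1 penalty weight
L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(1000):
    z = x - A.T @ (A @ x - y) / L           # gradient step on the data fit
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold

# The k largest recovered coefficients sit on the true support.
recovered_support = set(np.argsort(np.abs(x))[-k:])
```

The soft-thresholding step is what produces exact zeros in the estimate, mirroring the sparsity that Bayesian sparse priors induce probabilistically.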
Supervisor: Mihaylova, Lyudmila ; Kadirkamanathan, Visakan
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.770145
DOI: Not available