Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.487598
Title: Fast variational methods for non-Gaussian likelihoods
Author: King, Nathaniel John
ISNI: 0000 0001 3599 8286
Awarding Body: University of Sheffield
Current Institution: University of Sheffield
Date of Award: 2008
Availability of Full Text:
Abstract:
In this thesis we present a modular algorithm for supervised learning which we refer to as probabilistic
point assimilation (PPA). Three interpretations of the PPA framework are developed: a
linear, a regularized linear and a Gaussian process model. Learning tasks are performed by 'plugging
in' different noise models. These noise models can be included easily, without recalculation
of the model as a whole, by computing just three simple forms found from a univariate
Gaussian integral involving that noise model. Experiments show that PPA performs comparably
to other state-of-the-art algorithms on both binomial and ordinal classification
problems.
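As a rough illustration only (not taken from the thesis), the sketch below assumes the three simple forms are the univariate Gaussian expectation of the log noise model and its derivatives with respect to the Gaussian's mean and variance, approximated by Gauss-Hermite quadrature; the function names and the particular choice of quantities are assumptions made for this example.

```python
import numpy as np
from scipy.stats import norm

def gaussian_expectations(log_noise_model, y, mean, var, n_points=20):
    """Approximate E_q[log p(y | f)] under q(f) = N(mean, var), together with
    its derivatives w.r.t. mean and var, using Gauss-Hermite quadrature.

    `log_noise_model(y, f)` can be any plug-in log noise model; only these
    three univariate Gaussian expectations are required, not a re-derivation
    of the surrounding model.
    """
    x, w = np.polynomial.hermite.hermgauss(n_points)  # quadrature nodes/weights
    f = mean + np.sqrt(2.0 * var) * x                 # map nodes to N(mean, var)
    g = log_noise_model(y, f)
    w = w / np.sqrt(np.pi)                            # normalise the weights

    expectation = np.sum(w * g)
    # Derivatives follow from differentiating the Gaussian density under the
    # integral sign and reusing the same quadrature nodes.
    d_mean = np.sum(w * g * (f - mean)) / var
    d_var = np.sum(w * g * ((f - mean) ** 2 - var)) / (2.0 * var ** 2)
    return expectation, d_mean, d_var

# Example: plug in a probit noise model for binomial classification.
probit_log_lik = lambda y, f: norm.logcdf(y * f)
print(gaussian_expectations(probit_log_lik, y=1.0, mean=0.3, var=0.5))
```

Swapping in, say, an ordinal noise model would only change `log_noise_model`; the quadrature and the rest of the surrounding algorithm are untouched, which is the modularity the abstract describes.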
In the second part we introduce our speed-up approach for variational methods, which we call
KL correction. KL correction produces a tighter bound and allows the parameters of interest to
interact directly during optimisation. Considering multiple KL corrections allows us to
develop multiple KL-corrected bounds which can be switched in and out to cater for parameters
that did not fall under the previous KL-corrected bound. We show that KL correction dramatically
improves the speed of convergence of the PPA model over its original formulation. In the case of
multiple KL corrections, the approach not only improves convergence for PPA but also produces a fully tractable
and modular algorithm.
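Schematically, and only as a generic illustration of why such a correction tightens the bound (the thesis's exact KL-corrected bound is not reproduced here), a standard variational lower bound and a collapsed counterpart can be written as:

```latex
% Standard variational lower bound with approximate posterior q(f)
% and hyperparameters \theta:
\log p(\mathbf{y}\mid\theta) \;\ge\; \mathcal{L}(q,\theta)
  \;=\; \mathbb{E}_{q(\mathbf{f})}\big[\log p(\mathbf{y}\mid\mathbf{f})\big]
        \;-\; \mathrm{KL}\big(q(\mathbf{f})\,\|\,p(\mathbf{f}\mid\theta)\big)

% A collapsed bound maximises over q analytically, so it is at least as
% tight for every \theta and lets \theta interact directly with the
% (implicitly optimal) variational distribution:
\mathcal{L}_{\mathrm{KLC}}(\theta) \;=\; \max_{q}\,\mathcal{L}(q,\theta)
  \;\ge\; \mathcal{L}(q',\theta) \quad \text{for any fixed } q'
```

Maximising over the variational distribution pointwise in the hyperparameters gives a bound that is never looser, which is the generic sense in which a KL-corrected bound is "tighter" and why optimising the parameters of interest against it can converge faster than alternating updates.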
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.487598
DOI: Not available