Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.784646
Title: Reproducing-kernel Hilbert space regression with notes on the Wasserstein distance
Author: Page, Stephen
ISNI:       0000 0004 7970 1933
Awarding Body: Lancaster University
Current Institution: Lancaster University
Date of Award: 2019
Abstract:
We study kernel least-squares estimators for the regression problem subject to a norm constraint. We bound the squared L2 error of our estimators with respect to the covariate distribution. We also bound the worst-case squared L2 error of our estimators over a Wasserstein ball of probability measures centred at the covariate distribution, which leads us to investigate the extreme points of Wasserstein balls.

In Chapter 3, we prove bounds on our estimators both when the regression function is unbounded and when it is bounded. When the regression function is bounded, we clip the estimators so that they are closer to the regression function. In this setting, we also use training and validation to adaptively select the size of our norm constraint from the data.

In Chapter 4, we study a different adaptive estimation procedure, the Goldenshluger--Lepski method. Unlike training and validation, this method uses all of the data to create estimators for a range of norm-constraint sizes and then selects a final estimator by pairwise comparisons. This lets us adaptively select both the size of our norm constraint and a kernel.

In Chapter 5, we examine the extreme points of Wasserstein balls. We show that the only extreme points not on the surface of the ball are the Dirac measures. We then give conditions under which points on the surface of the ball are, or are not, extreme points.

In Chapter 6, we bound the worst-case squared L2 error of our estimators over a Wasserstein ball of probability measures centred at the covariate distribution. We prove bounds both when the regression function is unbounded and when it is bounded. We also investigate the analysis and computation of alternative estimators.
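A norm-constrained kernel least-squares estimator of the kind described in the abstract is closely related, via a Lagrangian penalty, to kernel ridge regression: the constraint on the RKHS norm becomes a penalty term λ‖f‖², and the minimiser lies in the span of the kernel functions at the covariates. A minimal pure-Python sketch of this penalised form is below; the Gaussian kernel, the toy data, and all parameter values are illustrative assumptions, not taken from the thesis.

```python
import math

def gauss_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel on the real line (illustrative choice)."""
    return math.exp(-(x - y) ** 2 / (2 * sigma ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def krr_fit(xs, ys, lam=0.1, sigma=1.0):
    """Penalised kernel least squares: solve (K + lam * I) alpha = y.

    The penalty lam plays the role of the Lagrange multiplier for a
    norm constraint on the estimator in the RKHS.
    """
    n = len(xs)
    K = [[gauss_kernel(xs[i], xs[j], sigma) + (lam if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    return solve(K, list(ys))

def krr_predict(xs, alpha, x, sigma=1.0):
    """Estimator f(x) = sum_i alpha_i k(x_i, x)."""
    return sum(a * gauss_kernel(xi, x, sigma) for a, xi in zip(alpha, xs))

def rkhs_norm_sq(xs, alpha, sigma=1.0):
    """Squared RKHS norm of the estimator: alpha^T K alpha."""
    return sum(ai * aj * gauss_kernel(xi, xj, sigma)
               for ai, xi in zip(alpha, xs)
               for aj, xj in zip(alpha, xs))
```

With a very small penalty the estimator nearly interpolates the data, at the cost of a larger RKHS norm; increasing `lam` (equivalently, shrinking the norm constraint) trades fit for smoothness, which is the tension the adaptive procedures in Chapters 3 and 4 resolve by choosing the constraint size from the data.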
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.) Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.784646  DOI: