Use this URL to cite or link to this record in EThOS: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.492241
Title: Bias reduction in exponential family nonlinear models
Author: Kosmidis, Ioannis
ISNI: 0000 0001 3601 9713
Awarding Body: University of Warwick
Current Institution: University of Warwick
Date of Award: 2007
Abstract:
The modified-score-functions approach to bias reduction (Firth, 1993) continues to gain in popularity (e.g. Mehrabi & Matthews, 1995; Pettitt et al., 1998; Heinze & Schemper, 2002; Bull et al., 2002; Zorn, 2005; Sartori, 2006; Bull et al., 2007) because of the superior properties of the bias-reduced estimator over the traditional maximum likelihood estimator, particularly in models for categorical responses. Most of this activity has concerned logistic regression, where the bias-reduction method corresponds neatly to penalization of the likelihood by the Jeffreys prior, and the bias-reduced estimates are always finite and beneficially shrink towards the origin. The recent applied and methodological interest in the bias-reduction method motivates this thesis, whose aim is to explore the nature of the method and to widen its applicability, identifying cases where bias reduction is beneficial. In particular, the thesis focuses on three targets: i) to explore the nature of the bias-reducing modifications to the efficient scores and to obtain results that facilitate the application and theoretical assessment of the bias-reduction method; ii) to establish theoretically that the bias-reduction method should be considered an improvement over traditional maximum likelihood for logistic regression; iii) to move beyond the flat exponential family and explore the effect of bias reduction in some commonly used curved models for categorical responses.
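To illustrate the correspondence mentioned in the abstract, the following is a minimal sketch (not the author's code) of Firth-type bias-reduced logistic regression, solving the Jeffreys-prior-penalized, i.e. modified-score, equations by Fisher scoring. The function name firth_logistic, the simple update scheme, and the toy separated data set are assumptions made for illustration only.

```python
# Minimal sketch of Firth-type bias-reduced logistic regression,
# assuming a design matrix X (n x p, including an intercept column)
# and a 0/1 response y.  Illustrative only, not the thesis code.
import numpy as np

def firth_logistic(X, y, max_iter=100, tol=1e-8):
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(max_iter):
        eta = X @ beta
        pi = 1.0 / (1.0 + np.exp(-eta))
        W = pi * (1.0 - pi)                       # Fisher weights
        XtWX = X.T @ (W[:, None] * X)             # Fisher information I(beta)
        XtWX_inv = np.linalg.inv(XtWX)
        # leverages h_i: diagonal of W^{1/2} X (X'WX)^{-1} X' W^{1/2}
        h = np.einsum('ij,jk,ik->i', X, XtWX_inv, X) * W
        # Firth-modified score: X'(y - pi + h (1/2 - pi))
        score = X.T @ (y - pi + h * (0.5 - pi))
        step = XtWX_inv @ score                   # Fisher-scoring update
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(30), rng.normal(size=30)])
    # completely separated toy data: ML estimates diverge,
    # bias-reduced estimates remain finite
    y = (X[:, 1] > 0).astype(float)
    print(firth_logistic(X, y))
```

On the toy completely separated data, ordinary maximum likelihood estimates diverge, whereas the bias-reduced estimates stay finite and shrink towards the origin, which is the behaviour the abstract describes.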
Supervisor: Not available
Sponsor: Not available
Qualification Name: University of Warwick, 2007
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.492241
DOI: Not available