Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.791670
Title: Derivative-free algorithms for nonlinear optimisation problems
Author: Roberts, Lindon
ISNI: 0000 0004 8502 9856
Awarding Body: University of Oxford
Current Institution: University of Oxford
Date of Award: 2019
Availability of Full Text: Full text unavailable from EThOS; access is via the awarding institution.
Abstract:
Minimising a nonlinear function is a ubiquitous problem in applications, and standard algorithms need accurate evaluations of the function and its derivatives. However, if the function is black-box, expensive to evaluate, or noisy, it may be impractical or even impossible to obtain accurate derivatives, and we require derivative-free optimisation (DFO). Model-based DFO methods perform well in practice, taking many features from derivative-based methods, but can have lower performance for noisy problems and a high linear algebra cost. In this thesis, we aim to improve the flexibility, robustness and scalability of state-of-the-art methods for model-based DFO by designing, analysing, implementing and comprehensively testing three new algorithms for nonlinear optimisation, with a special focus on nonlinear least-squares problems (such as parameter fitting). The first, DFO-LS, is designed for nonlinear least-squares problems, and is simpler than existing methods, constructing linear residual models with a flexible approach. DFO-LS can benefit from a reduced initialisation cost, and has improved scalability and runtime over existing methods without a performance penalty. DFO-LS also has features for noisy problems, including a novel multiple restarts mechanism, which are low cost in evaluations and linear algebra, giving DFO-LS better performance than existing techniques for handling noise. We then introduce Py-BOBYQA, an extension of BOBYQA (Powell, 2009) with a similar multiple restarts mechanism to DFO-LS, amongst other new enhancements. It matches or outperforms BOBYQA and other state-of-the-art solvers on both smooth and noisy problems. We also propose a simple extension of Py-BOBYQA that may enable it to escape local minima and progress towards the global optimum. Lastly, for large-scale nonlinear least-squares problems we introduce DFBGN, the first model-based DFO method to optimise in low-dimensional subspaces at each iteration. DFBGN has lower linear algebra costs and improved runtime compared to DFO-LS, and hence better performance on large-scale problems, as it can perform many more iterations within a reasonable runtime limit. It can also make progress under very small evaluation budgets, with a low runtime cost.
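The parameter-fitting use case mentioned in the abstract can be made concrete with a short sketch using the released DFO-LS package, whose documented entry point dfols.solve takes a function returning the residual vector and a starting point. The exponential model, synthetic data and parameter names below are illustrative assumptions for this record, not examples taken from the thesis.

# Illustrative sketch (assumed example): fit (a, b) in y ~ exp(-a*t) + b to noisy
# synthetic data by derivative-free nonlinear least-squares with DFO-LS.
import numpy as np
import dfols

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 25)
y = np.exp(-2.0 * t) + 0.5 + 0.01 * rng.standard_normal(t.size)  # true (a, b) = (2.0, 0.5)

def residuals(x):
    # DFO-LS minimises ||r(x)||^2 using only evaluations of the residual vector r(x)
    a, b = x
    return np.exp(-a * t) + b - y

x0 = np.array([1.0, 0.0])          # starting guess for (a, b)
soln = dfols.solve(residuals, x0)  # no derivatives of `residuals` are required

print(soln.x)  # fitted parameters
print(soln.f)  # final sum of squared residuals

Py-BOBYQA exposes an analogous pybobyqa.solve(objfun, x0) entry point for general objectives, where objfun returns a scalar value rather than a residual vector.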
Supervisor: Cartis, Coralia ; Fiala, Jan ; Marteau, Benjamin
Sponsor: Engineering and Physical Sciences Research Council
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.791670
DOI: Not available
Keywords: Mathematical optimization ; Numerical analysis