Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.438660
Title: Parameter optimisation for search heuristics via a barrier tree Markov model
Author: Benfold, William
ISNI: 0000 0001 3457 4385
Awarding Body: University of Southampton
Current Institution: University of Southampton
Date of Award: 2007
Abstract:
The quality of solution provided by a search heuristic on a particular problem is by no means an absolute value. Most heuristics are controlled by a set of parameters, upon which the performance is heavily dependent. We construct Markov models for search heuristics on specific problem instances, and model the relationship between the quality of search and the choice of parameters. For any problem instance of nontrivial size, the state space for this model is large enough to make computation infeasible. Our solution is to use a reduced-state model of the search space, amalgamating many similar points. We use a model based on a level-accessible barrier tree, grouping regions of the search space according to the reachability of minima. Optimal annealing and mutation schedules are produced, minimising either "where-you-are" or "best-so-far" cost, over binary perceptron, spin-glass and Max-SAT problems. The predictions of the model are found to be consistently over-optimistic; we discuss reasons for this and suggest some possible refinements to the model. A population-based variant of simulated annealing is briefly examined, where annealing temperature is adjusted according to performance. We later optimise the average first-passage time for several special-case heuristics, comparing the minimal times across a range of problems.
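The first-passage quantity mentioned in the abstract can be illustrated with a minimal sketch (not taken from the thesis itself): given a reduced-state Markov chain whose transient states stand for barrier-tree basins and whose absorbing state is the global optimum, the expected number of steps to absorption solves the linear system (I − Q)t = 1, where Q holds the transition probabilities among the transient states. All state counts and probabilities below are illustrative assumptions.

```python
def mean_first_passage_times(Q):
    """Solve (I - Q) t = 1 for the expected number of steps until
    absorption, where Q[i][j] is the probability of moving from
    transient state i to transient state j in one step."""
    n = len(Q)
    # Augmented matrix [I - Q | 1].
    A = [[(1.0 if i == j else 0.0) - Q[i][j] for j in range(n)] + [1.0]
         for i in range(n)]
    # Forward elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    # Back substitution.
    t = [0.0] * n
    for i in range(n - 1, -1, -1):
        t[i] = (A[i][n] - sum(A[i][j] * t[j]
                              for j in range(i + 1, n))) / A[i][i]
    return t

# Illustrative 3-basin chain; each row's missing probability mass is
# the chance of jumping straight to the optimum (the absorbing state).
Q = [[0.5, 0.3, 0.0],
     [0.2, 0.5, 0.2],
     [0.0, 0.3, 0.5]]
print(mean_first_passage_times(Q))
```

In a schedule-optimisation setting, Q would itself depend on the heuristic's parameters (e.g. annealing temperature), and the parameters would be chosen to minimise these expected times.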
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.438660
DOI: Not available