Title: Small nets and short paths : optimising neural computation
Author: Frean, Marcus Roland
Awarding Body: University of Edinburgh
Current Institution: University of Edinburgh
Date of Award: 1990
Availability of Full Text: unavailable from EThOS; access via the awarding institution.
The thesis explores two aspects of optimisation in neural network research.

1. The question of how to find the optimal feed-forward neural network architecture for learning a given binary classification is addressed. The so-called constructive approach, in which intermediate (hidden) units are built as required for the particular problem, is reviewed. Current constructive algorithms are compared, and three new methods are introduced. One of these, the Upstart algorithm, is shown to outperform all other constructive algorithms of this type. This work led on to the ancillary problem of finding a satisfactory procedure for changing the weight values of an individual unit in a network. The new thermal perceptron rule is described and shown to compare favourably with its competitors. Finally, the spectrum of possible learning rules is surveyed.

2. Neurobiologically inspired algorithms for mapping between spaces of different dimensions are applied to a classic optimisation problem, the Travelling Salesman Problem (TSP). Two new methods are described that can tackle the general symmetric form of the TSP, overcoming the restriction of other neural network algorithms to the geometric case.
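The thermal perceptron rule mentioned above damps the classic perceptron update by a factor that decays with the size of the unit's net input, so examples far from the decision boundary contribute little, and a temperature parameter is annealed towards zero over training. The sketch below illustrates that idea for a single threshold unit; the parameter names, the linear annealing schedule, and the exp(-|phi|/T) damping form are illustrative assumptions, not details taken from the thesis itself.

```python
import numpy as np

def thermal_perceptron(X, y, epochs=100, eta=1.0, T0=1.0, seed=0):
    """Train one threshold unit with a thermal-perceptron-style rule.

    Illustrative sketch: the mistake-driven perceptron update is scaled
    by exp(-|phi|/T), where phi is the net input, and the temperature T
    is annealed linearly to zero (schedule and constants assumed here).
    """
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    Xb = np.hstack([X, np.ones((n_samples, 1))])   # append a bias input
    w = rng.normal(scale=0.1, size=n_features + 1)
    for epoch in range(epochs):
        T = T0 * (1.0 - epoch / epochs)            # anneal temperature
        for i in rng.permutation(n_samples):
            phi = Xb[i] @ w                        # net input to the unit
            o = 1 if phi >= 0 else -1              # output in {-1, +1}
            if o != y[i]:                          # update only on mistakes
                w += eta * np.exp(-abs(phi) / max(T, 1e-9)) * y[i] * Xb[i]
    return w

# Usage: learn a linearly separable target (logical OR on {-1,+1} inputs)
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, 1])
w = thermal_perceptron(X, y)
preds = np.where(np.hstack([X, np.ones((4, 1))]) @ w >= 0, 1, -1)
```

Because the damping factor shrinks the step for confidently classified examples, the rule concentrates weight changes near the decision boundary, which is one way a single unit can make the best of a problem it cannot separate exactly.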
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:
DOI: Not available