Title: Analogue imprecision in MLPs: implications and learning improvements
Author: Edwards, Peter J.
Awarding Body: University of Edinburgh
Current Institution: University of Edinburgh
Date of Award: 1994
Availability of Full Text:
Access from EThOS:
Full text unavailable from EThOS. Please try the link below.
Access from Institution:
Analogue hardware implementations of the Multi-Layer Perceptron (MLP) have a limited precision that has a detrimental effect on the result of synaptic multiplication. With good design, however, the accuracy of the circuits can nonetheless be very high. This thesis investigates the consequences of this imprecision for the performance of the MLP, examining whether it is accuracy or precision that matters in neural computation. The results demonstrate that, far from being detrimental, the imprecision, or synaptic weight noise, enhances the quality of the learned solution: in particular, fault tolerance and generalisation ability are improved, and under certain conditions the learning trajectory of the training network is also improved. These enhancements are established through a mathematical analysis and verified experimentally. Simulation experiments examine the underlying mechanisms and probe the limitations of the technique as an enhancement scheme. For a variety of problems, precision is shown to be significantly less important than accuracy; indeed, imprecision can benefit learning performance.
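The idea of training through synaptic weight noise can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the multiplicative Gaussian noise model, the two-layer tanh network, the toy XOR-style data, and all function names here are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def noisy(W, noise_std):
    """Multiplicative Gaussian perturbation of a weight matrix: each
    synapse's effective value is W_ij * (1 + eps), eps ~ N(0, noise_std^2),
    loosely modelling the limited precision of an analogue multiplier."""
    return W * (1.0 + noise_std * rng.standard_normal(W.shape))

def train_step(x, t, W1, W2, lr=0.1, noise_std=0.0):
    """One presentation: draw a fresh noisy realisation of the weights and
    run both the forward and backward passes through it, so that training
    experiences the same imprecision the analogue hardware would exhibit."""
    W1n, W2n = noisy(W1, noise_std), noisy(W2, noise_std)
    h = np.tanh(x @ W1n)                 # hidden activations
    y = np.tanh(h @ W2n)                 # network output
    dy = (y - t) * (1.0 - y**2)          # backprop of squared error
    dh = (dy @ W2n.T) * (1.0 - h**2)
    W2 -= lr * (h.T @ dy)                # updates applied to the clean weights
    W1 -= lr * (x.T @ dh)
    return float(np.mean((y - t) ** 2))

# Hypothetical XOR-style task, for illustration only.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
W1 = 0.5 * rng.standard_normal((2, 8))
W2 = 0.5 * rng.standard_normal((8, 1))
errors = [train_step(X, T, W1, W2, noise_std=0.05) for _ in range(500)]
```

Note that a fresh noise realisation is drawn for every presentation; over many presentations the network is driven toward solutions that remain good under perturbation, which is one intuition for the improved fault tolerance and generalisation the abstract reports.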
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:
DOI: Not available