Title: Efficient neural network classification of magnetic resonance images of the breast
Author: Harte, T. P.
Awarding Body: University of Cambridge
Current Institution: University of Cambridge
Date of Award: 1998
Availability of Full Text:
Full text unavailable from EThOS.
Please contact the current institution’s library for further details.
This dissertation proposes a new method of automated malignancy recognition in contrast-enhanced magnetic resonance images of the human breast using the multi-layer perceptron (MLP) feed-forward neural network paradigm. The fundamental limitation is identified as being the efficiency of such a classifier: the computational budget demanded by multi-dimensional image data sets is immense. Without optimization the MLP flounders. This work proposes a new efficient algorithm for MLP classification of large multi-dimensional data sets based on fast discrete orthogonal transforms. This is possible given the straightforward observation that point-wise mask-processing of image data for classification purposes is linear spatial convolution. The novel observation, then, is that the MLP permits convolution at its input layer due to the linearity of the inner product which it computes. Optimized fast Fourier transforms (FFTs) are investigated and an order-of-magnitude improvement in the execution time of a four-dimensional transform is achieved over commonly-implemented FFTs. One of the principal bottlenecks in common multi-dimensional FFTs is observed to be the lack of attention paid to memory-hierarchy considerations. A simple, but fast, technique for optimizing cache performance is implemented. The abstract mathematical basis for convolution is investigated and a finite integer number theoretic transform (NTT) approach suggests itself, because such a transform can be defined that is fast, purely real, has parsimony of memory requirements, and has compact hardware realizations. A new method for multi-dimensional convolution with long-length number theoretic transforms is presented. This is an extension of previous work where NTTs were implemented over pseudo-Mersenne and pseudo-Fermat surrogate moduli.
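The central observation above can be illustrated concretely: the inner products an MLP's first layer computes over every sliding window of an image are exactly a spatial correlation, so all of them can be evaluated at once in the frequency domain. The following is a minimal two-dimensional sketch (the thesis works with larger, four-dimensional data sets); all names and sizes here are illustrative, not taken from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.standard_normal((64, 64))   # toy stand-in for one MR image slice
w = rng.standard_normal((5, 5))         # first-layer weight mask of one hidden unit

# Direct method: one inner product per window position ("valid" region),
# i.e. the point-wise mask-processing the abstract describes.
H, W = image.shape
h, wd = w.shape
direct = np.empty((H - h + 1, W - wd + 1))
for i in range(H - h + 1):
    for j in range(W - wd + 1):
        direct[i, j] = np.sum(image[i:i + h, j:j + wd] * w)

# FFT method: correlation is convolution with the flipped mask, which
# becomes a point-wise product in the frequency domain.
F = np.fft.rfft2(image)
G = np.fft.rfft2(w[::-1, ::-1], s=image.shape)
circular = np.fft.irfft2(F * G, s=image.shape)
# Indices h-1..H-1 of the circular convolution are free of wraparound
# and coincide with the "valid" correlation computed directly.
fft_result = circular[h - 1:H, wd - 1:W]

assert np.allclose(direct, fft_result)
```

The direct loop costs on the order of one mask-sized inner product per pixel, whereas the FFT route replaces all of it with a few transforms and one point-wise multiply, which is where the efficiency gain the abstract claims comes from.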
A suitable modulus is identified which allows long-length transforms that readily lend themselves to the multi-dimensional convolution problem involved in classifying large magnetic resonance image data sets.
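The NTT alternative mentioned above can be sketched in miniature. A number theoretic transform replaces the FFT's complex roots of unity with integer roots of unity modulo a suitable prime, giving exact, purely integer convolution. The modulus and transform length below (the Fermat prime 2^16 + 1, for which 3 is a primitive root) are illustrative choices for exposition only, not the long-length surrogate moduli developed in the thesis.

```python
# Cyclic integer convolution via a naive O(N^2) number theoretic transform.
P = (1 << 16) + 1  # Fermat prime 65537; 3 is a primitive root mod P

def ntt(a, invert=False):
    # Transform of length n, where n must divide P - 1 = 2^16.
    n = len(a)
    w = pow(3, (P - 1) // n, P)       # primitive n-th root of unity mod P
    if invert:
        w = pow(w, P - 2, P)          # inverse root, by Fermat's little theorem
    out = [sum(a[j] * pow(w, i * j, P) for j in range(n)) % P
           for i in range(n)]
    if invert:
        n_inv = pow(n, P - 2, P)      # scale by n^{-1} mod P
        out = [(x * n_inv) % P for x in out]
    return out

def ntt_convolve(x, y):
    # Exact cyclic convolution of equal-length integer sequences mod P:
    # transform, multiply point-wise, transform back.
    X, Y = ntt(x), ntt(y)
    return ntt([(a * b) % P for a, b in zip(X, Y)], invert=True)

# Zero-padding to length 8 makes the cyclic result equal the linear one.
x = [1, 2, 3, 4, 0, 0, 0, 0]
y = [5, 6, 7, 0, 0, 0, 0, 0]
print(ntt_convolve(x, y))  # [5, 16, 34, 52, 45, 28, 0, 0]
```

Because everything stays in integer arithmetic modulo P, the transform is purely real and exact, with no rounding error, which is what makes the approach attractive for the compact hardware realizations the abstract mentions.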
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:
DOI: Not available