Title: Evolution of a heterogeneous hybrid extreme learning machine
Author: Christou, Vasileios
ISNI:       0000 0004 8501 2034
Awarding Body: University of Manchester
Current Institution: University of Manchester
Date of Award: 2019
Hybrid optimization algorithms have gained popularity as it has become apparent that no single optimization strategy can be globally more beneficial than any other. Despite their popularity, hybridization frameworks require more detailed categorization regarding the nature of the problem domain, the constituent algorithms, the coupling schema and the intended area of application. This thesis proposes a hybrid algorithm named heterogeneous hybrid extreme learning machine (He-HyELM) for finding the optimal multi-layer perceptron (MLP) with one hidden layer for a specific problem. This is achieved by combining the extreme learning machine (ELM) training algorithm with an evolutionary computing (EC) algorithm. The research is complemented by a series of preliminary experiments, conducted prior to hybridization, that explore the characteristics of the ELM algorithm in depth. He-HyELM uses a pool of custom-created neurons which are embedded in a series of ELM-trained MLPs. A genetic algorithm (GA) then evolves these homogeneous networks into heterogeneous networks according to a fitness criterion. The GA utilizes a novel intelligent crossover operator that ranks each hidden-layer node in order to guide the evolution process. After the proposed He-HyELM algorithm is analysed in Chapter 5, an enhanced version, SA-He-HyELM, is presented in Chapter 6; it makes the mutation operator self-adaptive with the aim of reducing the number of parameters that need tuning. Both the He-HyELM and SA-He-HyELM approaches are evaluated on three regression and three classification real-world datasets. These experiments showed that both versions improved generalization compared with the best homogeneous network found during the ELM empirical study in Chapter 3. Finally, Chapter 7 summarizes the key findings and contributions of this work.
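The ELM training step the abstract refers to can be sketched as follows: hidden-layer weights are assigned at random and only the output weights are solved analytically via the Moore-Penrose pseudoinverse. This is a minimal illustrative sketch, not the thesis's implementation; the function names, the sigmoid activation and the toy regression target are assumptions made here for demonstration.

```python
import numpy as np

def elm_train(X, y, n_hidden=20, seed=None):
    """Train a single-hidden-layer network ELM-style:
    random hidden weights, closed-form output weights."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input-to-hidden weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # hidden-layer output matrix (sigmoid)
    beta = np.linalg.pinv(H) @ y                     # output weights via pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Usage: fit a noisy-free 1-D regression target (illustrative only).
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X[:, 0])
W, b, beta = elm_train(X, y, n_hidden=30, seed=0)
pred = elm_predict(X, W, b, beta)
mse = float(np.mean((pred - y) ** 2))
```

Because the hidden layer is never trained, the whole fit reduces to one least-squares solve, which is what makes ELM fast enough to sit inside a GA fitness evaluation loop as the thesis describes.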
Supervisor: Zhao, Liping ; Brown, Gavin
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:  DOI: Not available
Keywords: Hybrid extreme learning machine ; Genetic algorithm ; Regression problem ; Classification problem ; Artificial neural network ; Custom neuron