Title: Sequential recurrent connectionist algorithms for time series modeling of nonlinear dynamical systems
Author: Mirikitani, Derrick Takeshi
ISNI: 0000 0004 2691 2550
Awarding Body: Goldsmiths
Current Institution: Goldsmiths College (University of London)
Date of Award: 2010
Abstract: This thesis deals with the methodology of building data-driven models of nonlinear systems within the framework of dynamic modeling. More specifically, it focuses on sequential optimization of nonlinear dynamic models called recurrent neural networks (RNNs); in particular, it considers fully connected RNNs with one hidden layer of neurons for modeling nonlinear dynamical systems. The general objective is to improve sequential training of the RNN through sequential second-order methods, and to improve the generalization of the RNN through regularization. The contributions of the thesis can be summarized as follows:
1. A sequential Bayesian training and regularization strategy for recurrent neural networks, based on an extension of the Evidence Framework, is developed.
2. An efficient ensemble method for Sequential Monte Carlo filtering is proposed. The methodology allows for efficient O(H^2) sequential training of the RNN.
3. The Expectation Maximization (EM) framework is proposed for training RNNs sequentially.
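To make the setting concrete, the sketch below shows a fully connected RNN with one hidden layer trained sequentially, one observation at a time, with a second-order method. Note the particular method here is an extended Kalman filter over the weights with a finite-difference output Jacobian — a standard illustrative stand-in, not any of the three algorithms contributed by the thesis (Evidence-Framework regularization, the SMC ensemble, or EM). All sizes, noise parameters, and the toy sine-wave task are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- a fully connected RNN with one hidden layer of neurons (scalar input/output) ---
H = 4                                # hidden neurons (illustrative size)
n_w = H * H + H + H + H + 1          # W_hh, W_xh, b_h, w_out, b_out packed in one vector

def rnn_step(w, h, x):
    """One step of the RNN: returns the new hidden state and the scalar output."""
    i = 0
    W_hh = w[i:i + H * H].reshape(H, H); i += H * H
    W_xh = w[i:i + H];                   i += H
    b_h  = w[i:i + H];                   i += H
    w_o  = w[i:i + H];                   i += H
    b_o  = w[i]
    h_new = np.tanh(W_hh @ h + W_xh * x + b_h)
    return h_new, float(w_o @ h_new + b_o)

# --- sequential second-order training: EKF over the weight vector (illustrative) ---
# The weights are the EKF state; the output Jacobian dy/dw is approximated by
# finite differences with the hidden state held fixed. This ignores the recurrent
# sensitivity terms, a simplification made purely for brevity here.
w = rng.normal(0.0, 0.1, n_w)
P = np.eye(n_w) * 10.0               # weight covariance (assumed initial value)
Q = np.eye(n_w) * 1e-5               # process noise, keeps the filter adaptive
R = 1.0                              # assumed observation noise variance
h = np.zeros(H)
eps = 1e-6

series = np.sin(0.2 * np.arange(301))   # toy task: predict x[t+1] from x[t]
errors = []
for t in range(300):
    x, d = series[t], series[t + 1]
    _, y = rnn_step(w, h, x)
    # finite-difference Jacobian of the output w.r.t. each weight
    Hk = np.empty(n_w)
    for j in range(n_w):
        wp = w.copy()
        wp[j] += eps
        _, yp = rnn_step(wp, h, x)
        Hk[j] = (yp - y) / eps
    # standard EKF measurement update on the weights
    S = Hk @ P @ Hk + R
    K = (P @ Hk) / S
    w = w + K * (d - y)
    P = P - np.outer(K, Hk @ P) + Q
    h, _ = rnn_step(w, h, x)         # advance the hidden state with updated weights
    errors.append((d - y) ** 2)

print(f"MSE, first 50 steps: {np.mean(errors[:50]):.4f}")
print(f"MSE, last 50 steps:  {np.mean(errors[-50:]):.4f}")
```

Because the covariance update touches every weight pair, each step costs O(n_w^2) in the number of weights; the thesis's second contribution targets exactly this kind of cost, reducing sequential training to O(H^2).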
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:
DOI: Not available
Keywords: Computer Science