Title: Deep complex-valued neural networks for natural language processing
Author: Mönning, Nils
ISNI: 0000 0004 8501 4734
Awarding Body: University of York
Current Institution: University of York
Date of Award: 2019
This thesis presents novel work on complex-valued neural networks applied to Natural Language Processing. We experimentally show the validity of complex-valued neural networks for semantic and phonetic processing of natural languages, and highlight important issues that complex networks have in comparison to their real-valued counterparts. In particular, this work considers the tasks of Language Modelling, Semantic Similarity Judgement, Basic Question Answering, Phonetic Transcription and Automatic Speech Recognition. Our contributions are the translation of neural network building blocks to the complex plane and their experimental application in a variety of natural language tasks. We present criteria to compare real-valued and complex-valued neural networks for classification tasks. We present various complex embedding methods for words, which produce position- and frequency-based word representations that are trainable using language models and usable in downstream tasks. We also compare a real-valued and a complex-valued memory network used for Question Answering. We derive a quantum-inspired framework for languages and additionally demonstrate quantum-inspired Semantic Spaces. A general framework of complex-valued attention is presented in this thesis and used to derive spectral self-attention with a novel activation function. We also introduce two pooling functions to reduce the dimensionality of frequency-based representations. A Spectral Transformer architecture employs this spectral self-attention for Speech Recognition. This work also includes a novel dataset for the transcription of children's utterances, consisting of seven sub-tasks, each with fixed data splits and baselines for better comparison and reproducibility. Throughout this thesis we find that complex-valued neural networks are suitable for natural language tasks, but require additional care in their design and training.
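To illustrate what "translating neural network building blocks to the complex plane" typically involves, the sketch below implements a complex-valued affine layer using separate real and imaginary weight matrices. This is a minimal, hedged example and not code from the thesis; the names (`ComplexLinear`, `forward`) and the split real/imaginary implementation strategy are illustrative assumptions.

```python
import numpy as np

class ComplexLinear:
    """Illustrative complex-valued affine layer: y = W x + b with complex W, b.

    Not taken from the thesis -- a generic sketch of how a real-valued
    building block (the linear layer) can be lifted to the complex plane.
    """

    def __init__(self, in_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Keep real and imaginary parts of the weights as separate real matrices,
        # so standard real-valued optimizers can update them directly.
        self.W_re = rng.standard_normal((out_dim, in_dim))
        self.W_im = rng.standard_normal((out_dim, in_dim))
        self.b = np.zeros(out_dim, dtype=np.complex128)

    def forward(self, x):
        # Expand (W_re + i W_im)(x_re + i x_im):
        #   real part: W_re x_re - W_im x_im
        #   imag part: W_re x_im + W_im x_re
        x_re, x_im = x.real, x.imag
        y_re = self.W_re @ x_re - self.W_im @ x_im
        y_im = self.W_re @ x_im + self.W_im @ x_re
        return y_re + 1j * y_im + self.b

layer = ComplexLinear(4, 3)
x = np.array([1 + 2j, 0 - 1j, 0.5 + 0j, -1 + 1j])
y = layer.forward(x)
# Cross-check the split-arithmetic result against native complex matmul.
W = layer.W_re + 1j * layer.W_im
assert np.allclose(y, W @ x + layer.b)
```

The split real/imaginary formulation matters in practice because many framework primitives (initializers, normalization, some optimizers) assume real parameters; it is one of the design-and-training considerations the abstract alludes to.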
Supervisor: Manandhar, Suresh Sponsor: Not available
Qualification Name: Thesis (Ph.D.) Qualification Level: Doctoral
EThOS ID:
DOI: Not available