Title: Brainwave-based human emotion estimation using deep neural network models for biofeedback
Author: Liu, Jingxin
ISNI:       0000 0004 7961 7782
Awarding Body: Brunel University London
Current Institution: Brunel University
Date of Award: 2019
Emotion is a state that comprehensively represents human feeling, thought and behaviour, and thus plays an important role in interpersonal communication. Emotion estimation aims to automatically discriminate between different emotional states using physiological and non-physiological signals acquired from humans, in order to achieve effective communication and interaction between humans and machines. Brainwave-based emotion estimation is one of the most commonly used and efficient approaches in emotion estimation research, with applications in the treatment of human emotional disorders, brain-computer interfaces for people with disabilities, entertainment and many other areas. In this thesis, various methods, schemes and frameworks are presented for electroencephalogram (EEG) based human emotion estimation.

Firstly, a hybrid dimension feature reduction scheme is presented using a total of 14 different features extracted from EEG recordings. The scheme combines these distinct features in the feature space using both supervised and unsupervised feature selection processes. Maximum Relevance Minimum Redundancy (mRMR) is applied to re-order the combined features for maximum relevance to the emotion labels and minimum redundancy among the features. The resulting features are further reduced with Principal Component Analysis (PCA) to extract the principal components. Experimental results show that the proposed work outperforms state-of-the-art methods under the same settings on the publicly available Database for Emotional Analysis using Physiological Signals (DEAP).

Secondly, a disentangled adaptive-noise-learning β-variational autoencoder (β-VAE) combined with a long short-term memory (LSTM) model is proposed for emotion recognition from EEG recordings, again evaluated on the public DEAP dataset.
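The hybrid feature-reduction idea described above (mRMR re-ordering followed by PCA) can be sketched as follows. This is a minimal illustration, not the thesis implementation: the data are synthetic, and a simple correlation score stands in for the mutual-information measure normally used by mRMR.

```python
import numpy as np

def mrmr_order(X, y, k):
    """Greedily re-order features by max-relevance to the label and
    min-redundancy with already-selected features (MID criterion).
    |Pearson correlation| is used here as a stand-in for mutual information."""
    n_feat = X.shape[1]
    relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_feat)])
    selected = [int(np.argmax(relevance))]
    remaining = [j for j in range(n_feat) if j != selected[0]]
    while remaining and len(selected) < k:
        scores = []
        for j in remaining:
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                  for s in selected])
            scores.append(relevance[j] - redundancy)  # relevance minus redundancy
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

def pca_reduce(X, n_components):
    """Project the data onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Toy stand-in data: 100 EEG epochs x 14 extracted features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 14))
y = (X[:, 0] + 0.1 * rng.normal(size=100) > 0).astype(float)  # label tied to feature 0

order = mrmr_order(X, y, k=8)        # mRMR re-ordering of the combined features
X_red = pca_reduce(X[:, order], 4)   # PCA on the re-ordered subset
print(X_red.shape)                   # (100, 4)
```

The greedy mRMR pass decides *which* features carry non-redundant label information; PCA then compresses the surviving subset into a small number of orthogonal components.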
First, the EEG time series are transformed into video-like EEG image data by applying the Azimuthal Equidistant Projection (AEP) to the original 3-D EEG-sensor coordinates, yielding 2-D projected electrode locations. The Clough-Tocher scheme is then applied to interpolate the scattered power measurements over the scalp and to estimate the values between the electrodes on a 32x32 mesh. After that, the β-VAE-LSTM algorithm is used to evaluate the accuracy of four-quadrant (arousal-valence) classification. A comparison between the β-VAE-LSTM model and other classic methods under the same experimental settings shows that the proposed model is effective.

Finally, a novel real-time emotion detection system based on EEG signals from a portable headband is presented and integrated into the interactive film 'RIOT'. First, the requirements of the interactive film were gathered and a protocol for data collection using a portable EEG sensor (Emotiv Epoc) was designed. Then, a portable EEG emotion database (PEED) was built from 10 participants, with emotion labels obtained through both self-reporting and video annotation tools. After that, various feature extraction, feature selection, validation and classification methods are explored to build a practical real-time detection system. In the end, the emotion detection system is trained, integrated into the interactive film for real-time operation, and fully evaluated. The experimental results demonstrate that the system achieves satisfactory emotion detection accuracy and real-time performance.
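The AEP-plus-interpolation step can be sketched with NumPy and SciPy's `CloughTocher2DInterpolator`. The electrode coordinates and power values below are random placeholders, not real montage data; only the projection and the 32x32 interpolation mesh follow the procedure described above.

```python
import numpy as np
from scipy.interpolate import CloughTocher2DInterpolator

def azimuthal_equidistant(xyz):
    """Project 3-D electrode coordinates onto a 2-D plane.
    Angular distance from the projection centre (the vertex of the head)
    is preserved, which is the defining property of AEP."""
    x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)
    elev = np.arcsin(z / r)          # elevation angle
    az = np.arctan2(y, x)            # azimuth angle
    rho = np.pi / 2 - elev           # angular distance from the vertex
    return np.stack([rho * np.cos(az), rho * np.sin(az)], axis=1)

# Hypothetical 32 electrode positions on the upper unit hemisphere
rng = np.random.default_rng(1)
xyz = rng.normal(size=(32, 3))
xyz[:, 2] = np.abs(xyz[:, 2])
xyz /= np.linalg.norm(xyz, axis=1, keepdims=True)
locs2d = azimuthal_equidistant(xyz)

# One scalar power measurement per electrode (placeholder values)
power = rng.random(32)
interp = CloughTocher2DInterpolator(locs2d, power, fill_value=0.0)

# Interpolate the scattered measurements over a 32x32 mesh
gx, gy = np.meshgrid(np.linspace(locs2d[:, 0].min(), locs2d[:, 0].max(), 32),
                     np.linspace(locs2d[:, 1].min(), locs2d[:, 1].max(), 32))
image = interp(gx, gy)
print(image.shape)                   # (32, 32)
```

Stacking one such 32x32 image per time window and per frequency band yields the video-like tensor that a convolutional or recurrent model such as the β-VAE-LSTM can consume.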
Supervisor: Meng, H.; Nandi, A. Sponsor: Not available
Qualification Name: Thesis (Ph.D.) Qualification Level: Doctoral
EThOS ID:  DOI: Not available
Keywords: Emotion recognition