Title: Biological and biomimetic machine learning for automatic classification of human gait
Author: Sarangi, Viswadeep
ISNI:       0000 0005 0287 7006
Awarding Body: University of York
Current Institution: University of York
Date of Award: 2020
Machine learning (ML) research has benefited from a deep understanding of the biological mechanisms that have evolved to perform comparable tasks. Recent successes of ML models in surpassing human performance on perceptual tasks have garnered interest in improving them further. However, the approach to improving ML models tends to be unstructured, particularly for models that aim to mimic biology. This thesis proposes and applies a bidirectional learning paradigm to streamline the process of improving ML models' performance on classification tasks at which humans are already adept. The approach is validated using human gait classification as the exemplar task. This paradigm has the additional benefit of allowing underlying mechanisms of human perception (HP) to be investigated through the ML models. Assessment of several biomimetic (BM) and non-biomimetic (NBM) machine learning models on an intrinsic feature of gait, namely the gender of the walker, establishes a functional overlap in the perception of gait between HP and BM, and selects the Long Short-Term Memory (LSTM) architecture as the BM of choice for this study over alternatives such as support vector machines, decision trees and multi-layer perceptrons. Psychophysics and computational experiments are conducted to understand the overlap between human and machine models. The BM and HP, as characterised by the psychophysics experiments, share qualitatively similar profiles of gender classification accuracy across varying stimulus exposure durations. They also share a preference for motion-based cues over structural cues (BM = HP > NBM). Further evaluation reveals a human-like expression of the inversion effect, a well-studied cognitive bias in HP that reduces gender classification accuracy to 37% (p < 0.05, chance at 50%) when the model is exposed to inverted stimuli. Its expression in the BM supports the argument for learned rather than hard-wired mechanisms in HP.
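The LSTM architecture named above processes a gait sequence frame by frame, carrying a cell state across time before a read-out makes the binary gender decision. The following is a minimal numpy sketch of that idea only; the parameter shapes, initialisation and read-out are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_classify(seq, params):
    """Forward an LSTM over a gait sequence of shape (T, D) and apply a
    logistic read-out for a binary decision (e.g. walker gender).
    Hypothetical sketch: params and shapes are assumptions."""
    W, U, b, w_out, b_out = params     # input weights, recurrent weights, biases, read-out
    H = U.shape[1]                     # hidden size
    h = np.zeros(H)                    # hidden state
    c = np.zeros(H)                    # cell state
    for x in seq:
        z = W @ x + U @ h + b          # all four gate pre-activations, shape (4H,)
        i, f, o = (sigmoid(z[k * H:(k + 1) * H]) for k in range(3))
        g = np.tanh(z[3 * H:])         # candidate cell update
        c = f * c + i * g              # forget old state, write new
        h = o * np.tanh(c)             # expose gated hidden state
    return sigmoid(w_out @ h + b_out)  # probability of one class

def init_params(D, H, rng):
    """Small random parameters for the sketch above."""
    return (rng.standard_normal((4 * H, D)) * 0.1,
            rng.standard_normal((4 * H, H)) * 0.1,
            np.zeros(4 * H),
            rng.standard_normal(H) * 0.1,
            0.0)
```

In practice such a model would be trained by gradient descent on labelled gait sequences; the sketch shows only the recurrent forward pass that gives the architecture its sensitivity to temporal (motion) cues.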
This is particularly evident given that the effect emerged in every BM, after training multiple randomly initialised BM models without any prior anthropomorphic expectations of gait. The above aspects of HP, namely the preference for motion cues over structural cues and the absence of prior anthropomorphic expectations, were selected to improve BM performance. Representing gait explicitly as the motion-based cues of a non-anthropomorphic, gender-neutral skeleton not only mitigates the inversion effect in the BM, but also significantly improves classification accuracy. For gender classification of upright stimuli, mean accuracy improved by 6%, from 76% to 82% (F(1,18) = 16, p < 0.05). For inverted stimuli, mean accuracy improved by 45%, from 37% to 82% (F(1,18) = 20, p < 0.05). The model was further tested on a more challenging, extrinsic-feature task: the classification of the emotional state of a walker. Emotions were visually induced in subjects through exposure to emotive or neutral images from the International Affective Picture System (IAPS) database. The classification accuracy of the BM was significantly above chance at 43% (p < 0.05, chance at 33.3%). Applying the proposed paradigm in further binary emotive-state classification experiments improved mean accuracy by a further 23%, from 43% to 65% (F(1,18) = 7.4, p < 0.05) on the positive vs. neutral task. These results validate the proposed paradigm of concurrent, bidirectional investigation of HP and BM for the classification of human gait, suggesting future applications in automating perceptual tasks for which the human brain and body have evolved.
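The key representational move described above is to discard structural cues (body shape, limb proportions, orientation) and keep only motion. One plausible way to sketch that preprocessing, assuming gait is captured as per-frame 3D joint positions, is to root-centre each frame, take frame-to-frame velocities, and scale-normalise. The joint layout and normalisation here are illustrative assumptions, not the thesis's exact pipeline.

```python
import numpy as np

def motion_cues(joints):
    """Turn a gait sequence of 3D joint positions, shape (T, J, 3), into
    motion-only features of shape (T-1, J*3). Hypothetical sketch of a
    structure-free, motion-based gait representation."""
    # Centre each frame on the root joint (assume joint 0 is the pelvis),
    # removing global position as a cue.
    centred = joints - joints[:, :1, :]
    # Keep only frame-to-frame displacement: the motion cue.
    velocity = np.diff(centred, axis=0)                    # (T-1, J, 3)
    # Scale-normalise so subject size (a structural cue) is removed.
    scale = np.linalg.norm(centred, axis=-1).mean() or 1.0
    return (velocity / scale).reshape(len(velocity), -1)   # (T-1, J*3)
```

Because the representation is built from centred velocities, a walker translated anywhere in the capture volume yields identical features, and absolute body scale no longer contributes: only how the skeleton moves survives.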
Supervisor: Pelah, Adar Sponsor: Not available
Qualification Name: Thesis (Ph.D.) Qualification Level: Doctoral
EThOS ID:  DOI: Not available