Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.777924
Title: Inferring intentions from natural eye-movements for biomimetic control of prosthetics and wheelchairs
Author: Abbott, William Welby
ISNI: 0000 0004 7963 6887
Awarding Body: Imperial College London
Current Institution: Imperial College London
Date of Award: 2017
Availability of Full Text: Access from EThOS; Access from Institution
Abstract:
The link between perception and action is a fundamental question in neuroscience. To see the world we must look, so eye movements provide a window into the brain. They reflect our intentions: they are the only directly observable behavioural signal that is highly correlated with actions at the task level and anticipates body movements. Despite this, eye tracking is not widely used as a control interface for movement-impaired patients, owing to poor signal interpretation and a lack of control flexibility. It is therefore proposed that tracking gaze position in 3D, rather than in conventional 2D, provides a considerably richer signal directly relevant to prosthetic control. An ultra-low-cost 3D gaze tracker is developed, capable of information transfer rates of up to 43 bits/sec, well beyond current invasive and non-invasive brain-machine interfaces. Further, the eyes make the fastest movements in the body (faster than a heartbeat), reflecting the pace of information retrieval required for the brain to coordinate and orchestrate the complex actions of everyday life. Eye movements are thus naturally suited to the real-time, closed-loop control of actuators. This is demonstrated in a large field study involving over 2000 participants playing the arcade game "Pong". While this is a trivial task in comparison to prosthetic control, unlike many brain-machine interface applications it requires continuous closed-loop control, and Pong is proposed as a universally accessible benchmark for comparing the performance of interfaces. To achieve this, a data-logging Pong game is developed, together with a suite of algorithms for analysing performance and behavioural strategies; this is then used to benchmark the developed gaze input method. With just 30 seconds of set-up and 5 seconds of training time, the majority of subjects were able to play successfully, and in some cases win, despite never having used their eyes as a control input before. The developed 3D gaze interfaces also allowed this potential to be taken beyond the computer screen, to the control of robotic arms and wheelchairs. Interfaces were designed to minimise the impact on the natural task of the eyes: to sample information from the world. Indeed, the aim is to base control strategies on these natural eye movements, because they implicitly contain intention information. This has been shown in a large body of literature; however, existing methods are either reductionist lab-based studies or require painstaking manual annotation of eye and body movements "in the wild". A new methodology is therefore developed to record a comprehensive database of human visuomotor behaviour and so inform the development of biomimetic prosthetics. The variance of behaviour is captured, rather than constrained, in natural daily tasks lasting hours rather than minutes. The dataset presented moves towards an embodied approach, building a database of human behaviour more complete, extensive and unconstrained than previously achieved. This leads to the new concepts of Embodied Saliency, predicting eye movements directly from body movements, and Embodied Gaze Descriptors, data-driven methods for taxonomising the complex interaction between eye and body movements in the wild. By capturing rather than constraining natural behaviour, it is hoped that future biomimetic prosthetics, principled on the full perception-action loop, can liberate patients from the constraints of disability.
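The abstract does not describe how the 3D gaze position is recovered; a common approach for a binocular tracker is to triangulate the vergence of the two eyes' gaze rays. Since the two rays rarely intersect exactly, the fixation point can be taken as the midpoint of the shortest segment between them. The Python sketch below is a minimal illustration of that geometry under those assumptions; the function name and inputs are illustrative, not taken from the thesis.

    import numpy as np

    def gaze_point_3d(origin_l, dir_l, origin_r, dir_r):
        """Midpoint of the shortest segment between the two gaze rays."""
        o_l = np.asarray(origin_l, dtype=float)
        o_r = np.asarray(origin_r, dtype=float)
        d_l = np.asarray(dir_l, dtype=float)
        d_l = d_l / np.linalg.norm(d_l)
        d_r = np.asarray(dir_r, dtype=float)
        d_r = d_r / np.linalg.norm(d_r)
        w0 = o_l - o_r
        b = d_l @ d_r                # cosine of the vergence angle
        d = d_l @ w0
        e = d_r @ w0
        denom = 1.0 - b * b          # ~0 when the rays are near-parallel
        if denom < 1e-9:
            return None              # no usable vergence signal
        s = (b * e - d) / denom      # closest-point parameter, left ray
        t = (e - b * d) / denom      # closest-point parameter, right ray
        return (o_l + s * d_l + o_r + t * d_r) / 2.0

    # Example: eyes 6 cm apart, both aimed at a point half a metre away.
    left_eye = np.array([-0.03, 0.0, 0.0])
    right_eye = np.array([0.03, 0.0, 0.0])
    target = np.array([0.10, 0.05, 0.50])
    print(gaze_point_3d(left_eye, target - left_eye,
                        right_eye, target - right_eye))  # ~= target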
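Similarly, the abstract reports up to 43 bits/sec without the calculation behind it. For pointing-style interfaces, information transfer is conventionally measured as Fitts-law throughput: the index of difficulty of a movement divided by the time the movement takes. The numbers below are purely illustrative, showing only that a saccade-driven pointer can plausibly reach this order of magnitude; they are not the thesis's actual measurements or method.

    import math

    def fitts_throughput(distance, width, movement_time):
        """Throughput in bits/s for one pointing movement, using the
        Shannon formulation of the index of difficulty."""
        index_of_difficulty = math.log2(distance / width + 1.0)
        return index_of_difficulty / movement_time

    # Illustrative only: a 20-degree gaze shift onto a 1-degree target,
    # completed by a ~100 ms saccade, gives ~44 bits/s, the same order
    # as the 43 bits/sec quoted in the abstract.
    print(fitts_throughput(distance=20.0, width=1.0, movement_time=0.1))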
Supervisor: Faisal, Aldo
Sponsor: Engineering and Physical Sciences Research Council
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.777924