Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.797535
Title: Analysis of visual search for knowledge gathering
Author: Dempere-Marco, Laura
Awarding Body: Imperial College London
Current Institution: Imperial College London
Date of Award: 2004
Abstract:
The quest to understand how we look has fascinated both the art and research communities for centuries. The human eye does not have a uniform visual response: the best visual acuity is confined to a visual angle of one to two degrees at the fovea. To obtain a complete representation of the visual world, it is therefore necessary to move the eyes to scan the scene. The rapid saccadic eye movements and individual fixations form a visual search scanpath, which can be influenced by a number of factors, including the observer's knowledge of, interest in, and expectations about the scene. Although the visual search patterns of different observers studying the same scene share some common characteristics, the idiosyncrasies of individual observers present both opportunities and challenges for unveiling the underlying cognitive processes involved in specific visual tasks.

The aim of this research is to study the spatio-temporal characteristics of visual search, together with the intrinsic visual features at the fixation points, for domain knowledge representation and decision support in medical imaging. The use of visual search for image feature learning and decision support is a new concept, driven by the need for a general framework for knowledge gathering in image understanding. The work addresses an inherent drawback of traditional approaches, in which explicit domain knowledge representation often overlooks factors that are applied subconsciously during visual recognition.

A novel framework termed Visual Tracking for Active Learning (ViTAL) has been developed for analysing the dynamics of eye movements. The basic characteristics of visual search patterns in both the spatial and feature spaces are analysed, and a new technique based on the Earth Mover's Distance (EMD) metric is proposed for assessing the idiosyncrasy of different scanpath patterns. To enable the integration of the visual search behaviour of different observers across different patient data sets, a standardised anatomical representation is introduced through the use of free-form image registration. The study shows that, through the effective use of feature space representation, it is possible to untangle what appear to be uncorrelated scanpath patterns and reveal common visual search behaviours. The introduction of transient fixation moments in the feature space also provides a way of separating skilled visual search tasks into distinct episodes, as explained by the global-focal model. To avoid the explicit use of feature extractors in the ViTAL framework, a feature embedding method incorporating Gabor filter banks is proposed. This allows a mapping from the image space into a low-dimensional feature space that preserves the intrinsic similarity of image patterns, and provides a systematic way of defining perceptually meaningful spaces based on the visual similarity between foveated patterns.

The proposed methods have been validated through both laboratory experiments involving normal volunteers and clinical studies on the assessment of diffuse lung diseases with high-resolution computed tomography. The main contribution of the thesis lies in the development and detailed analysis of the different stages of the ViTAL framework, particularly in the design of the feature embedding framework, which is both biologically inspired and practically feasible. The value of the work for decision support in medical image understanding is discussed and validated.
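As an illustration of the scanpath comparison described in the abstract, the following sketch computes the Earth Mover's Distance between two sets of fixations by solving the underlying transportation problem with SciPy. The function name scanpath_emd, the Euclidean ground distance, and the use of fixation durations as optional weights are assumptions made for this example, not details taken from the thesis.

import numpy as np
from scipy.optimize import linprog
from scipy.spatial.distance import cdist

def scanpath_emd(fix_a, fix_b, w_a=None, w_b=None):
    # Illustrative sketch only: EMD between two sets of fixation points,
    # e.g. image coordinates or points in a feature space.
    fix_a, fix_b = np.asarray(fix_a, float), np.asarray(fix_b, float)
    n, m = len(fix_a), len(fix_b)
    w_a = np.full(n, 1.0 / n) if w_a is None else np.asarray(w_a, float) / np.sum(w_a)
    w_b = np.full(m, 1.0 / m) if w_b is None else np.asarray(w_b, float) / np.sum(w_b)

    # Ground distance between every pair of fixations (Euclidean, by assumption).
    cost = cdist(fix_a, fix_b)

    # Transportation problem: move the mass w_a onto w_b at minimum total cost.
    # Flow variables f[i, j] are flattened row-major into one vector.
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0   # mass leaving fixation i of scanpath A
    for j in range(m):
        A_eq[n + j, j::m] = 1.0            # mass arriving at fixation j of scanpath B
    b_eq = np.concatenate([w_a, w_b])

    res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun                         # weights sum to one, so total cost equals the EMD

# Example: two short scanpaths given as (x, y) fixation coordinates.
a = [(120, 80), (200, 150), (310, 140)]
b = [(130, 90), (205, 160), (300, 135), (400, 300)]
print(scanpath_emd(a, b))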
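The feature embedding idea can likewise be sketched as mapping each foveated image patch onto the response energies of a Gabor filter bank. The kernel parameterisation, the bank size and the mean-absolute-response statistic below are illustrative choices rather than the parameters used in the thesis, and a further dimensionality reduction step (e.g. PCA) would give the low-dimensional embedding the abstract refers to.

import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(freq, theta, sigma, size=31):
    # Real-valued Gabor kernel: Gaussian envelope times a cosine carrier
    # oscillating at spatial frequency `freq` along orientation `theta`.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    return envelope * np.cos(2.0 * np.pi * freq * x_t)

def gabor_bank(freqs=(0.1, 0.2, 0.3), n_orient=6, sigma=4.0):
    # A small bank spanning several scales and orientations (illustrative values).
    thetas = np.arange(n_orient) * np.pi / n_orient
    return [gabor_kernel(f, t, sigma) for f in freqs for t in thetas]

def patch_descriptor(patch, bank):
    # Map an image patch (e.g. sampled around a fixation point) to a vector of
    # Gabor response energies; visually similar patterns yield nearby descriptors.
    patch = np.asarray(patch, float)
    patch = (patch - patch.mean()) / (patch.std() + 1e-8)   # local normalisation
    return np.array([np.mean(np.abs(fftconvolve(patch, k, mode="same")))
                     for k in bank])

# Example: describe a random 64x64 patch with an 18-dimensional feature vector.
bank = gabor_bank()
print(patch_descriptor(np.random.rand(64, 64), bank).shape)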
Supervisor: Yang, Guang-Zhong ; Ellis, Stephen ; Hansell, David
Sponsor: Engineering and Physical Sciences Research Council
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.797535
DOI: Not available