Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.530462
Title: Simultaneous localisation and mapping for minimally invasive surgery
Author: Mountney, Peter Edward
Awarding Body: Imperial College London
Current Institution: Imperial College London
Date of Award: 2011
Abstract:
In recent years, Minimally Invasive Surgery (MIS) has transformed the general practice of surgery. Its benefits are well documented and include reduced trauma, hospitalisation and comorbidity, leading to faster recovery. Despite these benefits, current instrument design and visualisation make MIS challenging. The clinical benefits of Image Guided Intervention (IGI) are well established for procedures such as neurosurgery, where tissue motion is manageable. IGI provides visualisation below the tissue surface, allowing the surgeon to avoid critical structures and identify target anatomy. In minimally invasive cardiac, gastrointestinal, or abdominal surgery, however, significant tissue deformation prevents accurate registration of pre- and intra-operative data. In this thesis, computer vision and machine learning techniques are explored to estimate 3D tissue deformation for improved intra-operative navigation and visualisation. The main focus of the thesis is modelling 3D tissue deformation from a mobile intra-operative imaging device. Two methods are proposed to improve region tracking using machine learning techniques. The first is based on tracking-by-detection: a set of region descriptors that are robust to deformation is systematically selected and fused in a probabilistic framework, combining multiple cues to improve tracking robustness. The second is a context-specific technique that learns the information which best distinguishes a region from its surroundings; this information is adaptively updated online to maintain a representation that remains robust to deformation. 3D tissue models are built sequentially from a moving imaging device using Simultaneous Localisation And Mapping (SLAM). To this end, an optical biopsy mapping system based on SLAM is proposed. The system registers multi-modal, intra-operative images to a common coordinate space, and the resulting Augmented Reality (AR) visualisation aids biopsy site retargeting and navigation. A second SLAM-based system is proposed for dynamic view expansion: the localised camera position is used to augment a photorealistic tissue model onto the laparoscopic video, expanding the camera's field of view to aid navigation and reduce disorientation. Significantly, this thesis proposes a re-formulation of the static SLAM problem, termed Motion Compensated SLAM (MC-SLAM), which is capable of accurate localisation and dynamic mapping in periodically deforming environments. The work is validated using simulated, phantom, ex vivo and in vivo data. Finally, future research directions and potential improvements to the techniques presented in this thesis are outlined.
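As a purely illustrative aside (not taken from the thesis), the sketch below shows one generic way to fuse multiple region-descriptor cues into a single match score under a naive independence assumption, the kind of probabilistic multi-cue fusion the abstract refers to for tracking-by-detection. The descriptors used here (an intensity histogram and simple intensity moments), the Gaussian likelihood model and all parameter values are hypothetical choices for the example, not the descriptors selected in the thesis.

    # Minimal sketch (illustrative only, not the thesis's tracker): fusing two
    # simple region descriptors into one match score by multiplying per-cue
    # likelihoods under a naive independence assumption.
    import numpy as np

    def intensity_histogram(patch, bins=8):
        # Normalised grey-level histogram of a patch with values in [0, 1].
        hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
        return hist / max(hist.sum(), 1)

    def intensity_moments(patch):
        # Mean and standard deviation of the patch intensities.
        return np.array([patch.mean(), patch.std()])

    def cue_likelihood(desc_ref, desc_cand, sigma=0.1):
        # Map the Euclidean distance between descriptors to a Gaussian likelihood.
        d = np.linalg.norm(desc_ref - desc_cand)
        return float(np.exp(-0.5 * (d / sigma) ** 2))

    def fused_score(ref_patch, cand_patch):
        # Multiply the per-cue likelihoods (naive Bayes style fusion).
        score = 1.0
        for cue in (intensity_histogram, intensity_moments):
            score *= cue_likelihood(cue(ref_patch), cue(cand_patch))
        return score

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        ref = np.zeros((32, 32))
        ref[8:24, 8:24] = 1.0                    # reference region: a bright square
        deformed = np.clip(ref + 0.02 * rng.standard_normal((32, 32)), 0.0, 1.0)
        unrelated = rng.random((32, 32))         # a patch of unrelated texture
        # The slightly deformed region should score far higher than the unrelated one.
        print(fused_score(ref, deformed), fused_score(ref, unrelated))

In a tracker of this style, fused_score would be evaluated over candidate regions around the previous location and the highest-scoring candidate taken as the new position; the multiplicative fusion means a region must be supported by every cue to score well.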
Supervisor: Yang, Guang-Zhong; Davison, Andrew
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.530462
DOI: Not available