Title: Intraoperative imaging and tool tracking for robotic surgery
Author: Zhang, Lin
ISNI:       0000 0004 7657 9119
Awarding Body: Imperial College London
Current Institution: Imperial College London
Date of Award: 2018
Minimally invasive laparoscopic interventions, despite being widely used across the disciplines of surgical oncology, are restricted by several factors, including limited contact sensing and haptic feedback, a loss of stereoscopic depth perception, indirect camera navigation and reduced dexterity. Robotic systems developed in recent years have aimed to address these limitations by adding force-sensing capabilities to the instruments, providing a stereo camera with a robotic holder, and increasing the degrees of freedom of the instruments. However, most of these systems only allow teleoperation, in which the robot directly duplicates the surgeon's motions without any level of autonomy. Despite their increasing application, many robotic systems have been designed without taking full advantage of new imaging techniques that can assist surgical diagnosis and planning via multi-scale anatomical and pathological information. Existing studies on the integration of imaging techniques into intraoperative guidance have demonstrated promising outcomes in terms of improved surgical precision, reduced tumour margins, better nerve-sparing results and more consistent treatment.

This research proposes a framework for the automated scanning of various imaging modalities, including endomicroscopy, optical coherence tomography and ultrasound. The scanning results are fused with the endoscopic view to provide a clear and intuitive visualisation for the surgeon, aimed at assisting surgical planning. To facilitate the automation of the scanning task via visual servoing, real-time vision-based tracking methods for surgical instruments have been developed. Furthermore, dynamic aspects of the surgical environment, such as tissue motion, are handled within the autonomous scanning framework by estimating the motion characteristics and adapting the robot's trajectory during the scanning task.
The results provide valuable insights into improving surgical diagnosis and navigation for image-guided robotic surgery, and motivate the development of autonomous systems that can perform surgical tasks while reducing the surgeon's cognitive workload.
Supervisor: Yang, Guang-Zhong
Sponsor: China Scholarship Council ; Imperial College London
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral