Use this URL to cite or link to this record in EThOS: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.558523
Title: Building computational atlases from databases of whole-body clinical PET/CT images
Author: Potesil, Vaclav
Awarding Body: University of Oxford
Current Institution: University of Oxford
Date of Award: 2011
Availability of Full Text:
Full text unavailable from EThOS. Please contact the current institution’s library for further details.
Abstract:
Medical imaging has revolutionized cancer care and its use has grown massively over the past several decades. Images are increasingly stored in large digital repositories such as hospital Picture Archiving and Communication Systems (PACS), which could provide a wealth of information on patient conditions and therapy outcomes as cancer diagnosis and therapy move from 'one size fits all' to more personalized approaches tailored to each particular patient. However, converting the unstructured avalanche of data at thousands of different hospitals into clinically valuable biomarkers and tools requires that the images of different patients can be compared and efficiently searched. Our research aims to develop novel methods for comparing whole-body scans of multiple patients; methods which incorporate 'intelligent' prior knowledge of the internal structure of the human body, as opposed to current image registration methods, which mostly rely on matching voxel intensities and disregard their anatomical meaning. We develop computational methods for accurate and reliable automated localization of anatomical structures in whole-body images, which will help to automate key steps in cancer diagnosis and radiation treatment planning, saving expensive clinicians' time while improving the reliability of their decisions. Conventional approaches to determining spatial correspondences between pairs or sets of medical images typically rely on image registration. There have been considerable advances in registering multiple images of the same patient taken at different time points, known as longitudinal studies. However, conventional methods, which rely on optimizing integral functions of voxel values over the entire image, are unreliable when applied to aligning whole-body images of different patients.
Whole-body Computed Tomography (CT) images contain many different anatomical structures whose physical attributes and consequent appearance can be highly variable between patients. This substantial, but normal, variability is further increased by the presence of pathologies such as tumours and non-cancerous diseases, surgical interventions, degenerative changes due to aging, and differing patterns of contrast agent uptake. Conventional registration methods often become trapped in the local minima that abound in such images, resulting in unreliable and inaccurate anatomical correspondences. The methods developed in this thesis tackle the problem of inter-patient registration by incorporating prior anatomical knowledge into parts-based graphical models that accurately and reliably localize arbitrary skeletal and soft-tissue anatomical landmarks in whole-body clinical oncology scans. We optimize parts-based graphical models called Pictorial Structures for accurate and reliable landmark localization in CT images and introduce novel methods that replace standard population models with models personalized to the particular patient. We also propose methods that further improve landmark localization while minimizing, as far as possible, the high cost of ground-truth annotation by expert radiologists; we do this by automatically discovering new landmark correspondences from a database of partially annotated images. The performance of the algorithms developed in this thesis is evaluated on a large database of clinical lung cancer PET/CT scans, showing superior accuracy and reliability of landmark localization compared to conventional methods.
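For readers unfamiliar with Pictorial Structures, the abstract's parts-based approach can be illustrated generically: each landmark (part) carries a unary appearance-mismatch cost over candidate locations, connected parts pay a quadratic deformation cost for deviating from an expected spatial offset, and the jointly optimal configuration over a tree-structured model is found exactly by min-sum dynamic programming. The sketch below is not from the thesis; it is a minimal illustration of that inference scheme, with all function and variable names (`fit_pictorial_structure`, `candidates`, `unary`, `offset`) chosen hypothetically.

```python
# Minimal min-sum (Viterbi-style) inference for a tree-structured
# Pictorial Structures model -- an illustrative sketch, not the
# thesis implementation.
import numpy as np

def fit_pictorial_structure(candidates, unary, parent, offset, w=1.0):
    """candidates[i]: (Ni, 2) candidate locations for part i
    unary[i]:      (Ni,) appearance-mismatch cost per candidate
    parent[i]:     index of part i's parent (root is part 0, parent -1)
    offset[i]:     expected displacement of part i from its parent
    w:             weight of the quadratic deformation penalty
    Returns the minimum-cost location for every part."""
    n = len(candidates)
    children = [[] for _ in range(n)]
    for i in range(1, n):
        children[parent[i]].append(i)
    cost = [u.astype(float).copy() for u in unary]
    best_child = [None] * n
    # Pre-order traversal (parents before children).
    order, stack = [], [0]
    while stack:
        node = stack.pop()
        order.append(node)
        stack.extend(children[node])
    # Bottom-up pass: fold each child's minimized cost into its parent.
    for i in reversed(order):
        if i == 0:
            continue
        p = parent[i]
        # Squared deviation of the actual offset from the expected one.
        diff = candidates[i][None, :, :] - candidates[p][:, None, :] - offset[i]
        pair = w * (diff ** 2).sum(-1)           # (Np, Ni) deformation cost
        total = pair + cost[i][None, :]
        cost[p] += total.min(axis=1)
        best_child[i] = total.argmin(axis=1)     # back-pointers per parent choice
    # Top-down backtrack from the best root candidate.
    pick = [0] * n
    pick[0] = int(np.argmin(cost[0]))
    for i in order:
        if i != 0:
            pick[i] = int(best_child[i][pick[parent[i]]])
    return [candidates[i][pick[i]] for i in range(n)]
```

Because the model is a tree, this two-pass scheme finds the global optimum in time linear in the number of parts (and quadratic in candidates per part), which is what makes the configuration search tractable even when each part has many candidate locations; the personalized models described in the abstract would correspond, in this sketch, to adapting the unary costs and offsets to the individual patient.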
Supervisor: Brady, Michael ; Kadir, Timor
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.558523
DOI: Not available