Title: Automated segmentation of structures essential to cell movement
Author: Fernyhough, Emma Nicole
ISNI: 0000 0004 5918 5177
Awarding Body: University of Leeds
Current Institution: University of Leeds
Date of Award: 2016
The study of cells is not only a key field in modern science but has been an important area of study for hundreds of years; despite this, much remains unknown. As technology has progressed, so has our ability to photograph and film cells, yet much of the processing of these images is still carried out by hand. This is not only difficult and time-consuming but also subjective and error-prone, and often not exactly reproducible. We wish to automate the segmentation of cells in a quantitative and reproducible way, in order to provide biologists with the data they require to learn more about cells and their movement.

Crawling cells, such as those studied in this research, often need to move around the host body (human or other mammal) to assist with growth, prevent disease, or repair damage. To do this they employ structures which protrude from the cell body to aid their motility: very fine hair-like features (filopodia) to probe their surroundings, penetrate other cells, and determine direction, and thin, flat membranes (lamellipodia) which adhere at both the front and rear of the cell to pull and push it in the direction of movement. These features are often extremely difficult to see by eye, making automated segmentation an awkward task. It requires not only the information in individual video frames but also information gained over time, such as movement between frames.

We first pre-process the images using an automated technique that corrects for lighting variations in the footage. Our method is efficient and reliable, and works equally well on cells of different sizes and shapes and on frames with differing degrees of background coverage, from one or two small cells per frame to frames in which cells cover the majority of the image.
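The thesis abstract does not specify the shading-correction algorithm. As a hedged illustration only (a common flat-field approach, not necessarily the author's method), one can estimate the slowly varying illumination field by heavy smoothing and divide it out of each frame; all function names below are hypothetical:

```python
import numpy as np

def box_blur(img, r):
    """Separable box filter of radius r via padded cumulative sums."""
    out = img.astype(float)
    for axis in (0, 1):
        pad = [(0, 0), (0, 0)]
        pad[axis] = (r + 1, r)          # edge padding keeps borders sensible
        p = np.pad(out, pad, mode="edge")
        c = np.cumsum(p, axis=axis)
        win = 2 * r + 1
        if axis == 0:
            out = (c[win:, :] - c[:-win, :]) / win
        else:
            out = (c[:, win:] - c[:, :-win]) / win
    return out

def estimate_background(frame, radius=15, passes=2):
    """Approximate the illumination field with repeated box blurs
    (iterated box blurs approximate a wide Gaussian)."""
    bg = frame.astype(float)
    for _ in range(passes):
        bg = box_blur(bg, radius)
    return bg

def correct_shading(frame, radius=15, passes=2, eps=1e-6):
    """Divide by the estimated illumination field, then rescale so the
    frame's mean intensity is preserved."""
    bg = estimate_background(frame, radius, passes)
    return frame / (bg + eps) * bg.mean()
```

The blur radius must be large relative to the cells but small relative to the illumination gradient; choosing it per-microscope is the kind of parameter the thesis's automated method presumably avoids.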
This shading correction method was also tested on non-cellular images taken with the same kind of microscopy, showing that it is suitable for such images in general rather than only those studied in this work. The pre-processing allows a simple segmentation of the main cell bodies, which on its own is sufficient for cells without thin protruding structures.

Using the cell bodies obtained from the pre-processing stage, we then find the thinner membranes attached to each cell. Although fully automated, this method was more accurate on two of our three sets of videos than the most popular segmentation program, even with that program's parameters set manually for each video. We improved upon this initial segmentation by incorporating the movement of the cell over time, using an iterative technique that compares the outcomes of sequential frames; the resulting segmentation outperformed the manually parametrised program on every video.

Finally, we locate the hair-like extensions, again using information from the pre-processing stage. Because filopodia are so difficult to detect by eye, we use the movement information to create candidate regions where they are believed to lie. Although filopodia are usually not straight, we build up small line segments within the candidate regions to reconstruct the features and determine their direction. This allows us to identify all regions containing filopodia and to separate them in order to extract the required information: their number, their lengths, the kinds of clusters in which they grow, and their location relative to the direction of movement. We have found no other method able to detect these structures or to segment them separately from the cell.
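The abstract does not detail how the line segments are built or measured. As a hedged sketch (hypothetical helper names, not the thesis's algorithm), once candidate filopodia pixels are in a binary mask, one can label 8-connected components and take each component's principal axis to estimate its length and orientation:

```python
import numpy as np
from collections import deque

NEIGHBOURS = ((1, 0), (-1, 0), (0, 1), (0, -1),
              (1, 1), (1, -1), (-1, 1), (-1, -1))  # 8-connectivity

def label_components(mask):
    """BFS connected-component labelling of a boolean mask."""
    labels = np.zeros(mask.shape, dtype=int)
    n = 0
    for i, j in zip(*np.nonzero(mask)):
        if labels[i, j]:
            continue
        n += 1
        labels[i, j] = n
        q = deque([(i, j)])
        while q:
            y, x = q.popleft()
            for dy, dx in NEIGHBOURS:
                v, u = y + dy, x + dx
                if (0 <= v < mask.shape[0] and 0 <= u < mask.shape[1]
                        and mask[v, u] and not labels[v, u]):
                    labels[v, u] = n
                    q.append((v, u))
    return labels, n

def measure_filopodia(mask):
    """Report pixel count, approximate length and orientation for each
    candidate region, using the principal axis of its pixel cloud."""
    labels, n = label_components(mask)
    results = []
    for k in range(1, n + 1):
        ys, xs = np.nonzero(labels == k)
        pts = np.column_stack([ys, xs]).astype(float)
        pts -= pts.mean(axis=0)
        # principal axis from the 2x2 covariance of (y, x) coordinates
        cov = pts.T @ pts / len(pts)
        w, v = np.linalg.eigh(cov)
        axis = v[:, np.argmax(w)]           # (dy, dx) of the major axis
        proj = pts @ axis
        results.append({
            "pixels": len(pts),
            "length": float(proj.max() - proj.min()) + 1.0,
            # angle from the image x-axis, folded into [0, 180)
            "angle_deg": float(np.degrees(np.arctan2(axis[0], axis[1]))) % 180.0,
        })
    return results
```

A principal-axis fit assumes roughly straight segments; since filopodia are usually curved, the thesis's approach of assembling many short line segments per region would recover length and direction more faithfully than this single-axis summary.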
Supervisor: Bulpitt, Andy ; Sponsor: Medical Research Council
Qualification Name: Thesis (Ph.D.) ; Qualification Level: Doctoral
EThOS ID:  DOI: Not available