Title: Human action recognition from relative motion
Author: Oshin, Olusegun Temitope
ISNI: 0000 0004 2718 062X
Awarding Body: University of Surrey
Current Institution: University of Surrey
Date of Award: 2011
The aim of this thesis is to develop discriminative and efficient representations of human actions in video for recognition. Human actions can be defined as sets of atomic events occurring in local and global regions of the video. Natural actions exhibit considerable variation in their execution and capture conditions, so representations that are robust to these variations are desirable. A visual description of an action is often given in terms of the motion of body parts, and/or of objects in the scene with which those parts interact. This thesis therefore presents approaches to the recognition of actions based solely on motion observed in video. Explicit appearance information is discarded, since the appearance of subjects varies significantly, especially in uncontrolled environments, while motion cues remain consistent. Moreover, psychology experiments using Point Light Displays have shown that detailed properties of human actions and actors can be perceived from the dynamics of body movement alone, which further motivates the presented approaches.
Motion in video can be summarised using highly informative spatio-temporal interest points; however, selecting interesting motion regions can be computationally expensive. A novel interest point detector is therefore introduced, providing a generic and efficient solution. Interest point detection is formulated as a classification problem: given examples of points found by an existing detector, the approach learns to emulate that detector. Simple yet effective binary tests are employed in a naive Bayesian classifier, Randomised Ferns, to categorise local regions as motion or non-motion regions. Results show detections comparable to those of the emulated detectors, achieved in constant time and independently of the complexity of the original detectors.
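The fern-based motion/non-motion classification described above can be sketched as follows. This is a minimal illustration under assumed details (patch size, number of ferns and tests, voxel-comparison tests), not the thesis implementation: each fern applies a small group of random binary intensity comparisons to a spatio-temporal patch, and the resulting leaf indices vote via class-conditional counts in naive-Bayes fashion.

```python
import numpy as np

class RandomisedFerns:
    """Naive-Bayes-style classifier built from groups of binary tests (ferns).

    Classifies spatio-temporal patches as motion (1) or non-motion (0).
    Patch shape, test type, and all sizes here are illustrative assumptions.
    """

    def __init__(self, n_ferns=10, n_tests=4, patch_shape=(9, 9, 5), seed=0):
        rng = np.random.default_rng(seed)
        self.n_ferns, self.n_tests = n_ferns, n_tests
        n_vox = int(np.prod(patch_shape))
        # Each binary test compares two random voxel intensities in the patch.
        self.pairs = rng.integers(0, n_vox, size=(n_ferns, n_tests, 2))
        # Laplace-smoothed class-conditional counts: [fern, leaf, class].
        self.counts = np.ones((n_ferns, 2 ** n_tests, 2))

    def _leaves(self, patch):
        # Evaluate all tests at once; pack bits into one leaf index per fern.
        v = patch.ravel()
        bits = (v[self.pairs[..., 0]] > v[self.pairs[..., 1]]).astype(int)
        return bits @ (2 ** np.arange(self.n_tests))

    def fit(self, patches, labels):
        for patch, y in zip(patches, labels):
            self.counts[np.arange(self.n_ferns), self._leaves(patch), y] += 1

    def predict(self, patch):
        # p(leaf | class) per fern; sum log-likelihoods across ferns.
        p = self.counts / self.counts.sum(axis=1, keepdims=True)
        logp = np.log(p[np.arange(self.n_ferns), self._leaves(patch)]).sum(axis=0)
        return int(np.argmax(logp))  # 0 = non-motion, 1 = motion
```

Because evaluating a fixed number of binary tests is independent of how the training examples were produced, classification cost stays constant regardless of the complexity of the emulated detector, which is the efficiency argument made above.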
The spatial and temporal distribution of interest points induced by actions provides discriminative information for action description. For simulated actions performed in simplified settings, characteristic events of an action can be deduced from the global distribution of these points. The Randomised Ferns classifier is further extended to encode these distributions: the global distribution of interest points indicates the presence or absence of motion at various regions within the overall action region, and therefore provides discriminative information for action description. Natural actions, however, place minimal constraints on execution and scene setup, and in such settings simply encoding global motion events fails. A Relative Motion Descriptor is therefore introduced, which encodes characteristic low-level motion information and thereby captures detailed properties of action and scene dynamics. The descriptor is computed at local regions across the video and encodes atomic motion events via the relative distribution of interest point response strengths. The resulting descriptor can be used in conjunction with state-of-the-art classifiers; recognition results are reported using SVM classifiers. Furthermore, an approach is presented for improving action classification, which assumes the presence of inherent modes in the observations; this is necessary because only loose constraints are placed on actions in natural settings. Automatic Outlier Detection and Mode Finding methods are introduced to determine these modes: a variant of the RANSAC algorithm is employed, with a novel adaptation based on a Boosting-inspired iterative reweighting scheme. These methods simplify the classification boundaries between actions and improve recognition performance.
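The idea of encoding the relative distribution of interest point response strengths can be sketched as follows. This is a hypothetical simplification, not the descriptor defined in the thesis: a local spatio-temporal region of a detector response volume is divided into sub-cells, and pairwise comparisons of summed cell strengths yield a binary code that depends only on relative, not absolute, motion strength.

```python
import numpy as np

def relative_motion_descriptor(response, cell_grid=(3, 3, 2)):
    """Illustrative sketch of a relative-motion code (assumed details).

    `response` is a (height, width, time) volume of interest-point
    detector responses for one local region; `cell_grid` is an assumed
    sub-division of that region into sub-cells.
    """
    gy, gx, gt = cell_grid
    h, w, t = response.shape
    # Sum the detector response inside each spatio-temporal sub-cell.
    cells = []
    for iy in range(gy):
        for ix in range(gx):
            for it in range(gt):
                cells.append(response[iy * h // gy:(iy + 1) * h // gy,
                                      ix * w // gx:(ix + 1) * w // gx,
                                      it * t // gt:(it + 1) * t // gt].sum())
    cells = np.asarray(cells)
    # Pairwise comparisons of cell strengths give a binary code that is
    # invariant to the absolute magnitude of the responses: only the
    # relative distribution of motion strength is kept.
    i, j = np.triu_indices(len(cells), k=1)
    return (cells[i] > cells[j]).astype(np.uint8)
```

Such a code can be accumulated over local regions across the video and fed to a standard classifier such as an SVM; the invariance to absolute response magnitude is one way a descriptor of this kind can tolerate variation in capture conditions.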
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:
DOI: Not available