Title: Video event analysis using activity primitives
Author: Garg, Aparna
ISNI: 0000 0004 7657 377X
Awarding Body: University of Manchester
Current Institution: University of Manchester
Date of Award: 2014
The environments in which videos are recorded are continually changing, which makes recognising visual events a challenging task. Moreover, the set of visual events to be recognised also changes, since events of interest are not always known a priori. This thesis presents an adaptive framework for visual event recognition which can adjust to a changing environment and is extensible to new events. Two key notions are introduced to achieve this: a novel 'adapt-and-abstract' approach that extracts primitive activities from motion trajectories and adapts to the changing environment, and the idea that generic activity primitives can be used to construct all complex activities across different locations.
The first notion is that models for the conceptual abstraction of visual events from low-level representations can adapt to environmental change when they are based on approximate low-level features and contextual adaptation, in a way similar to the strategy human beings adopt when inferring visual events. The 'adapt-and-abstract' approach adapts the low-level features it examines according to the context; statistical features are used to obtain the adaptive measures. The approach then abstracts from the adapted features, removing the hard coding between low-level features and high-level visual concepts. This allows the event analysis framework to be deployed at different locations without retraining.
The second main concept is that all complex activities can be composed from primitive activities, provided the primitives are generic. Logical rules are used to construct complex events from primitive activities and their spatio-temporal attributes. It is shown that rules for new events can easily be added, and existing rules modified, when the environment changes.
The framework presented in this thesis combines the ability of the primitive activities to adapt to new locations with the ability of a rule-based approach to introduce new events. This is demonstrated by evaluating the framework on three different datasets. Such a framework is especially useful for complex and infrequent events. A number of methods for visual event analysis are also introduced in this work.
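To make the rule-based composition concrete, the following is a minimal illustrative sketch, not the thesis implementation: primitive activities are represented as labelled, timestamped detections, and a hypothetical complex event (here a "drop-off") is defined by a logical rule over the temporal ordering and spatial proximity of the primitives. All labels, thresholds, and the rule itself are assumptions for illustration only.

```python
# Illustrative sketch of composing a complex event from activity primitives
# using a logical rule over spatio-temporal attributes. The primitive labels
# ("stop", "exit", "leave"), the rule, and the thresholds are hypothetical.
from dataclasses import dataclass


@dataclass
class Primitive:
    label: str    # primitive activity type, e.g. "stop", "exit", "leave"
    start: float  # start time in seconds
    end: float    # end time in seconds
    x: float      # spatial location of the activity
    y: float


def follows(a: Primitive, b: Primitive, max_gap: float) -> bool:
    """True if primitive b begins within max_gap seconds after a ends."""
    return 0 <= b.start - a.end <= max_gap


def near(a: Primitive, b: Primitive, max_dist: float) -> bool:
    """True if two primitives occur within max_dist spatial units."""
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5 <= max_dist


def detect_drop_off(primitives, max_gap=10.0, max_dist=5.0):
    """Hypothetical rule: a vehicle 'stop', followed nearby by a person
    'exit', followed by the vehicle 'leave', forms one drop-off event."""
    stops = [p for p in primitives if p.label == "stop"]
    exits = [p for p in primitives if p.label == "exit"]
    leaves = [p for p in primitives if p.label == "leave"]
    events = []
    for s in stops:
        for e in exits:
            if follows(s, e, max_gap) and near(s, e, max_dist):
                for v in leaves:
                    if follows(e, v, max_gap) and near(s, v, max_dist):
                        events.append((s, e, v))
    return events
```

Because the rule is declarative, adding a new complex event or adjusting an existing one to a new environment amounts to writing or editing one such function (or its thresholds), without retraining the primitive extraction stage.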
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:
DOI: Not available
Keywords: video event parser ; video event analysis ; activity primitives