Gait Analysis and Recognition for Automated Visual Surveillance
Bouchrika, Imed
(2008)
Gait Analysis and Recognition for Automated Visual Surveillance.
University of Southampton, Department of Electronics and Computer Science, Doctoral Thesis.
Record type: Thesis (Doctoral)
Abstract
Human motion analysis has received great attention from researchers in the last decade due to its potential use in applications such as automated visual surveillance. This field of research focuses on the perception and recognition of human activities, including people identification. We explore a new approach for detecting walking pedestrians in an unconstrained outdoor environment. The proposed algorithm is based on gait motion, as the rhythm of the footprint pattern of walking people is considered a stable and characteristic feature for the classification of moving objects. The novelty of our approach is motivated by recent research on people identification using gait. The experimental results confirmed the robustness of our method in discriminating between single walking subjects, groups of people and vehicles, with a successful detection rate of 100%. The results also revealed the potential of our method to extend visual surveillance systems to recognise walking people.

Furthermore, we propose a new approach to extract human joints (vertex positions) using a model-based method. The spatial templates describing human gait motion are produced via gait analysis performed on data collected by manual labelling. Elliptic Fourier Descriptors are used to represent the motion models in a parametric form, and heel strike data are exploited to reduce the dimensionality of the parametric models. Subjects walk normal to the viewing plane, as most gait information is available in the sagittal view. The ankle, knee and hip joints are extracted with high accuracy for both indoor and outdoor data. In this way, we have established a baseline analysis which can be deployed in recognition, marker-less analysis and other areas. The experimental results confirmed the robustness of the model-based approach in recognising walking subjects, with a correct classification rate of 95% using purely the dynamic features derived from the joint motion. This confirms early psychological theories claiming that the discriminative features for motion perception and people recognition are embedded in gait kinematics.

Finally, to quantify the intrusive nature of gait recognition, we explore the effects of different covariate factors on the performance of gait recognition, including footwear, clothing, carrying conditions and walking speed. As far as the author can determine, this is the first major study of its kind in this field to analyse covariate factors using a model-based method.
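For illustration, below is a minimal NumPy sketch of Elliptic Fourier Descriptors in the standard formulation of Kuhl and Giardina (1982), the technique the abstract names for representing cyclic motion models in parametric form. This is a generic sketch of the technique, not the thesis's implementation; the function name, the order parameter and the test contour are illustrative assumptions.

    import numpy as np

    def elliptic_fourier_descriptors(contour, order=10):
        # Elliptic Fourier Descriptors (Kuhl & Giardina, 1982) for a closed
        # 2-D contour given as an (N, 2) array of points, using 'order' harmonics.
        # Illustrative sketch only; not the thesis's implementation.
        d = np.diff(contour, axis=0, append=contour[:1])  # point-to-point differences, wrapping last -> first
        dt = np.sqrt((d ** 2).sum(axis=1))                # segment lengths (assumed non-zero)
        t = np.concatenate(([0.0], np.cumsum(dt)))        # cumulative arc length
        T = t[-1]                                         # total contour length
        coeffs = np.zeros((order, 4))                     # one row (a_n, b_n, c_n, d_n) per harmonic
        for n in range(1, order + 1):
            w = 2.0 * np.pi * n / T
            k = T / (2.0 * n ** 2 * np.pi ** 2)
            d_cos = np.cos(w * t[1:]) - np.cos(w * t[:-1])
            d_sin = np.sin(w * t[1:]) - np.sin(w * t[:-1])
            coeffs[n - 1] = [
                k * np.sum(d[:, 0] / dt * d_cos),  # a_n (x, cosine term)
                k * np.sum(d[:, 0] / dt * d_sin),  # b_n (x, sine term)
                k * np.sum(d[:, 1] / dt * d_cos),  # c_n (y, cosine term)
                k * np.sum(d[:, 1] / dt * d_sin),  # d_n (y, sine term)
            ]
        return coeffs

    # Example: a sampled ellipse is captured almost entirely by the first harmonic,
    # so a handful of coefficients gives a compact, low-dimensional parametric model.
    theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
    ellipse = np.stack([3.0 * np.cos(theta), np.sin(theta)], axis=1)
    print(elliptic_fourier_descriptors(ellipse, order=4))

The compactness shown here is the point of the parametric form: a motion template is reduced to a small coefficient vector, which the abstract notes is reduced further using heel strike data.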
More information
Accepted/In Press date: June 2008
Organisations: University of Southampton, Electronics & Computer Science
Identifiers
Local EPrints ID: 266142
URI: http://eprints.soton.ac.uk/id/eprint/266142
PURE UUID: c5464ed1-7bd3-4365-8731-254bac824ec9
Catalogue record
Date deposited: 16 Jul 2008 14:16
Last modified: 14 Mar 2024 08:21
Contributors
Author: Imed Bouchrika