The University of Southampton
University of Southampton Institutional Repository

Model-based gait extraction and recognition



Zhou, Ziheng
246463a7-a2a2-4a56-a700-663edda0a51f

Zhou, Ziheng (2007) Model-based gait extraction and recognition. University of Southampton, Doctoral Thesis.

Record type: Thesis (Doctoral)

Abstract

Extracting full-body motion from monocular video sequences for gait recognition is an important and difficult problem. The motion is often highly articulated, with complex, changing boundaries, and real-world images may suffer from high levels of correlated and random noise. Moreover, large variations in the appearance of walking people, caused for instance by carried objects or clothing, complicate the problem further. In this thesis, we propose a consistent and easily extensible Bayesian framework for the gait extraction problem that uses strong prior knowledge. This knowledge is imposed by a single two-dimensional articulated model having both time-invariant (static) and time-variant (dynamic) parameters. The model is easily extended to handle variations in body shape. To exploit the dynamics of human walking, we use a hidden Markov model to detect the phase of each image within a walking cycle. The PDF projection theorem is introduced to learn the observation probability distributions accurately. We build a strong prior model from the statistics of the parameters of the articulated model, which are learned from noise-free indoor training data. The system parameters are first bootstrapped from a small amount of data and then refined by Bayesian updating. We demonstrate our approach on both high-quality indoor and noisy outdoor video data, on high-quality data with synthetic noise and occlusions added, and on walkers with rucksacks, skirts and trench coats.
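The abstract's use of a hidden Markov model to assign each frame a phase within the periodic walking cycle can be illustrated with a small sketch. This is not the thesis code: the four-phase cyclic transition structure, the discrete "silhouette code" observations, and all probability values below are illustrative assumptions standing in for the observation distributions the thesis learns via the PDF projection theorem.

```python
import numpy as np

# Hypothetical cyclic left-to-right HMM over four gait phases: a phase
# either persists or advances to the next, wrapping around because
# walking is periodic.
N_PHASES = 4
A = np.zeros((N_PHASES, N_PHASES))
for i in range(N_PHASES):
    A[i, i] = 0.6                       # stay in the current phase
    A[i, (i + 1) % N_PHASES] = 0.4      # advance to the next phase

# Toy emission model: each phase prefers one of four discrete
# silhouette-shape codes (a stand-in for the learned observation PDFs).
B = np.full((N_PHASES, 4), 0.1)
for i in range(N_PHASES):
    B[i, i] = 0.7

pi = np.full(N_PHASES, 1.0 / N_PHASES)  # uniform initial phase


def viterbi(obs):
    """Most likely phase sequence for a list of observation codes."""
    T = len(obs)
    logA, logB = np.log(A + 1e-12), np.log(B)
    delta = np.log(pi) + logB[:, obs[0]]
    back = np.zeros((T, N_PHASES), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + logA       # scores[from, to]
        back[t] = scores.argmax(axis=0)      # best predecessor per state
        delta = scores.max(axis=0) + logB[:, obs[t]]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]


# A clean observation cycle decodes to the matching phase labels.
frames = [0, 0, 1, 1, 2, 2, 3, 3, 0, 0]
print(viterbi(frames))
```

With the strong diagonal emissions assumed here, the decoded path simply tracks the observation codes; in the thesis setting the emissions come from learned image-likelihoods, so the cyclic transition prior is what keeps the phase estimate coherent through noisy frames.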

Text
1063993.pdf - Version of Record
Available under License University of Southampton Thesis Licence.
Download (3MB)

More information

Published date: 2007

Identifiers

Local EPrints ID: 466164
URI: http://eprints.soton.ac.uk/id/eprint/466164
PURE UUID: 227ba102-c389-43aa-93b4-a7bca10b7679

Catalogue record

Date deposited: 05 Jul 2022 04:35
Last modified: 16 Mar 2024 20:32


Contributors

Author: Ziheng Zhou



