University of Southampton Institutional Repository

Automated markerless analysis of human gait motion for recognition and classification


Yoo, Jang Hee and Nixon, Mark (2011) Automated markerless analysis of human gait motion for recognition and classification. ETRI Journal, 33 (2), 259-266. (doi:10.4218/etrij.11.1510.0068).

Record type: Article

Abstract

We present a new method for an automated markerless system to describe, analyze, and classify human gait motion. The automated system consists of three stages: i) detection and extraction of the moving human body and its contour from image sequences, ii) extraction of gait figures from the joint angles and body points, and iii) analysis of motion parameters and feature extraction for classifying human gait. A sequential set of 2D stick figures is used to represent the human gait motion, and features based on motion parameters are determined from the sequence of extracted gait figures. Then, a K-nearest neighbor classifier is used to classify the gait patterns. In experiments, the system provides an alternative estimate of biomechanical parameters for a large population of subjects, suggesting that the variance estimated by marker-based techniques is generous. The stick-figure representation proves effective and well defined for analyzing gait motion. As such, the markerless approach confirms the uniqueness of gait reported in earlier studies and encourages further development along these lines.
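The classification stage pairs motion-parameter features with a K-nearest neighbor classifier. The sketch below is illustrative only: it assumes hypothetical fixed-length feature vectors (e.g., per-sequence summaries of joint-angle trajectories) and synthetic data rather than the authors' actual features, and it uses scikit-learn's KNeighborsClassifier in place of the paper's implementation.

    # Illustrative K-NN gait classification; NOT the authors' implementation.
    # Assumes each gait sequence has been reduced to a fixed-length feature
    # vector of motion parameters (e.g., mean/range of hip and knee angles).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: 40 subjects, 10 sequences each,
    # 8 motion-parameter features per sequence (all hypothetical).
    n_subjects, n_seq, n_features = 40, 10, 8
    X = np.vstack([
        rng.normal(loc=rng.uniform(-1, 1, n_features), scale=0.1,
                   size=(n_seq, n_features))
        for _ in range(n_subjects)
    ])
    y = np.repeat(np.arange(n_subjects), n_seq)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=0)

    # Feature scaling matters for distance-based classifiers such as K-NN.
    scaler = StandardScaler().fit(X_train)
    knn = KNeighborsClassifier(n_neighbors=3)
    knn.fit(scaler.transform(X_train), y_train)

    accuracy = knn.score(scaler.transform(X_test), y_test)
    print(f"Recognition accuracy on synthetic features: {accuracy:.2%}")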

Text
jang_hee_you_etri.pdf - Version of Record
Restricted to Repository staff only

More information

Published date: April 2011
Organisations: Vision, Learning and Control

Identifiers

Local EPrints ID: 272191
URI: http://eprints.soton.ac.uk/id/eprint/272191
PURE UUID: 2499a589-cdf3-4c66-b9ab-0694acf6bf0e
ORCID for Mark Nixon: orcid.org/0000-0002-9174-5934

Catalogue record

Date deposited: 15 Apr 2011 14:20
Last modified: 15 Mar 2024 02:35


Contributors

Author: Jang Hee Yoo
Author: Mark Nixon



Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2
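As a usage illustration, the OAI-PMH 2.0 endpoint above can be queried over plain HTTP with the standard protocol verbs. The sketch below is a minimal example; the record identifier follows the conventional EPrints pattern "oai:<hostname>:<eprint id>", which is an assumption on our part rather than something stated on this page.

    # Minimal OAI-PMH 2.0 client sketch against the ePrints Soton endpoint.
    import urllib.parse
    import urllib.request

    BASE = "http://eprints.soton.ac.uk/cgi/oai2"

    def oai_request(**params):
        """Issue a standard OAI-PMH GET request and return the XML response."""
        url = BASE + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8")

    # Identify the repository (standard OAI-PMH verb, no arguments needed).
    print(oai_request(verb="Identify")[:200])

    # Fetch this record in Dublin Core; the identifier below is the usual
    # EPrints pattern and is assumed, not confirmed by the page itself.
    record_xml = oai_request(
        verb="GetRecord",
        identifier="oai:eprints.soton.ac.uk:272191",
        metadataPrefix="oai_dc",
    )
    print(record_xml[:200])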

This repository has been built using EPrints software, developed at the University of Southampton, but available to everyone to use.
