University of Southampton Institutional Repository

Marionette mass-spring model for 3D gait biometrics

Ariyanto, Gunawan
a36977d0-5857-4caa-8e3a-88d41f85c304
Nixon, Mark S.
2b5b9804-5a81-462a-82e6-92ee5fa74e12

Ariyanto, Gunawan and Nixon, Mark S. (2012) Marionette mass-spring model for 3D gait biometrics. 2012 5th IAPR International Conference on Biometrics (ICB), India. 29 Mar - 01 Apr 2012. pp. 354-359. (doi:10.1109/ICB.2012.6199832).

Record type: Conference or Workshop Item (Paper)

Abstract

Though interest in gait biometrics continues to increase, there have as yet been few approaches which use model-based algorithms with temporal 3D data. In this paper we describe a new 3D model-based approach to gait biometrics that applies a marionette and mass-spring model to a 3D voxel gait dataset. To model the articulated human body, we use a stick figure which emulates a marionette's motion and joint structure. The stick figure has 11 nodes representing the human joints of the head, torso, and lower legs. Each node is linked to at least one other node by a spring. The voxel data points in the next frame act as attractors which generate forces on each node, iteratively warping the model into the data. This process is repeated over successive frames for one gait period. The motion kinematics extracted from this tracking process are projected onto the sagittal and frontal planes and used as gait features via the discrete Fourier transform. We use 46 subjects, each with 4 sample sequences, and report encouraging initial gait classification results.
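The mass-spring tracking and DFT feature steps the abstract describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the gain constants (`k_spring`, `k_attract`, `damping`), the nearest-voxel attractor rule, and the two-node example figure are all hypothetical choices made to keep the example runnable.

```python
import numpy as np

def spring_forces(pos, edges, rest_len, k_spring=1.0):
    """Hooke's-law forces along each spring (edge) linking two stick-figure nodes."""
    forces = np.zeros_like(pos)
    for (i, j), L0 in zip(edges, rest_len):
        d = pos[j] - pos[i]
        dist = np.linalg.norm(d)
        if dist > 1e-9:
            f = k_spring * (dist - L0) * (d / dist)  # restores the rest length
            forces[i] += f
            forces[j] -= f
    return forces

def attractor_forces(pos, voxels, k_attract=0.5):
    """Pull each node toward its nearest voxel data point in the next frame."""
    forces = np.zeros_like(pos)
    for n, p in enumerate(pos):
        nearest = voxels[np.argmin(np.linalg.norm(voxels - p, axis=1))]
        forces[n] = k_attract * (nearest - p)
    return forces

def track_frame(pos, vel, edges, rest_len, voxels, dt=0.1, damping=0.8, iters=50):
    """Iteratively warp the stick-figure model into the voxel data for one frame."""
    for _ in range(iters):
        f = spring_forces(pos, edges, rest_len) + attractor_forces(pos, voxels)
        vel = damping * (vel + dt * f)  # damped Euler step, unit node masses assumed
        pos = pos + dt * vel
    return pos, vel

def dft_gait_feature(angle_series, n_harmonics=5):
    """Low-order DFT magnitudes of a joint-kinematic time series over one gait cycle."""
    spec = np.fft.rfft(angle_series)
    return np.abs(spec[:n_harmonics])
```

Repeating `track_frame` over successive frames of a gait period yields per-joint trajectories; their projections can then be summarized by `dft_gait_feature` for classification.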

Text: ariyanto icb.pdf - Other. Download (951kB)

More information

Published date: March 2012
Venue - Dates: 2012 5th IAPR International Conference on Biometrics (ICB), India, 2012-03-29 - 2012-04-01
Organisations: Vision, Learning and Control

Identifiers

Local EPrints ID: 363316
URI: https://eprints.soton.ac.uk/id/eprint/363316
ISBN: 978-1-4673-0397-2
PURE UUID: ce136769-e8da-448a-9eed-b29df1ea18b8
ORCID for Mark S. Nixon: orcid.org/0000-0002-9174-5934

Catalogue record

Date deposited: 20 Mar 2014 16:51
Last modified: 07 Aug 2019 00:54


