University of Southampton Institutional Repository

Performing content-based retrieval of humans using gait biometrics

Samangooei, Sina and Nixon, Mark (2008) Performing content-based retrieval of humans using gait biometrics. SAMT 2008, Koblenz. pp. 105-120.

Record type: Conference or Workshop Item (Paper)

Abstract

In order to analyse surveillance video, we need to efficiently explore large datasets containing videos of walking humans. At surveillance-image resolution, a person's walk (their gait) can be determined automatically, and more readily than other features such as the face. Effective analysis of such data relies on retrieval of video data which has been enriched using semantic annotations. A manual annotation process is time-consuming and prone to error due to subject bias. We explore the content-based retrieval of videos containing walking subjects, using semantic queries. We evaluate current biometric research using gait, unique in its effectiveness at recognising people at a distance. We introduce a set of semantic traits discernible by humans at a distance, outlining their psychological validity. Working under the premise that similarity of the chosen gait signature implies similarity of certain semantic traits, we perform a set of semantic retrieval experiments using popular latent semantic analysis techniques from the information retrieval community.
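The abstract names latent semantic analysis (LSA) as the retrieval machinery. As a rough illustration only, and not the paper's code, the Python sketch below ranks subjects against a semantic-trait query using a truncated SVD; the trait matrix, the traits themselves, and the query are all hypothetical.

import numpy as np

# Hypothetical trait-by-subject matrix: rows are semantic traits
# (e.g. "tall", "broad shoulders", "long stride", "heavy build"),
# columns are five subjects, entries are annotation strengths.
A = np.array([
    [1.0, 0.0, 1.0, 0.0, 1.0],
    [0.0, 1.0, 0.0, 1.0, 0.0],
    [1.0, 0.0, 1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 1.0, 1.0],
])

# Rank-k truncated SVD: A is approximated by U_k S_k V_k^T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]

def fold_in(q):
    # Project a trait query into the latent space: S_k^{-1} U_k^T q.
    return (q @ Uk) / sk

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Query for subjects exhibiting traits 0 and 2 ("tall", "long stride").
q = np.array([1.0, 0.0, 1.0, 0.0])
q_lat = fold_in(q)
scores = [cosine(q_lat, Vtk[:, j]) for j in range(Vtk.shape[1])]
print(sorted(range(len(scores)), key=lambda j: -scores[j]))  # subjects, best first

The rank-k projection is what lets a query on one trait surface subjects whose annotations co-occur with it, which is the usual motivation for LSA over raw term matching.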

Text: fulltext.pdf - Accepted Manuscript (4MB)

More information

Published date: 27 November 2008
Additional Information: Event Dates: 2 December 2008
Venue - Dates: SAMT 2008, Koblenz, 2008-12-02
Keywords: CBIR, gait, biometrics
Organisations: Vision, Learning and Control, Southampton Wireless Group

Identifiers

Local EPrints ID: 267052
URI: http://eprints.soton.ac.uk/id/eprint/267052
ISBN: 978-3-540-92234-6
PURE UUID: 40ddabcf-76e5-4758-86f3-3f5db2693e12
ORCID for Mark Nixon: orcid.org/0000-0002-9174-5934

Catalogue record

Date deposited: 22 Jan 2009 11:37
Last modified: 15 Mar 2024 02:35

Contributors

Author: Sina Samangooei
Author: Mark Nixon

