University of Southampton Institutional Repository

Analysing comparative soft biometrics from crowdsourced annotations


Martinho-Corbishley, Daniel, Nixon, Mark and Carter, John (2016) Analysing comparative soft biometrics from crowdsourced annotations. IET Biometrics, 5 (4), 276-283. (doi:10.1049/iet-bmt.2015.0118).

Record type: Article

Abstract

Soft biometrics enable human description and identification from low-quality surveillance footage. This study premises the design, collection and analysis of a novel crowdsourced dataset of comparative soft biometric body annotations, obtained from a richly diverse set of human annotators. The authors annotate 100 subject images to provide a coherent, in-depth appraisal of the collected annotations and inferred relative labels. The dataset includes gender as a comparative trait and the authors find that comparative labels characteristically contain additional discriminative information over traditional categorical annotations. Using the authors' pragmatic dataset, semantic recognition is performed by inferring relative biometric signatures using a RankSVM algorithm. This demonstrates a practical scenario, reproducing responses from a video surveillance operator searching for an individual. The approach can reliably return the correct match in the top 7% of results with ten comparisons, or top 13% of results using just five sets of subject comparisons.
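
The recognition step described in the abstract can be framed as a pairwise learning-to-rank problem. The sketch below is not taken from the paper; the feature vectors, comparison pairs and parameter values are illustrative assumptions. It shows the standard pairwise-difference formulation of RankSVM, trained with a linear SVM, which turns comparative labels into a per-subject relative score that can then be used to rank subjects.

    # Minimal RankSVM-style sketch (pairwise transform + linear SVM).
    # All data here is synthetic and stands in for crowdsourced annotations.
    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)

    # Hypothetical per-subject feature vectors standing in for averaged
    # soft-biometric annotations (100 subjects, 8 traits).
    X = rng.normal(size=(100, 8))

    # Hypothetical comparative labels: (i, j) means subject i was judged
    # "more" than subject j for the trait being ranked (e.g. taller).
    pairs = [(int(a), int(b))
             for a, b in rng.integers(0, 100, size=(500, 2)) if a != b]

    # Pairwise transform: each comparison becomes a difference vector
    # labelled +1, and its reverse a difference vector labelled -1.
    X_diff = np.vstack([X[i] - X[j] for i, j in pairs] +
                       [X[j] - X[i] for i, j in pairs])
    y_diff = np.array([1] * len(pairs) + [-1] * len(pairs))

    # A linear SVM on the difference vectors yields a weight vector w whose
    # projection w.x acts as a relative (ranking) score for each subject.
    ranker = LinearSVC(C=1.0, max_iter=10000).fit(X_diff, y_diff)
    scores = X @ ranker.coef_.ravel()

    # Rank all subjects by the inferred relative score.
    order = np.argsort(-scores)
    print("Subjects ordered by inferred relative score:", order[:10])

The pairwise-difference transform is the usual way to reduce RankSVM training to ordinary binary classification; the paper's actual features, traits and gallery-matching procedure will differ from this sketch.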

Text: iet-biometrics-16-sub2.pdf (Accepted Manuscript, 6 MB)

More information

Accepted/In Press date: 21 March 2016
e-pub ahead of print date: 22 April 2016
Published date: 1 December 2016
Organisations: Vision, Learning and Control

Identifiers

Local EPrints ID: 390468
URI: http://eprints.soton.ac.uk/id/eprint/390468
ISSN: 2047-4938
PURE UUID: 897f5b5f-ebe9-464b-b7ec-5b9d5dc965ba
ORCID for Mark Nixon: orcid.org/0000-0002-9174-5934

Catalogue record

Date deposited: 04 Apr 2016 09:05
Last modified: 15 Mar 2024 02:35

Contributors

Author: Daniel Martinho-Corbishley
Author: Mark Nixon (orcid.org/0000-0002-9174-5934)
Author: John Carter

Download statistics

Downloads from ePrints over the past year. Other digital versions may also be available to download, e.g. from the publisher's website.

Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

This repository has been built using EPrints software, developed at the University of Southampton, but available to everyone to use.
