University of Southampton Institutional Repository

Super-fine attributes with crowd prototyping

Martinho-Corbishley, Daniel
6dd73e5c-9a7e-41bd-b896-fb1ea9852abb
Nixon, Mark
2b5b9804-5a81-462a-82e6-92ee5fa74e12
Carter, John
e05be2f9-991d-4476-bb50-ae91606389da

Martinho-Corbishley, Daniel, Nixon, Mark and Carter, John (2018) Super-fine attributes with crowd prototyping. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1-14. (doi:10.1109/TPAMI.2018.2836900).

Record type: Article

Abstract

Recognising human attributes from surveillance footage is widely studied for attribute-based re-identification. However, most works assume coarse, expert-defined categories that are ineffective in describing challenging images. Such brittle representations are limited in discriminative power and hamper the efficacy of learnt estimators. We aim to discover more relevant and precise subject descriptions, improving image retrieval and closing the semantic gap. Inspired by fine-grained and relative attributes, we introduce super-fine attributes, which describe multiple, integral concepts of a single trait as multi-dimensional perceptual coordinates. Crowd prototyping facilitates efficient crowdsourcing of super-fine labels by pre-discovering salient perceptual concepts for prototype matching. We re-annotate gender, age and ethnicity traits from PETA, a highly diverse (19K instances, 8.7K identities) amalgamation of 10 re-id datasets including VIPeR, CUHK and TownCentre. Employing joint attribute regression with the ResNet-152 CNN, we demonstrate substantially improved ranked retrieval performance with super-fine attributes in direct comparison to conventional binary labels, reporting up to an 11.2% and 14.8% mAP improvement for gender and age respectively, further surpassed by ethnicity. We also find our three super-fine traits outperform 35 binary attributes by 6.5% mAP for subject retrieval in a challenging zero-shot identification scenario.
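The retrieval evaluation described above ranks gallery instances by their distance to a query's attribute vector and scores the ranking with mean average precision (mAP). As a minimal illustrative sketch (not the paper's implementation — the distance metric and data here are assumptions), attribute vectors could be treated as points in the perceptual coordinate space and compared with Euclidean distance:

```python
import numpy as np

def rank_by_attributes(query, gallery):
    """Rank gallery instances by Euclidean distance between the query's
    attribute vector and each gallery attribute vector (closest first)."""
    dists = np.linalg.norm(gallery - query, axis=1)
    return np.argsort(dists)

def mean_average_precision(queries, gallery, query_ids, gallery_ids):
    """Mean average precision over all queries: for each query, rank the
    gallery and average the precision at every rank holding a true match."""
    aps = []
    for q, qid in zip(queries, query_ids):
        order = rank_by_attributes(q, gallery)
        matches = gallery_ids[order] == qid
        if not matches.any():
            continue  # query identity absent from the gallery
        hits = np.cumsum(matches)
        # Precision at each rank where a correct identity appears.
        precision_at_hit = hits[matches] / (np.flatnonzero(matches) + 1)
        aps.append(precision_at_hit.mean())
    return float(np.mean(aps))
```

In this sketch, a query whose nearest gallery neighbours share its identity yields an average precision of 1.0; multi-dimensional (super-fine) vectors would carry more ranking information per trait than single binary labels, which is the effect the paper quantifies.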

Text article-corrections-final(2) - Accepted Manuscript

More information

Accepted/In Press date: 8 May 2018
e-pub ahead of print date: 15 May 2018

Identifiers

Local EPrints ID: 420641
URI: https://eprints.soton.ac.uk/id/eprint/420641
PURE UUID: 5cc16bd5-55e8-4fbe-bacb-c05ae00df3a7
ORCID for Mark Nixon: orcid.org/0000-0002-9174-5934

Catalogue record

Date deposited: 11 May 2018 16:30
Last modified: 11 Aug 2018 00:36

