University of Southampton Institutional Repository

Unconstrained human identification using comparative facial soft biometrics


Almudhahka, Nawaf Yousef (2017) Unconstrained human identification using comparative facial soft biometrics. University of Southampton, Doctoral Thesis, 141pp.

Record type: Thesis (Doctoral)

Abstract

The recent growth in CCTV systems and the challenge of automatically identifying humans under the adverse visual conditions of surveillance have increased interest in soft biometrics: physical and behavioural attributes used to semantically describe people. Soft biometrics enable human identification under challenging surveillance conditions where traditional biometrics such as iris and fingerprint cannot be acquired. Existing work on facial soft biometrics has focused on categorical attributes, while comparative attributes have received very little attention despite demonstrating better accuracy. It therefore remains unknown whether comparative soft biometrics can scale to large and more realistic databases. The automatic retrieval of comparative facial soft biometrics from images also needs to be investigated.

The purpose of this thesis is to explore human identification and verification in large and realistic databases via comparative facial soft biometrics, using the Labelled Faces in the Wild (LFW) database. A novel set of comparative facial soft biometrics is introduced, together with a thorough analysis of attribute significance and discriminative power. A set of identification and verification experiments was conducted to evaluate the comparative facial soft biometrics. Moreover, this thesis proposes MIURank, a novel fully unsupervised ranking algorithm based on mutual information.

The experiments demonstrate that a correct match can be found in the top 71 retrieved subjects from a database of 4038 subjects by comparing an unknown subject to only ten subjects. Additionally, the experiments reveal that face retrieval by verbal descriptions in a database of images can yield a correct match in the top 15 retrieved subjects from a database of 430 subjects. Furthermore, performance analysis of the MIURank algorithm shows that it achieves ranking accuracy comparable to that of the maximum likelihood estimator of the Bradley-Terry model and the state-of-the-art SerialRank algorithm. Through these analyses and developments, it is now possible not only to use human labels for recognition, but also to derive them by computer vision.
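The abstract benchmarks MIURank against the maximum likelihood estimator of the Bradley-Terry model, a standard way to infer a global ranking from pairwise comparisons. As a rough illustration of that baseline only (not the thesis's MIURank algorithm, whose details are not given in this record), the sketch below fits Bradley-Terry strengths from pairwise win counts using the classic minorisation-maximisation update; the function name and the toy comparison data are invented for illustration.

```python
def bradley_terry(n_items, wins, iters=200):
    """Fit Bradley-Terry strengths by minorisation-maximisation (MM).

    wins[(i, j)] = number of times item i was judged to beat item j.
    Returns a list of positive strengths; larger means ranked higher.
    """
    p = [1.0] * n_items
    for _ in range(iters):
        new_p = []
        for i in range(n_items):
            # Total wins for item i across all opponents.
            w_i = sum(wins.get((i, j), 0) for j in range(n_items))
            # MM denominator: sum over opponents of n_ij / (p_i + p_j),
            # where n_ij is the total number of i-vs-j comparisons.
            denom = 0.0
            for j in range(n_items):
                if j == i:
                    continue
                n_ij = wins.get((i, j), 0) + wins.get((j, i), 0)
                if n_ij:
                    denom += n_ij / (p[i] + p[j])
            new_p.append(w_i / denom if denom else p[i])
        # Normalise each iteration for numerical stability (the model is
        # only identified up to a common scale factor anyway).
        s = sum(new_p)
        p = [x * n_items / s for x in new_p]
    return p

# Hypothetical comparison outcomes among three items:
# item 0 usually beats 1 and 2, and item 1 usually beats 2.
wins = {(0, 1): 8, (1, 0): 2, (1, 2): 7, (2, 1): 3, (0, 2): 9, (2, 0): 1}
scores = bradley_terry(3, wins)
ranking = sorted(range(3), key=lambda i: -scores[i])  # [0, 1, 2]
```

In the thesis's setting, the "items" would be subjects compared on a comparative soft-biometric attribute (e.g. "subject A has a wider nose than subject B"), and the fitted strengths induce the attribute ranking.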

Text: Final Thesis - Version of Record (12MB), available under the University of Southampton Thesis Licence.

More information

Published date: 2 November 2017

Identifiers

Local EPrints ID: 419481
URI: http://eprints.soton.ac.uk/id/eprint/419481
PURE UUID: 7a6e3fb8-615d-4756-a88a-faafbcc841a0
ORCID for Mark Nixon: orcid.org/0000-0002-9174-5934

Catalogue record

Date deposited: 12 Apr 2018 16:31
Last modified: 30 Jan 2020 05:07

Contributors

Author: Nawaf Yousef Almudhahka
Thesis advisor: Mark Nixon


