University of Southampton Institutional Repository

Distilling ordinal relation and dark knowledge for facial age estimation

Zhao, Qilu
689d0828-e1e9-45df-958b-be02393ba636
Dong, Junyu
6d48af8a-6aa7-418a-ad7a-41c84df8898d
Yu, Hui
62623ded-fe42-4211-9529-ff32de116743
Chen, Sheng
9310a111-f79a-48b8-98c7-383ca93cbb80

Zhao, Qilu, Dong, Junyu, Yu, Hui and Chen, Sheng (2020) Distilling ordinal relation and dark knowledge for facial age estimation. IEEE Transactions on Neural Networks and Learning Systems, 1-14. (doi:10.1109/TNNLS.2020.3009523).

Record type: Article

Abstract

In this paper, we propose a knowledge distillation approach with two teachers for facial age estimation. Due to the nonstationary patterns of the facial aging process, the relative order of age labels provides more reliable information than exact age values for facial age estimation. Thus, the first teacher is a novel ranking method that captures the ordinal relation among age labels. Specifically, it formulates ordinal relation learning as the task of recovering the original ordered sequences from shuffled ones. The second teacher adopts the same model as the student, treating facial age estimation as a multi-class classification task. The proposed method leverages the intermediate representations learned by the first teacher and the softened outputs of the second teacher as supervisory signals to improve the training procedure and final performance of the compact student. Hence, the proposed knowledge distillation approach is capable of distilling the ordinal knowledge from the ranking model and the dark knowledge from the multi-class classification model into a compact student, which facilitates the deployment of facial age estimation on platforms with limited memory and computational resources, such as mobile and embedded devices. Extensive experiments on several well-known age estimation datasets demonstrate the superior performance of the proposed method over existing state-of-the-art methods.
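The two supervisory signals described in the abstract — softened teacher outputs (dark knowledge) and intermediate-feature matching — can be sketched as a combined training loss. This is a minimal NumPy sketch of the general technique, not the paper's implementation; the function names, temperature, and weights alpha/beta are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution,
    # exposing the relative probabilities of non-target classes.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits,
                      student_feat, teacher_feat,
                      T=4.0, alpha=0.5, beta=0.5):
    """Illustrative two-part distillation loss (assumed form):
    - KL divergence between temperature-softened teacher and student
      class probabilities (dark knowledge from the classifier teacher);
    - mean-squared error matching the student's intermediate features
      to the ranking teacher's (a 'hint'-style feature loss).
    The T*T factor keeps gradient magnitudes comparable across T."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))) * T * T
    hint = np.mean((np.asarray(student_feat, dtype=float)
                    - np.asarray(teacher_feat, dtype=float)) ** 2)
    return alpha * kl + beta * hint

# Toy usage: a student matching both teachers incurs (near-)zero loss.
same = [2.0, 1.0, 0.1]
print(distillation_loss(same, same, [0.5, 0.5], [0.5, 0.5]))
```

In a full training loop this term would typically be added to the ordinary cross-entropy loss on the ground-truth age labels; the balance between hard-label supervision and the two teacher signals is a tunable design choice.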

Text
TNNLS2020-F - Accepted Manuscript

More information

Accepted/In Press date: 10 July 2020
e-pub ahead of print date: 17 August 2020

Identifiers

Local EPrints ID: 442621
URI: http://eprints.soton.ac.uk/id/eprint/442621
ISSN: 2162-237X
PURE UUID: b5c9f186-7854-41be-85f5-92b7e75ba525

Catalogue record

Date deposited: 21 Jul 2020 16:34
Last modified: 13 Oct 2020 16:51


