University of Southampton Institutional Repository

Distilling ordinal relation and dark knowledge for facial age estimation


Zhao, Qilu, Dong, Junyu, Yu, Hui and Chen, Sheng (2021) Distilling ordinal relation and dark knowledge for facial age estimation. IEEE Transactions on Neural Networks and Learning Systems, 32 (7), 3108-3121, [9169852]. (doi:10.1109/TNNLS.2020.3009523).

Record type: Article

Abstract

In this article, we propose a knowledge distillation approach with two teachers for facial age estimation. Because the facial-aging process exhibits nonstationary patterns, the relative order of age labels provides more reliable information than exact age values. The first teacher is therefore a novel ranking method that captures the ordinal relation among age labels; specifically, it formulates ordinal relation learning as the task of recovering original ordered sequences from shuffled ones. The second teacher adopts the same model as the student and treats facial age estimation as a multiclass classification task. The proposed method leverages the intermediate representations learned by the first teacher and the softened outputs of the second teacher as supervisory signals to improve the training procedure and final performance of a compact student. The approach thus distills the ordinal knowledge from the ranking model and the dark knowledge from the multiclass classification model into the compact student, which facilitates deploying facial age estimation on platforms with limited memory and computation resources, such as mobile and embedded devices. Extensive experiments on several widely used age estimation data sets demonstrate that the proposed method outperforms several existing state-of-the-art methods.
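The two supervisory signals mentioned in the abstract — softened teacher outputs ("dark knowledge") and intermediate feature matching — can be sketched in plain Python. This is an illustrative reconstruction, not the authors' implementation: the temperature T, the loss weights alpha and beta, and all function names here are assumptions.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-softened softmax: a higher T spreads probability
    # mass over non-target classes, exposing the "dark knowledge".
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between the teacher's and student's softened
    # distributions, scaled by T^2 (standard dark-knowledge form).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

def feature_transfer_loss(student_feat, teacher_feat):
    # Squared L2 distance between intermediate representations,
    # used to transfer the ranking teacher's ordinal features.
    return sum((s - t) ** 2 for s, t in zip(student_feat, teacher_feat))

def total_loss(ce_loss, kd, ft, alpha=0.5, beta=0.5):
    # Combined student objective: hard-label cross-entropy plus the
    # two distillation terms (weights alpha/beta are hypothetical).
    return ce_loss + alpha * kd + beta * ft
```

In this sketch the ranking teacher contributes only through `feature_transfer_loss` and the classification teacher only through `kd_loss`, mirroring the division of labor described in the abstract.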

Text
TNLS2021-Jul - Author's Original
Download (2MB)
Text
TNNLS2020-F - Accepted Manuscript
Download (2MB)

More information

Accepted/In Press date: 10 July 2020
e-pub ahead of print date: 17 August 2020
Published date: 7 July 2021
Additional Information: Funding Information: Manuscript received November 29, 2019; revised May 18, 2020; accepted July 11, 2020. Date of publication August 17, 2020; date of current version July 7, 2021. This work was supported in part by the National Key Research and Development Program of China under Grant 2018AAA0100602, in part by the Fundamental Research Funds for the Central Universities under Grant 201964022, in part by the National Natural Science Foundation of China under Grant U1706218 and Grant 41927805, and in part by the Shandong Provincial Natural Science Foundation, China, under Grant ZR2018ZB0852. (Corresponding author: Junyu Dong.) Qilu Zhao and Junyu Dong are with the Department of Computer Science and Technology, Ocean University of China, Qingdao 266100, China (e-mail: zql@ouc.edu.cn; dongjunyu@ouc.edu.cn). Publisher Copyright: © 2012 IEEE.
Keywords: Dark knowledge, facial age estimation, feature transfer, jigsaw puzzles solver, knowledge distillation, permutation prediction, self-supervised learning

Identifiers

Local EPrints ID: 442621
URI: http://eprints.soton.ac.uk/id/eprint/442621
ISSN: 2162-237X
PURE UUID: b5c9f186-7854-41be-85f5-92b7e75ba525

Catalogue record

Date deposited: 21 Jul 2020 16:34
Last modified: 16 Mar 2024 08:37


Contributors

Author: Qilu Zhao
Author: Junyu Dong
Author: Hui Yu
Author: Sheng Chen



