Facial emotion recognition with noisy multi-task annotations
Zhang, Siwei; Huang, Zhiwu; Paudel, Danda Pani; Van Gool, Luc
9 May 2021
Zhang, Siwei, Huang, Zhiwu, Paudel, Danda Pani and Van Gool, Luc (2021) Facial emotion recognition with noisy multi-task annotations. In IEEE/CVF Winter Conference on Applications of Computer Vision, pp. 21-31. (doi:10.1109/WACV48630.2021.00007).
Record type: Conference or Workshop Item (Paper)
Abstract
Human emotions can be inferred from facial expressions. However, annotations of facial expressions are often highly noisy under common emotion coding models, both categorical and dimensional. To reduce the human labelling effort for multi-task labels, we introduce the new problem of facial emotion recognition with noisy multi-task annotations. For this problem, we propose a formulation from the viewpoint of joint distribution matching, which aims to learn more reliable correlations between raw facial images and multi-task labels and thereby reduce the influence of label noise. Within this formulation, we develop a new method that performs emotion prediction and joint distribution learning in a unified adversarial learning game. Extensive experiments study realistic setups of the proposed problem and demonstrate the clear superiority of the proposed method over state-of-the-art competing methods, both on CIFAR-10 with synthetic noisy labels and on the RAF and AffectNet datasets with practical noisy multi-task labels. The code is available at https://github.com/sanweiliti/noisyFER.
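To make the high-level idea above concrete, the following is a minimal, hypothetical PyTorch-style sketch of adversarial joint distribution matching over (image, multi-task label) pairs. It is not the authors' implementation (that is in the linked noisyFER repository); the network sizes, label dimensions, and function names are illustrative assumptions only.

# Hypothetical sketch, NOT the authors' code: an adversarial game that matches
# the joint distribution of (image feature, predicted labels) pairs to that of
# (image feature, noisy annotated labels) pairs. All names and sizes are assumed.
import torch
import torch.nn as nn

NUM_CLASSES = 7   # assumed number of categorical emotions
DIM_VA = 2        # assumed dimensional labels: valence and arousal
FEAT_DIM = 512    # assumed size of pre-extracted face features

class Predictor(nn.Module):
    """Multi-task head: categorical emotion logits plus continuous valence/arousal."""
    def __init__(self):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(FEAT_DIM, 128), nn.ReLU())
        self.cls_head = nn.Linear(128, NUM_CLASSES)
        self.va_head = nn.Linear(128, DIM_VA)

    def forward(self, x):
        h = self.trunk(x)
        return self.cls_head(h), self.va_head(h)

class JointDiscriminator(nn.Module):
    """Scores whether an (image feature, label set) pair looks like a sample
    drawn from the annotated joint distribution."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEAT_DIM + NUM_CLASSES + DIM_VA, 128), nn.ReLU(),
            nn.Linear(128, 1))

    def forward(self, x, cls_probs, va):
        return self.net(torch.cat([x, cls_probs, va], dim=1))

def adversarial_step(pred, disc, opt_p, opt_d, x, noisy_onehot, noisy_va):
    """One alternating update: discriminator first, then predictor."""
    bce = nn.BCEWithLogitsLoss()
    # Discriminator: "real" pairs use the (noisy) annotations, "fake" pairs use predictions.
    with torch.no_grad():
        logits, va_hat = pred(x)
    d_real = disc(x, noisy_onehot, noisy_va)
    d_fake = disc(x, logits.softmax(dim=1), va_hat)
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Predictor: make its (image, prediction) pairs indistinguishable from annotated ones.
    logits, va_hat = pred(x)
    g_fake = disc(x, logits.softmax(dim=1), va_hat)
    p_loss = bce(g_fake, torch.ones_like(g_fake))
    opt_p.zero_grad(); p_loss.backward(); opt_p.step()
    return d_loss.item(), p_loss.item()

# Toy usage with random tensors standing in for face features and noisy labels.
pred, disc = Predictor(), JointDiscriminator()
opt_p = torch.optim.Adam(pred.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)
x = torch.randn(8, FEAT_DIM)
noisy_onehot = torch.eye(NUM_CLASSES)[torch.randint(0, NUM_CLASSES, (8,))]
noisy_va = torch.rand(8, DIM_VA)
print(adversarial_step(pred, disc, opt_p, opt_d, x, noisy_onehot, noisy_va))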
This record has no associated files available for download.
More information
Published date: 9 May 2021
Identifiers
Local EPrints ID: 501678
URI: http://eprints.soton.ac.uk/id/eprint/501678
PURE UUID: 1c16a200-1a96-45fd-95d5-3995957edff8
Catalogue record
Date deposited: 05 Jun 2025 16:57
Last modified: 06 Jun 2025 02:06
Contributors
Author: Siwei Zhang
Author: Zhiwu Huang
Author: Danda Pani Paudel
Author: Luc Van Gool