Learning bodily expression of emotion for social robots through human interaction
Tuyen, Nguyen Tan Viet, Elibol, Armagan and Chong, Nak Young (2021) Learning bodily expression of emotion for social robots through human interaction. IEEE Transactions on Cognitive and Developmental Systems, 13 (1), 16-30. (doi:10.1109/TCDS.2020.3005907)
Abstract
Human facial and bodily expressions play a crucial role in human–human interaction, conveying the communicator’s feelings. Echoing the influence of human social behavior, recent studies in human–robot interaction (HRI) have investigated how to generate emotional behaviors for social robots. Emotional behaviors can enhance user engagement, allowing the user to interact with robots in a transparent manner. However, such behaviors are ambiguous and affected by many factors, such as personality traits, culture, and environment. This article focuses on developing a robot’s emotional bodily expressions by adopting the user’s affective gestures. We propose a behavior selection and transformation model that enables the robot to incrementally learn from the user’s gestures, to select the user’s habitual behaviors, and to transform the selected behaviors into robot motions. Experimental results under several scenarios showed that the proposed incremental learning model endows a social robot with the capability to enter into positive, long-lasting HRI. We also confirmed that the robot can express emotions through motions imitated from the user. The robot’s emotional gestures, which reflected the interacting partner’s traits, were widely accepted within the same cultural group and were perceptible, in different ways, across different cultural groups.
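The abstract outlines a three-stage pipeline: incrementally learning from observed user gestures, selecting the user's habitual behaviors, and transforming the selection into robot motions. As a concrete picture of that flow only, the following minimal Python sketch illustrates it under assumptions of ours: a nearest-centroid incremental clustering rule, frequency-based behavior selection, and a simple linear mapping onto robot joint ranges. The class name, threshold, and mapping are hypothetical illustrations, not the model described in the paper.

import numpy as np

class BehaviorSelectionAndTransformation:
    """Hypothetical sketch of the pipeline named in the abstract:
    (1) incrementally cluster observed user gestures,
    (2) select the most habitual (most frequently seen) behavior,
    (3) map it onto a robot's joint space.
    The clustering rule and mapping are illustrative assumptions,
    not the authors' actual model."""

    def __init__(self, distance_threshold=1.0):
        self.distance_threshold = distance_threshold
        self.centroids = []   # one mean feature vector per behavior cluster
        self.counts = []      # how often each cluster has been observed

    def observe(self, gesture_features):
        """Incremental learning step: assign the gesture to the nearest
        cluster, or start a new cluster if none is close enough."""
        x = np.asarray(gesture_features, dtype=float)
        if self.centroids:
            dists = [np.linalg.norm(x - c) for c in self.centroids]
            i = int(np.argmin(dists))
            if dists[i] < self.distance_threshold:
                # Running-mean update of the matched cluster centroid.
                self.counts[i] += 1
                self.centroids[i] += (x - self.centroids[i]) / self.counts[i]
                return i
        self.centroids.append(x.copy())
        self.counts.append(1)
        return len(self.centroids) - 1

    def select_habitual(self):
        """Behavior selection: pick the most frequently observed cluster."""
        return self.centroids[int(np.argmax(self.counts))]

    @staticmethod
    def to_robot_motion(features, joint_limits):
        """Transformation step: map normalized gesture features onto the
        robot's joint ranges (a stand-in for full motion retargeting)."""
        lo, hi = joint_limits[:, 0], joint_limits[:, 1]
        f = np.clip(features, 0.0, 1.0)
        return lo + f * (hi - lo)

# Usage: learn from a stream of toy gesture feature vectors, then
# render the habitual behavior on a hypothetical 3-joint robot.
model = BehaviorSelectionAndTransformation(distance_threshold=0.5)
rng = np.random.default_rng(0)
for _ in range(20):
    model.observe(0.6 + 0.05 * rng.standard_normal(3))  # habitual gesture
model.observe(np.array([0.1, 0.9, 0.2]))                # one-off gesture

limits = np.array([[-1.57, 1.57], [-0.5, 2.0], [-2.0, 2.0]])  # joint ranges, rad
print(model.to_robot_motion(model.select_habitual(), limits))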
More information
e-pub ahead of print date: 30 June 2020
Published date: March 2021
Identifiers
Local EPrints ID: 507562
URI: http://eprints.soton.ac.uk/id/eprint/507562
ISSN: 2379-8920
PURE UUID: 02393957-f318-449d-9070-0b891a8c4528
Catalogue record
Date deposited: 12 Dec 2025 17:41
Last modified: 13 Dec 2025 03:05
Contributors
Author: Nguyen Tan Viet Tuyen
Author: Armagan Elibol
Author: Nak Young Chong