Gesticulating with NAO: real-time context-aware co-speech gesture generation for human-robot interaction
Nguyen, Tan Viet Tuyen, Schmuck, Viktor and Celiktutan, Oya (2023) Gesticulating with NAO: real-time context-aware co-speech gesture generation for human-robot interaction. In André, Elisabeth, Chetouani, Mohamed, Vaufreydaz, Dominique, Lucas, Gale, Schultz, Tanja, Morency, Louis-Philippe and Vinciarelli, Alessandro (eds.) ICMI '23 Companion: Companion Publication of the 25th International Conference on Multimodal Interaction. Association for Computing Machinery. 3 pp. (doi:10.1145/3610661.3620664).
Record type: Conference or Workshop Item (Paper)
Abstract
Humans naturally produce nonverbal behaviours, via facial, body, and vocal expressions, to signal their messages, intentions, and feelings to their interacting partners. As robots progressively move out of research laboratories and into human environments, there is an increasing need for them to develop similar social intelligence skills. Equipping robots with human nonverbal communication skills has therefore been an active research area, in which data-driven, end-to-end learning approaches have become predominant, offering scalability and generalisability. However, most recent work models only a single character's intrapersonal dynamics, without attending to the interacting partner's behaviours. Our research addresses this gap by introducing a generative framework that allows social robots to produce co-speech gestures accompanying their speech in real-time human-robot interaction. Notably, the system also takes nonverbal signals observed from the interacting partner as a conditional input when producing the robot's communicative gestures.
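To make the conditioning idea concrete, below is a minimal PyTorch sketch of a generator that fuses the robot's speech features with the partner's observed nonverbal features to output joint-angle trajectories. The record gives no architectural detail, so the GRU backbone, feature dimensions, and joint count are illustrative assumptions, not the authors' actual model.

# Illustrative sketch only: the record does not describe the model architecture,
# so every design choice here (GRU backbone, feature sizes, joint set) is assumed.
import torch
import torch.nn as nn

class ConditionalGestureGenerator(nn.Module):
    """Maps the robot's speech features to joint-angle trajectories,
    conditioned on nonverbal signals observed from the human partner."""

    def __init__(self, speech_dim=128, partner_dim=36, hidden_dim=256, n_joints=10):
        super().__init__()
        # Encode the robot's speech (e.g., per-frame audio/text embeddings).
        self.speech_encoder = nn.GRU(speech_dim, hidden_dim, batch_first=True)
        # Encode the partner's observed behaviour (e.g., per-frame pose keypoints).
        self.partner_encoder = nn.GRU(partner_dim, hidden_dim, batch_first=True)
        # Decode the fused representation into joint angles for each frame.
        self.decoder = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, n_joints),  # e.g., NAO upper-body joints
        )

    def forward(self, speech_feats, partner_feats):
        # speech_feats:  (batch, frames, speech_dim)
        # partner_feats: (batch, frames, partner_dim)
        s, _ = self.speech_encoder(speech_feats)
        p, _ = self.partner_encoder(partner_feats)
        fused = torch.cat([s, p], dim=-1)  # frame-wise fusion of both streams
        return self.decoder(fused)         # (batch, frames, n_joints) joint angles

# Example: a 2-second clip at 25 fps with hypothetical feature sizes.
model = ConditionalGestureGenerator()
speech = torch.randn(1, 50, 128)
partner = torch.randn(1, 50, 36)
gestures = model(speech, partner)  # -> torch.Size([1, 50, 10])

Frame-wise fusion of the two encoded streams is one simple way to realise "partner behaviour as a conditional input"; the published system may use a different fusion or generation strategy.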
This record has no associated files available for download.
More information
Published date: 9 October 2023
Identifiers
Local EPrints ID: 505658
URI: http://eprints.soton.ac.uk/id/eprint/505658
PURE UUID: e8986ffb-ecb5-4b01-8915-e65cb0589bbc
Catalogue record
Date deposited: 15 Oct 2025 16:56
Last modified: 16 Oct 2025 02:11
Contributors
Author: Tan Viet Tuyen Nguyen
Author: Viktor Schmuck
Author: Oya Celiktutan
Editor: Elisabeth André
Editor: Mohamed Chetouani
Editor: Dominique Vaufreydaz
Editor: Gale Lucas
Editor: Tanja Schultz
Editor: Louis-Philippe Morency
Editor: Alessandro Vinciarelli