Forecasting nonverbal social signals during dyadic interactions with generative adversarial neural networks
Tuyen, Nguyen Tan Viet
Celiktutan, Oya
18 October 2021
Abstract
We are approaching a future where social robots will progressively become widespread in many aspects of our daily lives, including education, healthcare, work, and personal use. All such practical applications require humans and robots to collaborate in human environments, where social interaction is unavoidable. Along with verbal communication, successful social interaction is closely coupled with the interplay between nonverbal perception and action mechanisms, such as observing a partner's gaze behaviour and following their attention, and coordinating the form and function of hand gestures. Humans perform nonverbal communication instinctively, adaptively, and with little effort. For robots to be successful in our social landscape, they should therefore engage in social interactions in a human-like way, with increasing levels of autonomy. In particular, nonverbal gestures are expected to endow social robots with the capability of emphasizing their speech or showing their intentions. Motivated by this, our research sheds light on modeling human behaviors in social interactions, specifically forecasting human nonverbal social signals during dyadic interactions, with the overarching goal of developing robotic interfaces that can learn to imitate human dyadic interactions. Such an approach will ensure that the messages encoded in the robot's gestures can be perceived by interacting partners in a clear and transparent manner, which could improve the partners' perception of the robot and enhance the outcomes of the social interaction.
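To make the forecasting task concrete, below is a minimal, illustrative sketch (in PyTorch) of a conditional generative adversarial model that maps an observed sequence of one partner's poses to a forecast of the other partner's nonverbal behaviour. The architecture, dimensions, loss terms, and all names here are assumptions chosen for exposition, not the model developed in this work.

```python
# Illustrative sketch only: a minimal conditional GAN that maps an observed
# interlocutor pose sequence to a forecast of the partner's pose sequence.
# Dimensions, architecture, and loss terms are assumptions for exposition.
import torch
import torch.nn as nn

POSE_DIM, OBS_LEN, PRED_LEN, NOISE_DIM = 30, 25, 25, 16

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.GRU(POSE_DIM, 64, batch_first=True)   # encode observed interlocutor motion
        self.decoder = nn.GRU(POSE_DIM + NOISE_DIM + 64, 64, batch_first=True)
        self.out = nn.Linear(64, POSE_DIM)

    def forward(self, obs, z):
        _, h = self.encoder(obs)                                 # h: (1, B, 64) summary of the observation
        ctx = h[-1].unsqueeze(1).expand(-1, PRED_LEN, -1)        # repeat context over the forecast horizon
        last = obs[:, -1:, :].expand(-1, PRED_LEN, -1)           # repeat the last observed pose
        z = z.unsqueeze(1).expand(-1, PRED_LEN, -1)              # repeat the noise vector
        y, _ = self.decoder(torch.cat([last, z, ctx], dim=-1))
        return self.out(y)                                       # forecast: (B, PRED_LEN, POSE_DIM)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(POSE_DIM, 64, batch_first=True)
        self.score = nn.Linear(64, 1)

    def forward(self, seq):
        _, h = self.rnn(seq)
        return self.score(h[-1])                                 # real/fake logit per sequence

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

obs = torch.randn(8, OBS_LEN, POSE_DIM)    # observed interlocutor motion (dummy data)
real = torch.randn(8, PRED_LEN, POSE_DIM)  # ground-truth partner motion (dummy data)

# Discriminator step: real partner sequences vs. generated forecasts.
fake = G(obs, torch.randn(8, NOISE_DIM)).detach()
loss_d = bce(D(real), torch.ones(8, 1)) + bce(D(fake), torch.zeros(8, 1))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Generator step: adversarial loss plus an L1 reconstruction term.
fake = G(obs, torch.randn(8, NOISE_DIM))
loss_g = bce(D(fake), torch.ones(8, 1)) + nn.functional.l1_loss(fake, real)
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

In a sketch of this kind, the adversarial loss encourages forecasts that look like plausible human motion, while the reconstruction term keeps them close to the observed ground truth; the balance between the two is a design choice, not something specified by this record.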
More information
Published date: 18 October 2021
Keywords: cs.AI
Identifiers
Local EPrints ID: 504500
URI: http://eprints.soton.ac.uk/id/eprint/504500
PURE UUID: 48c57c7d-babb-4de1-a2b2-f4fa3c889106
Catalogue record
Date deposited: 10 Sep 2025 15:41
Last modified: 11 Sep 2025 03:38
Contributors
Author: Nguyen Tan Viet Tuyen
Author: Oya Celiktutan