University of Southampton Institutional Repository

All birds must fly: the experience of multimodal hands-free gaming with gaze and nonverbal voice synchronization


Hedeshy, Ramin, Kumar, Chandan, Lauer, Mike and Staab, Steffen (2022) All birds must fly: the experience of multimodal hands-free gaming with gaze and nonverbal voice synchronization. 24th ACM International Conference on Multimodal Interaction, Bangalore, India, 07-11 Nov 2022. 10 pp. (In Press)

Record type: Conference or Workshop Item (Paper)

Abstract

Eye tracking has evolved into a promising hands-free interaction mechanism to support people with disabilities. However, its adoption as a control mechanism in gaming environments is constrained by erroneous recognition of user intentions and commands. Previous studies have suggested combining eye gaze with other modalities, such as voice input, for an improved interaction experience. However, speech recognition latency and accuracy are major bottlenecks, and the use of dictated verbal commands can disrupt the flow of gameplay. Furthermore, many people with physical disabilities also have speech impairments that prevent them from uttering precise verbal commands. In this work, we introduce nonverbal voice interaction (NVVI) synchronized with gaze for an intuitive hands-free gaming experience. We propose gaze and NVVI (e.g., humming) as a spatio-temporal interaction applicable to several modern gaming apps, and developed ‘All Birds Must Fly’ as a representative app. In the experiment, we first compared the gameplay experience of gaze and NVVI (GV) with the conventional mouse and keyboard (MK) in a study with 15 non-disabled participants. The participants could effectively control the game environment with GV (expectedly somewhat slower than with MK). More importantly, they found GV more engaging, fun, and enjoyable. In a second study with 10 participants, we validated the feasibility of GV with a target user group of people with disabilities.
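The core interaction idea described in the abstract — gaze supplying the spatial target (*where*) while a hum supplies the temporal trigger (*when*, and for how long) — can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the function names, thresholds, sample rate, and the autocorrelation-based hum detector are all assumptions made for the sketch.

```python
import numpy as np

SAMPLE_RATE = 16_000  # assumed microphone sample rate (Hz)
FRAME = 1024          # samples per analysis frame


def detect_hum(frame, rate=SAMPLE_RATE, energy_thresh=0.01,
               fmin=80.0, fmax=400.0):
    """Return True if the audio frame contains a deliberate hum.

    A hum is a voiced, low-pitched sound: we require both a minimum
    RMS energy and a dominant autocorrelation peak whose lag
    corresponds to a fundamental frequency in the humming range.
    """
    frame = np.asarray(frame, dtype=float)
    if np.sqrt(np.mean(frame ** 2)) < energy_thresh:
        return False  # too quiet to be a deliberate hum
    # Autocorrelation-based pitch estimate (lags 0 .. N-1).
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(rate / fmax), int(rate / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    pitch = rate / lag
    # The peak must be strong relative to the zero-lag energy
    # (i.e., the frame is periodic/voiced, not broadband noise).
    return bool(ac[lag] > 0.3 * ac[0]) and fmin <= pitch <= fmax


def hands_free_event(gaze_xy, audio_frame):
    """Fuse the two modalities: gaze gives *where*, humming gives *when*."""
    if detect_hum(audio_frame):
        return ("activate", gaze_xy)  # e.g., lift the bird at the gazed point
    return ("idle", gaze_xy)
```

In a real pipeline, frames would stream continuously from the microphone and gaze samples from the eye tracker; sustaining the hum across consecutive frames would then yield the continuous temporal control (e.g., holding a bird aloft) that a discrete spoken command cannot.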

Text
icmi22a-sub1315-cam-i15 - Author's Original
Download (2MB)

More information

Accepted/In Press date: 28 July 2022
Venue - Dates: 24th ACM International Conference on Multimodal Interaction, Bangalore, India, 2022-11-07 - 2022-11-11

Identifiers

Local EPrints ID: 470769
URI: http://eprints.soton.ac.uk/id/eprint/470769
PURE UUID: e8a529d1-8864-4688-aa58-9e0f07766fa1
ORCID for Steffen Staab: orcid.org/0000-0002-0780-4154

Catalogue record

Date deposited: 19 Oct 2022 17:03
Last modified: 17 Mar 2024 03:38

Contributors

Author: Ramin Hedeshy
Author: Chandan Kumar
Author: Mike Lauer
Author: Steffen Staab

Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

This repository has been built using EPrints software, developed at the University of Southampton, but available to everyone to use.
