Hand gesture recognition for user-defined textual inputs and gestures
Wang, Jindi
5611d117-de7a-46b9-860e-3e4f30fc08a3
Ivrissimtzis, Ioannis
0a3091ef-c730-4d02-a55f-8f84d847c3dc
Li, Zhaoxing
65935c45-a640-496c-98b8-43bed39e1850
Shi, Lei
1d9c54f7-4021-47c8-873d-2f9d34d5b649
Wang, Jindi, Ivrissimtzis, Ioannis, Li, Zhaoxing and Shi, Lei (2024) Hand gesture recognition for user-defined textual inputs and gestures. Universal Access in the Information Society, 27. (doi:10.1007/s10209-024-01139-6).
Abstract
Despite recent progress, hand gesture recognition, a highly regarded method of human-computer interaction, still faces considerable challenges. In this paper, we address the problem of individual user style variation, which can significantly affect system performance. Whereas previous work only supports the manual inclusion of customized hand gestures within very specific application settings, here we introduce an effective, adaptable graphical interface supporting user-defined hand gestures. In our system, hand gestures are personalized by training a camera-based hand gesture recognition model for a particular user, using data from that user alone. We employ a lightweight Multilayer Perceptron architecture based on contrastive learning, reducing the amount of training data and the training time required compared to previous recognition models that rely on massive training datasets. Experimental results demonstrate rapid convergence and satisfactory accuracy of the recognition model, while a user study collects and analyses initial user feedback on the deployed system.
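The abstract's core technical idea, a small per-user embedding model trained with a contrastive objective on camera-derived hand features, can be illustrated with a brief sketch. The sketch below is purely illustrative and rests on assumptions not stated in this record: PyTorch as the framework, flattened 21-point hand-landmark vectors (63 values) as input, the chosen layer sizes, and a classic margin-based pairwise contrastive loss. It is not the authors' implementation.

# Minimal sketch: lightweight MLP embedding network plus pairwise contrastive
# loss, trained on one user's small gesture set. Input dimensionality, layer
# sizes, margin, and the use of PyTorch are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GestureMLP(nn.Module):
    """Small MLP mapping a flattened hand-landmark vector to an embedding."""

    def __init__(self, in_dim: int = 63, embed_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # L2-normalise embeddings so pairwise distances are comparable.
        return F.normalize(self.net(x), dim=-1)


def contrastive_loss(z1, z2, same_label, margin: float = 1.0):
    """Pull same-gesture pairs together; push different-gesture pairs
    at least `margin` apart."""
    d = F.pairwise_distance(z1, z2)
    pos = same_label * d.pow(2)
    neg = (1.0 - same_label) * F.relu(margin - d).pow(2)
    return (pos + neg).mean()


# Toy per-user training loop on randomly generated stand-in data.
model = GestureMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(200, 63)            # stand-in for hand-landmark vectors
y = torch.randint(0, 5, (200,))     # five user-defined gesture labels

for _ in range(50):                 # few epochs: small data, fast convergence
    idx = torch.randperm(200)
    z_a, z_b = model(x), model(x[idx])
    same = (y == y[idx]).float()
    loss = contrastive_loss(z_a, z_b, same)
    opt.zero_grad()
    loss.backward()
    opt.step()

At inference time, a model of this kind would typically classify a new sample by comparing its embedding against stored prototypes of each user-defined gesture, which is consistent with the small-data, per-user training regime the abstract describes; this detail is likewise an assumption rather than something documented in the record.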
Text
UAIS_Hand_Gesture_Recognition_for_User_defined_Textual_Inputs_and_Gestures
More information
Accepted/In Press date: 26 July 2024
e-pub ahead of print date: 2 August 2024
Identifiers
Local EPrints ID: 493964
URI: http://eprints.soton.ac.uk/id/eprint/493964
ISSN: 1615-5289
PURE UUID: 4083ab8a-2501-4db8-9a28-5db4e21067d8
Catalogue record
Date deposited: 17 Sep 2024 17:08
Last modified: 18 Sep 2024 04:01
Contributors
Author: Jindi Wang
Author: Ioannis Ivrissimtzis
Author: Zhaoxing Li
Author: Lei Shi