Hands-free web browsing: enriching the user experience with gaze and voice modality
Sengupta, Korok, Ke, Min, Menges, Raphael, Kumar, Chandan and Staab, Steffen
(2018)
Hands-free web browsing: enriching the user experience with gaze and voice modality.
In Proceedings - ETRA 2018: 2018 ACM Symposium on Eye Tracking Research and Applications, vol. Part F137344.
Association for Computing Machinery.
(doi:10.1145/3204493.3208338).
Record type: Conference or Workshop Item (Paper)
Abstract
Hands-free browsers provide an effective tool for Web interaction and accessibility, removing the need for conventional input devices. Current approaches to hands-free interaction fall primarily into either voice-based or gaze-based modalities. In this work, we investigate how these two modalities can be integrated to provide a better hands-free experience for end-users. We demonstrate a multimodal browsing approach that combines eye gaze and voice input for optimized interaction, while accommodating user preferences and retaining the benefits of each single modality. An initial assessment with five participants indicates improved performance for the multimodal prototype compared to single modalities for hands-free Web browsing.
This record has no associated files available for download.
More information
Published date: 14 June 2018
Venue - Dates:
10th ACM Symposium on Eye Tracking Research and Applications, ETRA 2018, Warsaw, Poland, 2018-06-14 - 2018-06-17
Keywords:
Eye tracking, Hands-free interaction, Multimodal interfaces, Speech commands, Voice input, Web accessibility
Identifiers
Local EPrints ID: 423256
URI: http://eprints.soton.ac.uk/id/eprint/423256
PURE UUID: 09cae2c8-83ba-4497-83ac-d1e8bbc933b7
Catalogue record
Date deposited: 19 Sep 2018 16:31
Last modified: 16 Mar 2024 04:22
Contributors
Author: Korok Sengupta
Author: Min Ke
Author: Raphael Menges
Author: Chandan Kumar
Author: Steffen Staab