University of Southampton Institutional Repository

Improving user experience of eye tracking-based interaction: Introspecting and adapting interfaces


Menges, Raphael, Kumar, Chandan and Staab, Steffen (2019) Improving user experience of eye tracking-based interaction: Introspecting and adapting interfaces. ACM Transactions on Computer-Human Interaction, 26 (6), 1-46, [37]. (doi:10.1145/3338844).

Record type: Article

Abstract

Eye tracking systems have greatly improved in recent years, making them a viable and affordable digital communication channel, especially for people lacking fine motor skills. Using eye tracking as an input method is challenging due to accuracy and ambiguity issues, so research in eye gaze interaction has mainly focused on better pointing and typing methods. However, these methods eventually need to be assimilated to enable users to control application interfaces. A common approach to employing eye tracking for controlling application interfaces is to emulate mouse and keyboard functionality. We argue that the emulation approach incurs unnecessary interaction and visual overhead for users, aggravating the entire experience of gaze-based computer access. We discuss how knowledge about the interface semantics can help reduce the interaction and visual overhead to improve the user experience. Thus, we propose the efficient introspection of interfaces to retrieve the interface semantics and adapt the interaction with eye gaze. We have developed a Web browser, GazeTheWeb, that introspects Web page interfaces and adapts both the browser interface and the interaction elements on Web pages for gaze input. In a summative lab study with 20 participants, GazeTheWeb allowed participants to accomplish information search and browsing tasks significantly faster than an emulation approach. Additional feasibility tests of GazeTheWeb in lab and home environments showcase its effectiveness in accomplishing daily Web browsing activities and adapting a large variety of modern Web pages to support interaction for people with motor impairments.

Text: tochi_submitted20190520 (Author's Original). Restricted to repository staff only; a copy may be requested.
Text: GazeTheWeb_ToCHI (Accepted Manuscript). Download (12MB).

More information

Submitted date: 15 May 2019
Accepted/In Press date: 28 May 2019
e-pub ahead of print date: 31 October 2019
Published date: 1 November 2019

Identifiers

Local EPrints ID: 431102
URI: http://eprints.soton.ac.uk/id/eprint/431102
ISSN: 1073-0516
PURE UUID: 6729882f-c16c-4655-9fe4-4deb20df67b5
ORCID for Steffen Staab: orcid.org/0000-0002-0780-4154

Catalogue record

Date deposited: 07 Jun 2019 16:30
Last modified: 16 Mar 2024 07:52


Contributors

Author: Raphael Menges
Author: Chandan Kumar
Author: Steffen Staab (orcid.org/0000-0002-0780-4154)



