University of Southampton Institutional Repository

Learning relevant eye movement feature spaces across users


Hussain, Zakria, Pasupa, Kitsuchart and Shawe-Taylor, John (2010) Learning relevant eye movement feature spaces across users. Proceedings of the 6th Biennial Symposium on Eye Tracking Research & Applications (ETRA'2010), Austin, TX, United States, 22-24 Mar 2010, pp. 181-185. (Submitted)

Record type: Conference or Workshop Item (Poster)

Abstract

In this paper we predict the relevance of images based on a low-dimensional feature space found using several users' eye movements. Each user is given an image-based search task, during which their eye movements are recorded using a Tobii eye tracker. The users also provide us with explicit feedback regarding the relevance of images. We demonstrate that by using a greedy Nystrom algorithm on the eye movement features of different users, we can find a suitable low-dimensional feature space for learning. We validate the suitability of this feature space by projecting the eye movement features of a new user into this space, training an online learning algorithm using these features, and showing that the number of mistakes (regret over time) made in predicting relevant images is lower than when using the original eye movement features. We also plot Recall-Precision and ROC curves, and use a sign test to verify the statistical significance of our results.
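The greedy Nystrom construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RBF kernel, the residual-diagonal greedy selection criterion, and all function and parameter names here are assumptions. The idea is to greedily pick landmark points whose kernel columns best explain the remaining kernel matrix, then map any new user's features into the resulting low-dimensional space.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel between rows of X and rows of Y.
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def greedy_nystrom(X, m, gamma=1.0):
    """Greedily select m landmark points by maximising the residual
    kernel diagonal (an incomplete-Cholesky-style criterion), and
    return a projection into the m-dimensional Nystrom feature space."""
    K = rbf_kernel(X, X, gamma)
    chosen = []
    residual = np.diag(K).copy()
    for _ in range(m):
        i = int(np.argmax(residual))       # point worst explained so far
        chosen.append(i)
        Kmm = K[np.ix_(chosen, chosen)]
        Knm = K[:, chosen]
        # Diagonal of K after projecting onto the chosen columns.
        approx = Knm @ np.linalg.pinv(Kmm) @ Knm.T
        residual = np.diag(K) - np.diag(approx)
        residual[chosen] = -np.inf         # never re-select a landmark
    Kmm = K[np.ix_(chosen, chosen)]
    # Whitening map: phi(x) = Kmm^{-1/2} k(landmarks, x).
    evals, evecs = np.linalg.eigh(Kmm)
    evals = np.maximum(evals, 1e-12)       # guard against tiny eigenvalues
    W = evecs @ np.diag(evals ** -0.5) @ evecs.T
    landmarks = X[chosen]

    def project(Xnew):
        return rbf_kernel(Xnew, landmarks, gamma) @ W

    return project
```

A new user's eye movement features could then be mapped with `project(...)` and fed to any online learner; by construction `project(X) @ project(X).T` is the Nystrom approximation of the full kernel matrix.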

Text
LearningEyeNystrom.pdf - Accepted Manuscript
Download (749kB)

More information

Submitted date: 22 March 2010
Additional Information: Event Dates: 22-24 March 2010
Venue - Dates: Proceedings of the 6th Biennial Symposium on Eye Tracking Research & Applications (ETRA'2010), Austin, TX, United States, 2010-03-22 - 2010-03-24
Keywords: Feature selection, Eye movement features, Online learning, Nystrom method, Tobii eye-tracker
Organisations: Electronics & Computer Science

Identifiers

Local EPrints ID: 268502
URI: http://eprints.soton.ac.uk/id/eprint/268502
PURE UUID: 8bcffafe-f978-40f2-85b1-64aae9b8a22e

Catalogue record

Date deposited: 12 Feb 2010 00:16
Last modified: 14 Mar 2024 09:11

Contributors

Author: Zakria Hussain
Author: Kitsuchart Pasupa
Author: John Shawe-Taylor



