University of Southampton Institutional Repository

Image Ranking with Eye Movements

Pasupa, Kitsuchart
952ededb-8c97-41b7-a65b-6aba31de2669
Szedmak, Sandor
c6a84aa3-2956-4acf-8293-a1b676f6d7d8
Hardoon, David
e9eb22b2-daf6-460c-94b1-8208c917f862

Pasupa, Kitsuchart, Szedmak, Sandor and Hardoon, David (2009) Image Ranking with Eye Movements. Proceedings of the 23rd Annual Conference on Neural Information Processing Systems (NIPS 2009) Workshop on Advances in Ranking, Canada. pp. 37-42.

Record type: Conference or Workshop Item (Paper)

Abstract

To help users navigate an image search system, one could provide explicit rank information on a set of images; these rankings are then learnt in order to present a new set of relevant images. Since requiring explicit feedback may not be feasible in some cases, we consider the setting where the user provides implicit feedback, in the form of eye movements, to assist in such a task. This paper explores the idea of implicitly incorporating eye movement features into an image ranking task. Previous work demonstrated that combining eye movement and image features improves retrieval accuracy. Despite promising results, that approach is unrealistic, as no eye movements are available a priori for new images. We propose a novel search approach which combines image features with eye movement features in a tensor Ranking Support Vector Machine, and show that by extracting the individual source-specific weight vectors we are able to construct a new image-based semantic space with improved retrieval accuracy.
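The two ideas the abstract relies on — a tensor (outer-product) combination of the two feature sources, and the extraction of an image-only weight vector so new images need no eye-movement data — can be illustrated with a small sketch. Everything below (the synthetic data, dimensions, and the simple pairwise hinge-loss training loop standing in for a Ranking SVM) is an illustrative assumption, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: n images with image features X and
# eye-movement features E, plus a relevance score tied to both sources.
n, d_img, d_eye = 40, 5, 3
X = rng.normal(size=(n, d_img))
E = rng.normal(size=(n, d_eye))
relevance = X[:, 0] * E[:, 0] + 0.05 * rng.normal(size=n)

# Tensor feature map: phi(x, e) = vec(x e^T), one joint vector per image.
Phi = np.einsum("ni,nj->nij", X, E).reshape(n, d_img * d_eye)

# Minimal pairwise hinge-loss ranker (a toy stand-in for a Ranking SVM):
# for every pair with relevance_i > relevance_j we require
# w . (phi_i - phi_j) >= 1 and take a sub-gradient step when violated.
pairs = [(i, j) for i in range(n) for j in range(n)
         if relevance[i] > relevance[j]]
w = np.zeros(d_img * d_eye)
lr, lam = 0.05, 0.001
for _ in range(100):
    for i, j in pairs:
        diff = Phi[i] - Phi[j]
        if w @ diff < 1.0:
            w += lr * diff
    w -= lr * lam * w  # light L2 shrinkage per epoch

# Reshape the joint weights into a d_img x d_eye matrix and take its
# leading left singular vector: an image-only direction that lets us
# score new images without any eye-movement data (the sign of a
# singular vector is arbitrary).
W = w.reshape(d_img, d_eye)
U, s, Vt = np.linalg.svd(W)
w_img = U[:, 0]
image_only_scores = X @ w_img
```

New images can then be ranked by `X_new @ w_img` alone, which mirrors the motivation in the abstract: eye movements inform training, but are not required at retrieval time.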


More information

Published date: 11 December 2009
Additional Information: Event Dates: 11 December 2009
Venue - Dates: Proceedings of the 23rd Annual Conference on Neural Information Processing Systems (NIPS 2009) Workshop on Advances in Ranking, Canada, 2009-12-11
Organisations: Electronics & Computer Science

Identifiers

Local EPrints ID: 268230
URI: https://eprints.soton.ac.uk/id/eprint/268230
PURE UUID: c8dcfac9-1635-446b-a5ea-c58bec918872

Catalogue record

Date deposited: 17 Nov 2009 15:33
Last modified: 18 Jul 2017 06:56


Contributors

Author: Kitsuchart Pasupa
Author: Sandor Szedmak
Author: David Hardoon

