University of Southampton Institutional Repository

iGesture: A Platform for Investigating Multimodal, Multimedia Gesture-based Interactions


Hare, Jonathon, Karam, Maria, Lewis, Paul and schraefel, m.c. (2005) iGesture: A Platform for Investigating Multimodal, Multimedia Gesture-based Interactions.

Record type: Monograph (Project Report)

Abstract

This paper introduces the iGesture platform for investigating multimodal gesture based interactions in multimedia contexts. iGesture is a low-cost, extensible system that uses visual recognition of hand movements to support gesture-based input. Computer vision techniques support gesture based interactions that are lightweight, with minimal interaction constraints. The system enables gestures to be carried out 'in the environment' at a distance from the camera, enabling multimodal interaction in a naturalistic, transparent manner in a ubiquitous computing environment. The iGesture system can also be rapidly scripted to enable gesture-based input with a wide variety of applications. In this paper we present the technology behind the iGesture software, and a performance evaluation of the gesture recognition subsystem. We also present two exemplar multimedia application contexts which we are using to explore ambient gesture-based interactions.

Text: iGesture.pdf (372kB)
Video: iGesture.mov (15MB)

More information

Published date: April 2005
Keywords: Gesture-based interaction, semaphoric gestures, experiment, usability
Organisations: Web & Internet Science, Agents, Interactions & Complexity

Identifiers

Local EPrints ID: 261525
URI: http://eprints.soton.ac.uk/id/eprint/261525
PURE UUID: 62f7c156-ca51-4113-a007-dae85f9ed733
ORCID for Jonathon Hare: orcid.org/0000-0003-2921-4283
ORCID for m.c. schraefel: orcid.org/0000-0002-9061-7957

Catalogue record

Date deposited: 07 Nov 2005
Last modified: 15 Mar 2024 03:25

Contributors

Author: Jonathon Hare
Author: Maria Karam
Author: Paul Lewis
Author: m.c. schraefel


