University of Southampton Institutional Repository

A tongue-movement communication and control concept for hands-free human-machine interfaces

Vaidyanathan, R., Chung, B., Gupta, L., Kook, H., Kota, S. and West, J.D. (2007) A tongue-movement communication and control concept for hands-free human-machine interfaces. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 37 (4), pp. 533-546. (doi:10.1109/TSMCA.2007.897919).

Record type: Article


A new communication and control concept based on tongue movements is introduced to generate, detect, and classify signals for novel hands-free human–machine interface applications, such as communicating with a computer and controlling devices. The signals produced by tongue movements are changes in air pressure within the ear canal. The goal is to demonstrate that the ear pressure signals acquired by a microphone inserted into the ear canal in response to specific tongue movements are distinct, and that these signals can be detected and classified with high accuracy. The strategy developed to demonstrate the concept includes energy-based signal detection and segmentation to extract the ear pressure signals due to tongue movements, signal normalization to decrease trial-to-trial variation, and pairwise cross-correlation signal averaging to obtain accurate estimates from ensembles of pressure signals. A new decision fusion classification algorithm is formulated to assign the pressure signals to their respective tongue-movement classes. The complete strategy of signal detection and segmentation, estimation, and classification is tested on four tongue movements of eight subjects. Extensive experiments demonstrate that the ear pressure signals due to the tongue movements are distinct and that the four pressure signals can be classified, using the decision fusion classification algorithm, with an accuracy of more than 97% averaged across the eight subjects. It is therefore concluded that, through the concept introduced in this paper, human–computer interfaces based on tongue movements can be designed for hands-free communication and control applications.
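The paper itself provides no code, but the front end of the processing chain described in the abstract (energy-based segmentation, amplitude normalization, and pairwise cross-correlation averaging) can be sketched as below. All thresholds, window sizes, and function names are illustrative assumptions for a generic implementation, not the authors' actual algorithm or parameters.

```python
import numpy as np

def segment_by_energy(x, frame=256, hop=128, thresh_ratio=0.1):
    """Energy-based detection/segmentation: keep the span of frames whose
    short-time energy exceeds a fraction of the peak frame energy.
    (Assumed criterion; the paper's exact detector may differ.)"""
    energies = np.array([np.sum(x[i:i + frame] ** 2)
                         for i in range(0, len(x) - frame + 1, hop)])
    active = np.flatnonzero(energies > thresh_ratio * energies.max())
    start, stop = active[0] * hop, active[-1] * hop + frame
    return x[start:stop]

def normalize(x):
    """Amplitude normalization to reduce trial-to-trial variation."""
    return x / np.max(np.abs(x))

def fix_length(x, length):
    """Zero-pad or truncate a segment to a common length."""
    out = np.zeros(length)
    n = min(len(x), length)
    out[:n] = x[:n]
    return out

def crosscorr_average(trials, length=2048):
    """Cross-correlation averaging: align each trial to a reference by
    the lag that maximizes their cross-correlation, then average the
    aligned ensemble to estimate the class template."""
    ref = fix_length(trials[0], length)
    aligned = [ref]
    for t in trials[1:]:
        t = fix_length(t, length)
        # argmax of the full cross-correlation gives the shift of t
        # relative to ref; roll t back by that lag to align them.
        lag = np.argmax(np.correlate(t, ref, mode="full")) - (length - 1)
        aligned.append(np.roll(t, -lag))
    return np.mean(aligned, axis=0)
```

A template estimated this way for each tongue-movement class could then feed a classifier; the paper's decision fusion stage, which combines multiple such classification decisions, is specific to the authors and is not reproduced here.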

Full text not available from this repository.

More information

Published date: July 2007


Local EPrints ID: 46371
ISSN: 1083-4427
PURE UUID: c1ec4267-e6c1-4b6e-9d70-422a771f2551

Catalogue record

Date deposited: 25 Jun 2007
Last modified: 12 Sep 2017 16:32




Author: R. Vaidyanathan
Author: B. Chung
Author: L. Gupta
Author: H. Kook
Author: S. Kota
Author: J.D. West


