University of Southampton Institutional Repository

Head tracker using webcam for auralization

Carvalho, Davi Rocha
b6914936-c8a9-4ea8-b79d-cab8de28fda2
Fonseca, William D'Andrea
eb989e8f-2536-4023-9292-d7f802f1f6dc
Hollebon, Jacob
fecec634-ce57-4d47-813e-519135cdf7f2
Mareze, Paulo Henrique
8b746714-2197-43df-a2e7-c69103460f02
Fazi, Filippo
e5aefc08-ab45-47c1-ad69-c3f12d07d807

Carvalho, Davi Rocha, Fonseca, William D'Andrea, Hollebon, Jacob, Mareze, Paulo Henrique and Fazi, Filippo (2021) Head tracker using webcam for auralization. Inter-Noise 2021, Washington, United States. 01 - 05 Aug 2021. 12 pp. (In Press)

Record type: Conference or Workshop Item (Paper)

Abstract

Binaural rendering is a technique that seeks to generate virtual auditory environments replicating the natural listening experience, including the three-dimensional perception of spatialized sound sources. Real-time knowledge of the listener's position, or more specifically of their head and ear orientation, allows movement to be transferred from the real world to the virtual space, enabling richer immersion and interaction with the virtual scene. This study presents the use of a simple laptop-integrated camera (webcam) as a head-tracking sensor, removing the need to mount any hardware on the listener's head. The software was built on top of a state-of-the-art face landmark detection model from Google's MediaPipe library for Python. The coordinate system is manipulated to translate the origin from the camera to the centre of the subject's head, from which rotation matrices and Euler angles are extracted. Low-latency communication is provided via the User Datagram Protocol (UDP), allowing the head tracker to run in parallel with, and asynchronously from, the main application. Empirical experiments demonstrated reasonable accuracy and quick response, indicating suitability for real-time applications that do not require high precision. Furthermore, cross-validation with existing hardware head trackers revealed adequate agreement on the measured head orientation, confirming its potential as a contactless head-tracking device.
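The last two steps the abstract describes, extracting Euler angles from a head rotation matrix and streaming them over UDP, can be sketched in a few lines of Python. This is a hypothetical illustration, not the authors' code: the paper does not state its Euler convention, so a Z-Y-X (yaw-pitch-roll) decomposition and a JSON payload are assumed here, and the MediaPipe camera capture is omitted so the sketch stays self-contained.

```python
import json
import math
import socket

def rotation_to_euler(R):
    """Decompose a 3x3 rotation matrix into yaw, pitch, roll in degrees,
    assuming a Z-Y-X rotation order (an assumption; the paper's exact
    convention is not specified)."""
    yaw = math.degrees(math.atan2(R[1][0], R[0][0]))    # rotation about z
    pitch = math.degrees(math.asin(-R[2][0]))           # rotation about y
    roll = math.degrees(math.atan2(R[2][1], R[2][2]))   # rotation about x
    return yaw, pitch, roll

def send_orientation(sock, addr, yaw, pitch, roll):
    """Send one orientation sample as a fire-and-forget datagram. UDP has
    no handshake or retransmission, which keeps latency low and lets the
    tracker run asynchronously from the main audio application."""
    payload = json.dumps({"yaw": yaw, "pitch": pitch, "roll": roll}).encode()
    sock.sendto(payload, addr)

# Example: a 90-degree head turn to the left (pure yaw rotation).
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
yaw, pitch, roll = rotation_to_euler(R)
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_orientation(sender, ("127.0.0.1", 9000), yaw, pitch, roll)
```

In a real pipeline, the rotation matrix would be estimated each frame from the MediaPipe face landmarks, and the receiving renderer (Python or Matlab) would simply read the latest datagram, dropping stale ones rather than waiting on them.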

Text: Paper_internoise_2021__Webcam_Headtracker (Restricted to Repository staff only)

More information

Accepted/In Press date: 18 June 2021
Venue - Dates: Inter-Noise 2021, Washington, United States, 2021-08-01 - 2021-08-05
Keywords: head tracker, webcam, Binaural, Auralization, python, Matlab, real-time

Identifiers

Local EPrints ID: 450040
URI: http://eprints.soton.ac.uk/id/eprint/450040
PURE UUID: 673519d2-47dd-4ded-9763-2a0ea8dd3fbc
ORCID for Jacob Hollebon: orcid.org/0000-0002-4119-4070

Catalogue record

Date deposited: 06 Jul 2021 16:32
Last modified: 07 Jul 2021 02:01

Contributors

Author: Davi Rocha Carvalho
Author: William D'Andrea Fonseca
Author: Jacob Hollebon
Author: Paulo Henrique Mareze
Author: Filippo Fazi

