University of Southampton Institutional Repository

Immersive virtual reality audio rendering adapted to the listener and the room

Kim, Hansung, Remaggi, Luca, Jackson, P.J.B. and Hilton, Adrian (2020) Immersive virtual reality audio rendering adapted to the listener and the room. In, Magnor, M and Sorkin-Hornung, A (eds.) Real VR – Immersive Digital Reality. (Lecture Notes in Computer Science, 11900) Cham: Springer, pp. 293-318. (doi:10.1007/978-3-030-41816-8_13).

Record type: Book Section

Abstract

The visual and auditory modalities are the most important stimuli for humans. To maximise the sense of immersion in VR environments, plausible spatial audio reproduction synchronised with visual information is essential. However, measuring the acoustic properties of an environment with audio equipment is a complicated process. In this chapter, we introduce a simple and efficient system that estimates room acoustics for plausible spatial audio rendering, using 360° cameras for real-scene reproduction in VR. A simplified 3D semantic model of the scene is estimated from the captured images using computer vision algorithms and a convolutional neural network (CNN). Spatially synchronised audio is then reproduced from the estimated geometric and acoustic properties of the scene, and the reconstructed scenes are rendered with the synthesised spatial audio.
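The abstract links two estimated quantities: room geometry and the acoustic (absorption) properties of surfaces. As a minimal illustration of that link, not the method described in the chapter, Sabine's classic formula predicts the reverberation time RT60 of a room from its volume and the absorption of its bounding surfaces; the surface areas and absorption coefficients below are hypothetical example values.

```python
def sabine_rt60(volume_m3, surfaces):
    """Reverberation time (RT60, seconds) via Sabine's formula:
    RT60 = 0.161 * V / sum(area_i * alpha_i).

    surfaces: iterable of (area_m2, absorption_coefficient) pairs.
    """
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    if total_absorption <= 0:
        raise ValueError("total absorption must be positive")
    return 0.161 * volume_m3 / total_absorption

# Example: a 5 m x 4 m x 3 m room (volume 60 m^3) with plastered walls,
# a carpeted floor and an acoustic-tile ceiling (illustrative coefficients).
room_surfaces = [
    (2 * (5 * 3 + 4 * 3), 0.05),  # walls: 54 m^2, plaster
    (5 * 4, 0.30),                # floor: 20 m^2, carpet
    (5 * 4, 0.70),                # ceiling: 20 m^2, acoustic tile
]
print(round(sabine_rt60(60.0, room_surfaces), 2))  # → 0.43
```

A system like the one described could feed geometry from the reconstructed 3D model and absorption coefficients inferred from the semantic material labels into this kind of geometric-acoustics estimate.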

Full text not available from this repository.

More information

e-pub ahead of print date: 3 March 2020
Published date: 2020
Keywords: 3D modeling, Acoustic properties, Audio acoustics, Computer vision, Convolutional neural networks, Room acoustics, Sound reproduction, Spatial Audio, virtual reality

Identifiers

Local EPrints ID: 440611
URI: http://eprints.soton.ac.uk/id/eprint/440611
PURE UUID: 5efe9acf-30fa-4efd-a81e-c8b0a39c8221
ORCID for Hansung Kim: orcid.org/0000-0003-4907-0491

Catalogue record

Date deposited: 12 May 2020 16:35
Last modified: 18 Feb 2021 17:41


Contributors

Author: Hansung Kim
Author: Luca Remaggi
Author: P.J.B. Jackson
Author: Adrian Hilton
Editor: M Magnor
Editor: A Sorkin-Hornung


