University of Southampton Institutional Repository

Acoustic SLAM


Evers, Christine and Naylor, Patrick (2018) Acoustic SLAM. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 26 (9), 1484 - 1498. (doi:10.1109/TASLP.2018.2828321).

Record type: Article

Abstract

An algorithm is presented that enables devices equipped with microphones, such as robots, to move within their environment in order to explore, adapt to, and interact with sound sources of interest. Acoustic scene mapping creates a three-dimensional (3D) representation of the positions of sound sources across time and space. In practice, positional information about a source is provided only by Direction-of-Arrival (DoA) estimates of the source direction; the source-sensor range is typically difficult to obtain. DoA estimates are also adversely affected by reverberation, noise, and interference, leading to errors in source localization and to false DoA estimates. Moreover, many acoustic sources, such as human talkers, are not continuously active, so periods of inactivity lead to missing DoA estimates. Furthermore, DoA estimates are specified relative to the observer's sensor location and orientation, so accurate positional information about the observer is crucial. This paper proposes Acoustic Simultaneous Localization and Mapping (aSLAM), which uses acoustic signals to map the 3D positions of multiple sound sources while simultaneously and passively localizing the observer within the scene map. The performance of aSLAM is analyzed and evaluated using a series of realistic simulations. Results are presented to show the impact of observer motion and sound source localization accuracy.
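The abstract's central geometric point — that a DoA observation constrains a source only to a ray, and that the ray depends on the observer's pose — can be illustrated with a minimal sketch. This is not the paper's aSLAM filter; the function names, the (azimuth, elevation) parameterization, and the yaw-only rotation are illustrative assumptions.

```python
import math

def doa_to_world_bearing(observer_yaw, azimuth, elevation):
    """Convert a DoA estimate, given as (azimuth, elevation) radians in
    the observer's local frame, into a unit bearing vector in world
    coordinates. Convention assumed here: x forward, y left, z up, with
    the observer rotated about the z-axis by observer_yaw."""
    # Local-frame unit direction implied by the DoA angles.
    lx = math.cos(elevation) * math.cos(azimuth)
    ly = math.cos(elevation) * math.sin(azimuth)
    lz = math.sin(elevation)
    # Rotate by the observer's yaw: the same DoA yields a different
    # world-frame ray for a different observer orientation, which is
    # why accurate observer pose information is crucial.
    wx = math.cos(observer_yaw) * lx - math.sin(observer_yaw) * ly
    wy = math.sin(observer_yaw) * lx + math.cos(observer_yaw) * ly
    return (wx, wy, lz)

def point_on_ray(observer_pos, bearing, r):
    """A hypothesised source position at unobserved range r along the
    bearing ray from the observer; range must be inferred over time."""
    return tuple(p + r * b for p, b in zip(observer_pos, bearing))
```

A bearing-only observation model like this is what makes the mapping problem hard: without range, a single DoA cannot fix the source position, so estimates must be fused across observer motion.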

Text: 08340823 - Version of Record, available under License Creative Commons Attribution (5 MB).

More information

Accepted/In Press date: 3 April 2018
e-pub ahead of print date: 18 April 2018
Published date: September 2018

Identifiers

Local EPrints ID: 437941
URI: http://eprints.soton.ac.uk/id/eprint/437941
ISSN: 2329-9304
PURE UUID: 57e7ea1d-f90a-4faf-90ec-ca8158cfa939
ORCID for Christine Evers: orcid.org/0000-0003-0757-5504

Catalogue record

Date deposited: 24 Feb 2020 17:31
Last modified: 07 Oct 2020 02:27


Contributors

Author: Christine Evers
Author: Patrick Naylor




Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

This repository has been built using EPrints software, developed at the University of Southampton, but available to everyone to use.
