University of Southampton Institutional Repository

Localization of moving microphone arrays from moving sound sources for robot audition


Evers, Christine
93090c84-e984-4cc3-9363-fbf3f3639c4b
Moore, Alastair H.
58d011fd-6a02-449a-9b77-651e8c86166e
Naylor, Patrick A.
13079486-664a-414c-a1a2-01a30bf0997b

Evers, Christine, Moore, Alastair H. and Naylor, Patrick A. (2016) Localization of moving microphone arrays from moving sound sources for robot audition. In 2016 24th European Signal Processing Conference, EUSIPCO 2016. vol. 2016-November, European Signal Processing Conference, EUSIPCO. pp. 1008-1012. (doi:10.1109/EUSIPCO.2016.7760400).

Record type: Conference or Workshop Item (Paper)

Abstract

Acoustic Simultaneous Localization and Mapping (a-SLAM) jointly localizes the trajectory of a microphone array installed on a moving platform, whilst estimating the acoustic map of surrounding sound sources, such as human speakers. Whilst traditional approaches for SLAM in the vision and optical research literature rely on the assumption that the surrounding map features are static, in the acoustic case the positions of talkers are usually time-varying due to head rotations and body movements. This paper demonstrates that tracking of moving sources can be incorporated in a-SLAM by modelling the acoustic map as a Random Finite Set (RFS) of multiple sources and explicitly imposing models of the source dynamics. The proposed approach is verified and its performance evaluated for realistic simulated data.
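The source dynamics mentioned in the abstract can be sketched, for illustration only (this is not the paper's implementation), as a constant-velocity prediction step applied to a weighted set of source states. The weighted-set representation, the survival probability, and all parameter values below are assumptions chosen to mimic how an RFS-style filter propagates moving sources between frames:

```python
import numpy as np

def predict_sources(sources, dt=0.1, process_noise=0.05, p_survival=0.99):
    """Predict each hypothesized source forward one time step.

    `sources` is a list of (weight, state) pairs, a simple particle-style
    stand-in for an RFS intensity. State per source: [x, y, vx, vy].
    Weights are scaled by an assumed survival probability; states follow
    a constant-velocity model with additive Gaussian noise.
    """
    # Constant-velocity transition matrix: position += velocity * dt.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    rng = np.random.default_rng(0)  # fixed seed so the sketch is reproducible
    predicted = []
    for weight, state in sources:
        new_state = F @ state + process_noise * rng.standard_normal(4)
        predicted.append((p_survival * weight, new_state))
    return predicted

# One hypothesized talker at (2, 1) m, moving at 0.5 m/s along x.
sources = [(1.0, np.array([2.0, 1.0, 0.5, 0.0]))]
predicted = predict_sources(sources, dt=1.0)
```

In a full filter this prediction would be followed by an update against direction-of-arrival measurements from the microphone array, jointly with the array-pose estimate; that step is omitted here.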

Full text not available from this repository.

More information

Published date: 28 November 2016
Venue - Dates: 24th European Signal Processing Conference, EUSIPCO 2016, Budapest, Hungary, 2016-08-27 - 2016-09-01

Identifiers

Local EPrints ID: 444979
URI: http://eprints.soton.ac.uk/id/eprint/444979
ISSN: 2219-5491
PURE UUID: 6a49ea50-2287-432b-97bb-977899de6d86
ORCID for Christine Evers: orcid.org/0000-0003-0757-5504

Catalogue record

Date deposited: 13 Nov 2020 17:34
Last modified: 18 Feb 2021 17:41


