University of Southampton Institutional Repository

VPR-Bench: an open-source visual place recognition evaluation framework with quantifiable viewpoint and appearance change


Zaffar, Mubariz, Garg, Sourav, Milford, Michael, Kooij, Julian, Flynn, David, McDonald-Maier, Klaus and Ehsan, Shoaib (2021) VPR-Bench: an open-source visual place recognition evaluation framework with quantifiable viewpoint and appearance change. International Journal of Computer Vision, 129 (7), 2136-2174. (doi:10.1007/s11263-021-01469-5).

Record type: Article

Abstract

Visual place recognition (VPR) is the process of recognising a previously visited place using visual information, often under varying appearance conditions and viewpoint changes and with computational constraints. VPR is related to the concepts of localisation, loop closure and image retrieval, and is a critical component of many autonomous navigation systems ranging from autonomous vehicles to drones and computer vision systems. While the concept of place recognition has been around for many years, VPR research has grown rapidly as a field over the past decade due to improving camera hardware and the potential of deep learning-based techniques, and has become a widely studied topic in both the computer vision and robotics communities. This growth, however, has led to fragmentation and a lack of standardisation in the field, especially concerning performance evaluation. Moreover, the notion of viewpoint and illumination invariance of VPR techniques has largely been assessed qualitatively, and hence ambiguously, in the past. In this paper, we address these gaps through a new comprehensive open-source framework for assessing the performance of VPR techniques, dubbed “VPR-Bench”. VPR-Bench (open-sourced at https://github.com/MubarizZaffar/VPR-Bench) introduces two much-needed capabilities for VPR researchers: firstly, it contains a benchmark of 12 fully-integrated datasets and 10 VPR techniques, and secondly, it integrates a comprehensive variation-quantified dataset for quantifying viewpoint and illumination invariance. We apply and analyse popular evaluation metrics for VPR from both the computer vision and robotics communities, and discuss how these different metrics complement and/or replace each other, depending upon the underlying applications and system requirements. Our analysis reveals that no universal state-of-the-art (SOTA) VPR technique exists, since: (a) SOTA performance is achieved by 8 of the 10 techniques on at least one dataset, and (b) a SOTA technique in one community does not necessarily yield SOTA performance in the other, given the differences in datasets and metrics. Furthermore, we identify key open challenges, since: (c) all 10 techniques suffer greatly in perceptually-aliased and less-structured environments, (d) all techniques suffer from viewpoint variance, where lateral change has less effect than 3D change, and (e) directional illumination change has a more adverse effect on matching confidence than uniform illumination change. We also present detailed meta-analyses regarding the roles of varying ground-truths, platforms, application requirements and technique parameters. Finally, VPR-Bench provides a unified implementation to deploy these VPR techniques, metrics and datasets, and is extensible through templates.
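For readers less familiar with the evaluation metrics the abstract contrasts, the following is a minimal, self-contained sketch of a threshold-swept precision-recall evaluation of the kind discussed above. It is illustrative only and does not reproduce VPR-Bench's actual API: the function names, the toy scores and the reduction of each query to a single best-match confidence plus a ground-truth correctness flag are all assumptions made for this example.

import numpy as np

# Illustrative sketch only; not the VPR-Bench API. Each of the N queries is
# assumed to be reduced to the confidence score of its best-matching
# reference image and a boolean saying whether that match is correct.

def precision_recall_curve(scores, correct):
    """Precision and recall at every confidence threshold.

    scores  : (N,) matching confidences, higher means more confident.
    correct : (N,) booleans, True if the retrieved match is a true positive.
    """
    order = np.argsort(-np.asarray(scores))   # sort queries by descending confidence
    correct = np.asarray(correct)[order]
    tp = np.cumsum(correct)                   # true positives accepted so far
    accepted = np.arange(1, len(correct) + 1) # matches accepted at each threshold
    precision = tp / accepted
    recall = tp / max(correct.sum(), 1)       # fraction of true matches recovered
    return precision, recall

def recall_at_100_precision(precision, recall):
    """Largest recall achievable with zero false positives."""
    perfect = precision == 1.0
    return float(recall[perfect].max()) if perfect.any() else 0.0

# Toy example: six queries with made-up confidences and correctness flags.
scores = np.array([0.95, 0.90, 0.80, 0.70, 0.60, 0.50])
correct = np.array([True, True, False, True, False, True])
precision, recall = precision_recall_curve(scores, correct)
print("AUC-PR (trapezoidal):", np.trapz(precision, recall))
print("Recall at 100% precision:", recall_at_100_precision(precision, recall))

Sweeping the confidence threshold in this way yields the precision-recall curves favoured in the robotics community, whereas ranking-based metrics such as Recall@N, common in computer vision retrieval work, instead ask whether a correct match appears among the top N retrieved references.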

Text
s11263-021-01469-5 - Version of Record (11MB)
Available under a Creative Commons Attribution License.

More information

Accepted/In Press date: 7 April 2021
e-pub ahead of print date: 7 May 2021
Published date: July 2021
Keywords: Visual place recognition, SLAM, Autonomous robotics, Robotic vision

Identifiers

Local EPrints ID: 473505
URI: http://eprints.soton.ac.uk/id/eprint/473505
ISSN: 0920-5691
PURE UUID: 85faef51-6e78-4301-a3d3-98ce525f3750
ORCID for Shoaib Ehsan: orcid.org/0000-0001-9631-1898

Catalogue record

Date deposited: 20 Jan 2023 18:02
Last modified: 17 Mar 2024 04:16

Contributors

Author: Mubariz Zaffar
Author: Sourav Garg
Author: Michael Milford
Author: Julian Kooij
Author: David Flynn
Author: Klaus McDonald-Maier
Author: Shoaib Ehsan
