An efficient and scalable collection of fly-inspired voting units for visual place recognition in changing environments
Arcanjo, Bruno, Ferrarini, Bruno, Milford, Michael, McDonald-Maier, Klaus D. and Ehsan, Shoaib
(2022)
An efficient and scalable collection of fly-inspired voting units for visual place recognition in changing environments.
IEEE Robotics and Automation Letters, 7 (2), 2527-2534.
(doi:10.1109/LRA.2022.3140827).
Abstract
State-of-the-art visual place recognition performance is currently achieved using deep-learning-based approaches. Despite recent efforts to design lightweight convolutional neural network models, these can still be too expensive for the most hardware-restricted robot applications. Low-overhead visual place recognition techniques would not only enable platforms equipped with low-end, cheap hardware but also reduce computation on more powerful systems, freeing these resources for other navigation tasks. In this work, our goal is to provide an algorithm of extreme compactness and efficiency while achieving state-of-the-art robustness to appearance changes and small point-of-view variations. Our first contribution is DrosoNet, an exceptionally compact model inspired by the odor processing abilities of the fruit fly, Drosophila melanogaster. Our second and main contribution is a voting mechanism that leverages multiple small and efficient classifiers to achieve more robust and consistent visual place recognition than a single one. We use DrosoNet as the baseline classifier for the voting mechanism and evaluate our models on five benchmark datasets, assessing moderate to extreme appearance changes and small to moderate viewpoint variations. We then compare the proposed algorithms to state-of-the-art methods in terms of both area under the precision-recall curve and computational efficiency.
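The abstract's core idea — several small, independently initialised classifiers each scoring a query image against known places, with their votes aggregated — can be sketched as below. This is an illustrative assumption, not the paper's actual method: the `TinyClassifier` stand-in (a random projection plus softmax) replaces the real DrosoNet architecture, and the majority-vote aggregation is a generic choice.

```python
import numpy as np

class TinyClassifier:
    """Hypothetical stand-in for one compact voting unit (NOT DrosoNet itself)."""

    def __init__(self, n_features, n_places, seed):
        # Each unit gets its own random weights, so units disagree and
        # aggregating their votes can smooth out individual mistakes.
        r = np.random.default_rng(seed)
        self.w = r.normal(size=(n_features, n_places))

    def scores(self, x):
        # Softmax over the known places for one query descriptor.
        logits = x @ self.w
        e = np.exp(logits - logits.max())
        return e / e.sum()

def vote(classifiers, x):
    """Each unit casts one vote for its top-scoring place; majority wins."""
    votes = np.zeros(classifiers[0].w.shape[1])
    for c in classifiers:
        votes[np.argmax(c.scores(x))] += 1
    return int(np.argmax(votes))

n_features, n_places = 64, 10
units = [TinyClassifier(n_features, n_places, seed=s) for s in range(8)]
query = np.random.default_rng(0).normal(size=n_features)
print(vote(units, query))  # index of the place chosen by majority vote
```

Because each unit is tiny, adding more of them scales cost linearly while, per the abstract's claim, improving robustness and consistency over any single classifier.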
More information
e-pub ahead of print date: 6 January 2022
Published date: 1 April 2022
Keywords:
Computational modeling, Convolutional neural networks, Feature extraction, Hardware, Navigation, Robots, Visualization
Identifiers
Local EPrints ID: 473469
URI: http://eprints.soton.ac.uk/id/eprint/473469
ISSN: 2377-3766
PURE UUID: 825bb3c0-bf32-4d10-b755-a98b34c5cb93
Catalogue record
Date deposited: 19 Jan 2023 17:34
Last modified: 17 Mar 2024 04:16
Contributors
Author:
Bruno Arcanjo
Author:
Bruno Ferrarini
Author:
Michael Milford
Author:
Klaus D. McDonald-Maier
Author:
Shoaib Ehsan