University of Southampton Institutional Repository

Learning Features from georeferenced seafloor imagery with location guided autoencoders

Yamada, Takaki
81c66c35-0e2b-4342-80fa-cbee6ff9ce5f
Prugel-Bennett, Adam
b107a151-1751-4d8b-b8db-2c395ac4e14e
Thornton, Blair
8293beb5-c083-47e3-b5f0-d9c3cee14be9

Yamada, Takaki, Prugel-Bennett, Adam and Thornton, Blair (2021) Learning Features from georeferenced seafloor imagery with location guided autoencoders. Journal of Field Robotics, 38 (1), 52-67. (doi:10.1002/rob.21961).

Record type: Article

Abstract

Although modern machine learning has the potential to greatly speed up the interpretation of imagery, the varied nature of the seabed and limited availability of expert annotations form barriers to its widespread use in seafloor mapping applications. This motivates research into unsupervised methods that function without large databases of human annotations. This paper develops an unsupervised feature learning method for georeferenced seafloor visual imagery that considers patterns both within the footprint of a single image frame and broader scale spatial characteristics. Features within images are learnt using an autoencoder developed based on the AlexNet deep convolutional neural network. Features larger than each image frame are learnt using a novel loss function that regularises autoencoder training using the Kullback–Leibler divergence function to loosely assume that images captured within a close distance of each other look more similar than those that are far away. The method is used to semantically interpret images taken by an autonomous underwater vehicle at the Southern Hydrates Ridge, an active gas hydrate field and site of a seafloor cabled observatory at a depth of 780 m. The method's performance when applied to clustering and content‐based image retrieval is assessed against a ground truth consisting of more than 18,000 human annotations. The study shows that the location-based loss function increases the rate of information retrieval by a factor of two for seafloor mapping applications. The effects of physics‐based colour correction and image rescaling are also investigated, showing that the improved consistency of spatial information achieved by rescaling is beneficial for recognising artificial objects such as cables and infrastructure, but is less effective for natural objects that have greater dimensional variability.
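The abstract describes regularising autoencoder training with a Kullback–Leibler divergence term built on the loose assumption that images captured close together look more similar than images captured far apart. A minimal NumPy sketch of one way such a location-guided regulariser could be formed is shown below; the function name, temperature parameters, and the softmax-over-negative-squared-distance construction are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax_rows(x):
    # Row-wise softmax with max-subtraction for numerical stability.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def location_guided_kl(latents, positions, tau_z=1.0, tau_x=1.0):
    """Illustrative regulariser: KL divergence between a geographic
    similarity distribution (target) and a latent-feature similarity
    distribution, penalising latent codes that disagree with the
    'nearby images look similar' prior."""
    # Negative squared pairwise distances serve as similarity logits.
    dz = -np.square(latents[:, None, :] - latents[None, :, :]).sum(-1) / tau_z
    dx = -np.square(positions[:, None, :] - positions[None, :, :]).sum(-1) / tau_x
    n = len(latents)
    mask = ~np.eye(n, dtype=bool)                   # exclude self-similarity
    p = softmax_rows(np.where(mask, dx, -np.inf))   # geographic target
    q = softmax_rows(np.where(mask, dz, -np.inf))   # latent distribution
    eps = 1e-12                                     # avoid log(0)
    return float((p * (np.log(p + eps) - np.log(q + eps)))[mask].sum() / n)
```

In a real training loop this term would be computed on mini-batches and added, with a weighting factor, to the autoencoder's reconstruction loss; it reaches zero only when the latent similarity structure matches the geographic one.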

Text
Learning Features from Georeferenced Seafloor Imagery with Location Guided Autoencoders - Accepted Manuscript
Available under License Creative Commons Attribution.
Download (13MB)
Text
rob.21961 - Version of Record
Available under License Creative Commons Attribution.
Download (15MB)

More information

Accepted/In Press date: 9 May 2020
e-pub ahead of print date: 28 May 2020
Published date: January 2021
Keywords: autoencoder, computer vision, mapping, underwater robotics, unsupervised learning

Identifiers

Local EPrints ID: 441375
URI: http://eprints.soton.ac.uk/id/eprint/441375
ISSN: 1556-4959
PURE UUID: 95889c12-078b-45ec-ad0e-cce77fc8a820

Catalogue record

Date deposited: 10 Jun 2020 16:32
Last modified: 16 Mar 2021 17:40

