University of Southampton Institutional Repository

Guiding labelling effort for efficient learning with georeferenced images




Yamada, Takaki, Massot Campos, Miguel, Prugel-Bennett, Adam, Pizarro, Oscar, Williams, Stefan B. and Thornton, Blair (2022) Guiding labelling effort for efficient learning with georeferenced images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45 (1), 593-607. (doi:10.1109/TPAMI.2021.3140060).

Record type: Article

Abstract

We describe a novel semi-supervised learning method that reduces the labelling effort needed to train convolutional neural networks (CNNs) for georeferenced imagery. This allows deep-learning CNNs to be trained on a per-dataset basis, which is useful in domains with limited learning transferability across datasets. The method identifies representative subsets of images from an unlabelled dataset based on the latent representation of a location-guided autoencoder. We assess the method's sensitivity to design options using four ground-truthed datasets of georeferenced environmental monitoring images, covering various scenes in aerial and seafloor imagery. Efficiency gains are achieved for all the aerial and seafloor image datasets analysed in our experiments, demonstrating the benefit of the method across application domains. Compared to CNNs of the same architecture trained using conventional transfer and active learning, the method achieves equivalent accuracy with an order of magnitude fewer annotations, and reaches 85% of the accuracy of CNNs trained conventionally on approximately 10,000 human annotations using just 40 prioritised annotations. The biggest efficiency gains are seen in datasets with unbalanced class distributions and rare classes with relatively few observations.
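As a rough illustration of the pipeline the abstract outlines, the sketch below trains an autoencoder whose latent space is regularised by geographic position, then clusters the latents to pick a representative subset of images for human annotation. This is a minimal sketch, not the authors' implementation: the contrastive-style location loss, the k-means selection step, the network architecture, and all names and hyperparameters (ConvAutoencoder, location_guided_loss, select_for_labelling, margin, alpha) are illustrative assumptions.

```python
# A minimal sketch of a "location-guided autoencoder" plus representative-
# subset selection. All design choices here are assumptions, not the paper's.

import torch
import torch.nn as nn
from sklearn.cluster import KMeans


class ConvAutoencoder(nn.Module):
    """Small convolutional autoencoder for 64x64 RGB image patches."""

    def __init__(self, latent_dim: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # -> 16x16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # -> 8x8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z


def location_guided_loss(x, x_hat, z, positions, margin=1.0, alpha=0.1):
    """Reconstruction loss plus an assumed location term: images whose
    geographic positions are close are pulled together in latent space,
    while distant pairs are pushed at least `margin` apart."""
    recon = nn.functional.mse_loss(x_hat, x)
    z_dist = torch.cdist(z, z)                    # pairwise latent distances
    g_dist = torch.cdist(positions, positions)    # pairwise geographic distances
    near = (g_dist < g_dist.median()).float()     # crude "nearby" mask
    loc = (near * z_dist.pow(2)
           + (1 - near) * torch.clamp(margin - z_dist, min=0).pow(2)).mean()
    return recon + alpha * loc


def select_for_labelling(latents, n_labels=40):
    """Cluster the latent vectors and return the index of the image nearest
    each cluster centre: a representative subset to annotate first."""
    km = KMeans(n_clusters=n_labels, n_init=10).fit(latents)
    chosen = []
    for c in range(n_labels):
        members = (km.labels_ == c).nonzero()[0]
        d = ((latents[members] - km.cluster_centers_[c]) ** 2).sum(axis=1)
        chosen.append(int(members[d.argmin()]))
    return chosen
```

In use, the latents would come from encoding every unlabelled image in a survey; the default of 40 selected images simply mirrors the "40 prioritised annotations" figure quoted in the abstract.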

Text: Yamada_2021_PAMI - Accepted Manuscript. Download (2MB)
Text: Guiding_Labelling_Effort_for_Efficient_Learning_With_Georeferenced_Images - Version of Record. Available under License Creative Commons Attribution. Download (17MB)
Text: Yamada_2021_PAMI_supplementary (1) - Other. Download (2MB)

More information

Accepted/In Press date: 2022
Published date: 4 January 2022
Additional Information: Funding Information: This work was supported in part by U.K. Natural Environment Research Council's Oceanids Biocam under Grant NE/P020887/1 and in part by Australian Research Council's Automated Benthic Understanding Discovery Project under Grant DP190103914. Publisher Copyright: © 1979-2012 IEEE.
Keywords: Annotations, Deep learning, Environmental monitoring, Labeling, Satellites, Semi-supervised learning, Training, Unsupervised learning, autoencoder, convolutional neural network, georeferenced imagery, pseudo-labelling

Identifiers

Local EPrints ID: 453273
URI: http://eprints.soton.ac.uk/id/eprint/453273
ISSN: 1939-3539
PURE UUID: b14854cb-6021-4bc1-b354-639b14e9dbe8
ORCID for Takaki Yamada: orcid.org/0000-0002-5090-7239
ORCID for Miguel Massot Campos: orcid.org/0000-0002-1202-0362

Catalogue record

Date deposited: 11 Jan 2022 17:51
Last modified: 17 Mar 2024 07:02


Contributors

Author: Takaki Yamada
Author: Miguel Massot Campos
Author: Adam Prugel-Bennett
Author: Oscar Pizarro
Author: Stefan B. Williams
Author: Blair Thornton

