University of Southampton Institutional Repository

MAIA-A machine learning assisted image annotation method for environmental monitoring and exploration



Zurowietz, Martin
620d1c96-3e2b-45de-a421-5df523bcbf12
Langenkämper, Daniel
101fc0f4-902e-4040-a351-5d4069ea4e78
Hosking, Brett
f0b38c0e-2ae2-4cab-8e10-e05696dd505d
Ruhl, Henry A.
177608ef-7793-4911-86cf-cd9960ff22b6
Nattkemper, Tim W.
a6f7cd11-5871-4aa9-b781-049a392de4a6

Zurowietz, Martin, Langenkämper, Daniel, Hosking, Brett, Ruhl, Henry A. and Nattkemper, Tim W. (2018) MAIA-A machine learning assisted image annotation method for environmental monitoring and exploration. PLoS ONE, 13 (11), e0207498. (doi:10.1371/journal.pone.0207498).

Record type: Article

Abstract

Digital imaging has become one of the most important techniques in environmental monitoring and exploration. In the marine environment, mobile platforms such as autonomous underwater vehicles (AUVs) are now equipped with high-resolution cameras to capture huge collections of images from the seabed. However, the timely evaluation of all these images presents a bottleneck, as tens of thousands of images or more can be collected during a single dive. This makes computational support for marine image analysis essential. Computer-aided analysis of environmental images (and marine images in particular) with machine learning algorithms is promising, but challenging and different from other imaging domains because training data and class labels cannot be collected as efficiently and comprehensively as in other areas. In this paper, we present Machine learning Assisted Image Annotation (MAIA), a new image annotation method for environmental monitoring and exploration that overcomes the obstacle of missing training data. The method uses a combination of autoencoder networks and the Mask Region-based Convolutional Neural Network (Mask R-CNN), which allows human observers to annotate large image collections much faster than before. We evaluated the method with three marine image datasets featuring different types of background, imaging equipment and object classes. Using MAIA, we were able to annotate objects of interest with an average recall of 84.1%, more than twice as fast as "traditional" annotation methods, which are purely based on software-supported direct visual inspection and manual annotation. The speed gain increases proportionally with the size of a dataset. The MAIA approach represents a substantial improvement on the path to greater efficiency in the annotation of large benthic image collections.
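
As a rough illustration of the first stage the abstract describes (autoencoder-based detection of unusual image regions when no training labels exist), the sketch below uses PCA reconstruction error as a linear stand-in for an autoencoder. All function names and the toy data are hypothetical and not taken from the paper:

```python
import numpy as np

def fit_pca(train, n_components=2):
    """Fit a linear autoencoder stand-in (PCA) on training patches."""
    X = train.reshape(len(train), -1).astype(float)
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components]

def novelty_scores(patches, mu, W):
    """Per-patch reconstruction error; a high error marks an unusual
    patch, i.e. a candidate object on an otherwise uniform seabed."""
    X = patches.reshape(len(patches), -1).astype(float)
    recon = (X - mu) @ W.T @ W + mu
    return np.sqrt(((X - recon) ** 2).mean(axis=1))

# Toy data: 50 near-uniform "seabed" patches plus one bright "object".
rng = np.random.default_rng(0)
patches = rng.normal(0.2, 0.01, size=(51, 8, 8))
patches[50] += 0.8  # the anomalous patch

mu, W = fit_pca(patches[:50])          # model of "normal" seabed texture
scores = novelty_scores(patches, mu, W)
print(int(np.argmax(scores)))          # -> 50: the object patch stands out
```

In MAIA, per the abstract, such unsupervised candidate regions are combined with a Mask R-CNN stage and human review, so observers only inspect proposed regions instead of scanning every image.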

Text
journal.pone.0207498 - Version of Record
Available under License Creative Commons Attribution.

More information

Accepted/In Press date: 31 October 2018
e-pub ahead of print date: 16 November 2018

Identifiers

Local EPrints ID: 426722
URI: http://eprints.soton.ac.uk/id/eprint/426722
ISSN: 1932-6203
PURE UUID: cb824eb4-02d8-4eac-88c7-6e87b21ae996

Catalogue record

Date deposited: 11 Dec 2018 17:30
Last modified: 15 Mar 2024 23:19


Contributors

Author: Martin Zurowietz
Author: Daniel Langenkämper
Author: Brett Hosking
Author: Henry A. Ruhl
Author: Tim W. Nattkemper

