University of Southampton Institutional Repository

Towards observation condition agnostic fauna detection and segmentation in seafloor imagery for biomass estimation

Walker, Jennifer
267d3703-c27a-469e-9eb5-1480e94ea13a
Prugel-Bennett, Adam
b107a151-1751-4d8b-b8db-2c395ac4e14e
Thornton, Blair
8293beb5-c083-47e3-b5f0-d9c3cee14be9

Walker, Jennifer, Prugel-Bennett, Adam and Thornton, Blair (2021) Towards observation condition agnostic fauna detection and segmentation in seafloor imagery for biomass estimation. In OCEANS 2021: San Diego - Porto. vol. 2021-September, IEEE. 8 pp. (doi:10.23919/OCEANS44145.2021.9705692).

Record type: Conference or Workshop Item (Paper)

Abstract

The performance of automated object detection and segmentation in marine imaging applications is sensitive to hardware and environmental factors that result in large variability in the appearance of subjects in images. This paper investigates physics-based scale normalisation, lens distortion normalisation, and data augmentation techniques to overcome this, working towards a condition-agnostic object detection system. A total of over 700 rockfish in images taken from different altitudes using different camera-equipped Autonomous Underwater Vehicles at the Southern Hydrates Ridge (depth 780 m) are used to train and test object detection and segmentation using Mask R-CNN. Images taken from low altitudes of 2 m achieve a maximum mean average precision (mAP) score of 97.42%, and images taken from high altitudes of ~6 m achieve a maximum score of 87.4% when object detection and segmentation is trained and tested on images taken from the same altitudes. When transferring knowledge across different imaging conditions, a mAP score of 87.7% is achieved when transferring from high- to low-altitude datasets, and 49.6% when transferring from low to high altitudes. In both cases, significant gains in performance are seen when the images used are scale normalised. The results indicate that increasing the pixel resolution, or the size at which an object appears within the image, benefits learning regardless of the optical resolution at which images are taken, and this should be carefully considered in future object detection and segmentation studies. We also describe a novel method to estimate biomass distribution from the segments output by modern machine learning algorithms that can be easily adapted for different morphospecies.
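The physics-based scale normalisation mentioned in the abstract can be sketched as resampling imagery so that every pixel covers the same seafloor footprint regardless of capture altitude. This is a minimal illustration under a simple pinhole-camera assumption; the focal length and pixel pitch below are hypothetical values, not the parameters of the AUV cameras used in the paper.

```python
# Illustrative sketch of altitude-based scale normalisation. The idea: compute
# the ground sample distance (GSD, mm of seafloor per pixel) from altitude and
# camera geometry, then upscale high-altitude frames to match the GSD of
# low-altitude frames before training or inference.

def ground_sample_distance_mm(altitude_m, focal_length_mm, pixel_pitch_um):
    """Approximate seafloor footprint of one pixel in mm (pinhole model)."""
    return (altitude_m * 1000.0) * (pixel_pitch_um / 1000.0) / focal_length_mm

def normalisation_factor(gsd_mm, target_gsd_mm):
    """Factor by which to upscale an image to reach the target GSD."""
    return gsd_mm / target_gsd_mm

# With the same (hypothetical) optics, a ~6 m frame covers 3x the ground
# distance per pixel of a 2 m frame, so it must be upscaled 3x to match.
gsd_low = ground_sample_distance_mm(2.0, 12.0, 3.45)   # mm per pixel at 2 m
gsd_high = ground_sample_distance_mm(6.0, 12.0, 3.45)  # mm per pixel at ~6 m
factor = normalisation_factor(gsd_high, gsd_low)
```

The same factor would then drive an image resize (e.g. bilinear upsampling) so that detections learned at one altitude transfer to the other.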
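The abstract's closing point, estimating biomass distribution from segmentation outputs, can be illustrated with a generic allometric length-weight relation applied to each segment's real-world area. The coefficients and the square-root-of-area length proxy below are placeholders for illustration only, not the paper's method or rockfish-specific values.

```python
# Illustrative biomass estimate from a segmentation mask: convert the mask's
# pixel count to a real-world area via the ground sample distance (GSD),
# derive a crude body-length proxy, and apply a generic allometric relation
# W = a * L**b. Both a and b are species-specific; the defaults here are
# hypothetical placeholders.

def mask_biomass_g(pixel_count, gsd_mm, a=0.01, b=3.0):
    """Rough wet-weight estimate (g) for one segmented individual."""
    area_cm2 = pixel_count * (gsd_mm / 10.0) ** 2  # per-pixel footprint in cm^2
    length_cm = area_cm2 ** 0.5                    # sqrt-area as a length proxy
    return a * length_cm ** b

# Summing per-segment estimates over a survey yields a biomass distribution.
weights = [mask_biomass_g(n, 1.0) for n in (10_000, 22_500)]  # two example masks
```

Swapping in a fitted ellipse's major axis for the length proxy, or per-morphospecies (a, b) coefficients, adapts the same pipeline to other taxa.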

Text
Walker_2021_Oceans - Accepted Manuscript
Restricted to Repository staff only

More information

Submitted date: August 2021
Published date: 20 September 2021
Additional Information: Publisher Copyright: © 2021 MTS.
Venue - Dates: OCEANS 2021: San Diego - Porto, San Diego, United States, 2021-09-20 - 2021-09-23
Keywords: Autonomous Under-water Vehicles, Machine Learning, Mask RCNN, Object Detection and Segmentation, Seafloor Images

Identifiers

Local EPrints ID: 450897
URI: http://eprints.soton.ac.uk/id/eprint/450897
ISSN: 0197-7385
PURE UUID: 70f67a28-efe5-4d60-b6c0-d3d5928eac4d
ORCID for Jennifer Walker: orcid.org/0000-0002-1449-9012

Catalogue record

Date deposited: 19 Aug 2021 16:30
Last modified: 17 Mar 2024 03:47

Contributors

Author: Jennifer Walker
Author: Adam Prugel-Bennett
Author: Blair Thornton


