University of Southampton Institutional Repository

Towards observation condition agnostic fauna detection and segmentation in seafloor imagery for biomass estimation


Walker, Jennifer, Prugel-Bennett, Adam and Thornton, Blair (2021) Towards observation condition agnostic fauna detection and segmentation in seafloor imagery for biomass estimation. OCEANS 2021, San Diego, San Diego, United States. 20 - 23 Sep 2021. (Submitted)

Record type: Conference or Workshop Item (Paper)

Abstract

The performance of automated object detection and segmentation in marine imaging applications is sensitive to hardware and environmental factors that result in large variability in the appearance of subjects in images. This paper investigates physics-based scale normalisation, lens distortion normalisation, and data augmentation techniques to overcome this, working towards a condition-agnostic object detection system. A total of over 700 rockfish in images taken from different altitudes using different camera-equipped Autonomous Underwater Vehicles at the Southern Hydrates Ridge (depth 780 m) are used to train and test object detection and segmentation using Mask R-CNN. Images taken from low altitudes of 2 m achieve a maximum mean average precision (mAP) score of 97.42%, and images taken from high altitudes of ~6 m achieve a maximum score of 87.4%, when object detection and segmentation is trained and tested on images taken from the same altitudes. When transferring knowledge across different imaging conditions, a mAP score of 87.7% is achieved when transferring from high- to low-altitude datasets, and 49.6% when transferring from low to high altitudes. In both cases, significant gains in performance are seen when the images used are scale-normalised. The results indicate that increasing the pixel resolution, or the size at which an object appears within the image, benefits learning regardless of the optical resolution at which images are taken, and this should be carefully considered in future object detection and segmentation studies. We also describe a novel method to estimate biomass distribution from the segments output by modern machine learning algorithms that can be easily adapted for different morphospecies.
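As an illustration of the kind of physics-based scale normalisation the abstract describes, the sketch below resamples imagery so that every pixel spans the same ground distance regardless of capture altitude, and converts a segment's pixel count into a physical area. It assumes a simple pinhole camera model and uses nearest-neighbour resampling to stay dependency-free; all function and parameter names here are hypothetical, and the paper's actual pipeline (which also normalises lens distortion) may differ.

```python
import numpy as np

def ground_sample_distance(altitude_m, focal_length_mm, pixel_pitch_um):
    """Metres of seafloor spanned by one pixel, under a pinhole camera model."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

def scale_normalise(image, altitude_m, focal_length_mm, pixel_pitch_um, target_gsd_m):
    """Resample an (H, W[, C]) array so each pixel spans target_gsd_m metres.

    Nearest-neighbour index maps keep the sketch dependency-free; a real
    pipeline would use a proper interpolating resize.
    """
    gsd = ground_sample_distance(altitude_m, focal_length_mm, pixel_pitch_um)
    factor = gsd / target_gsd_m  # > 1 upsamples imagery taken from higher altitude
    h, w = image.shape[:2]
    new_h, new_w = max(1, round(h * factor)), max(1, round(w * factor))
    rows = np.clip((np.arange(new_h) / factor).astype(int), 0, h - 1)
    cols = np.clip((np.arange(new_w) / factor).astype(int), 0, w - 1)
    return image[np.ix_(rows, cols)]

def segment_area_m2(mask, gsd_m):
    """Physical area of a binary segmentation mask in scale-normalised imagery."""
    return int(np.count_nonzero(mask)) * gsd_m ** 2
```

For example, with a 12 mm lens and 3.45 µm pixels, a frame captured at 6 m altitude is upsampled by a factor of three when normalised to the ground sample distance of a 2 m frame, so objects appear at a consistent pixel size across both datasets. A per-species allometric relation could then map segment dimensions to mass, but the paper's specific biomass method is not reproduced here.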

Text: Walker_2021_Oceans - Accepted Manuscript (restricted to Repository staff only; a copy can be requested)

More information

Submitted date: August 2021
Venue - Dates: OCEANS 2021, San Diego, San Diego, United States, 2021-09-20 - 2021-09-23
Keywords: Seafloor Images, Machine Learning, Object Detection and Segmentation, Mask RCNN, Autonomous Underwater Vehicles

Identifiers

Local EPrints ID: 450897
URI: http://eprints.soton.ac.uk/id/eprint/450897
PURE UUID: 70f67a28-efe5-4d60-b6c0-d3d5928eac4d

Catalogue record

Date deposited: 19 Aug 2021 16:30
Last modified: 25 Aug 2021 16:31

Contributors

Author: Jennifer Walker
Author: Adam Prugel-Bennett
Author: Blair Thornton
