University of Southampton Institutional Repository

Current and future trends in marine image annotation software

Gomes-Pereira, Jose Nuno, Auger, Vincent, Beisiegel, Kolja, Benjamin, Robert, Bergmann, Melanie, Bowden, David, Buhl-Mortensen, Pal, De Leo, Fabio C., Dionísio, Gisela, Durden, Jennifer, Edwards, Luke, Friedman, Ariell, Greinert, Jens, Jacobsen-Stout, Nancy, Lerner, Steve, Leslie, Murray, Nattkemper, Tim W., Sameoto, Jessica A., Schoening, Timm, Schouten, Ronald, Seager, James, Singh, Hanumant, Soubigou, Olivier, Tojeira, Inês, van den Beld, Inge, Dias, Frederico, Tempera, Fernando and Santos, Ricardo S. (2016) Current and future trends in marine image annotation software. Progress in Oceanography, 149, 106-120. (doi:10.1016/j.pocean.2016.07.005).

Record type: Article

Abstract

Given the need to describe, analyze and index large quantities of marine imagery data for exploration and monitoring activities, a range of specialized image annotation tools has been developed worldwide. Image annotation, the process of transposing objects or events represented in a video or still image to the semantic level, may involve human interaction and computer-assisted solutions. Marine image annotation software (MIAS) has enabled over 500 publications to date. We review the functioning, application trends and developments of MIAS by comparing the general and advanced features of 23 tools used in underwater image analysis. MIAS requiring human input consist essentially of a graphical user interface with a video player or image browser that recognizes a specific time code or image code, allowing events to be logged in a time-stamped (and/or geo-referenced) manner. MIAS differ from similar software in their capability to integrate data associated with video collection, the simplest being the position coordinates of the video recording platform. MIAS offer three main types of functionality: annotating events in real time, annotating after acquisition, and interacting with a database. The tools range from simple annotation interfaces to full onboard data management systems with a variety of toolboxes. Advanced packages allow data from multiple sensors or multiple annotators to be input and displayed via an intranet or the internet. Software for posterior, human-mediated annotation often includes tools for data display and image analysis (e.g. length, area, image segmentation, point counts), and in a few cases offers the possibility of browsing and editing previous dive logs or analyzing the annotations. Interaction with a database allows the automatic integration of annotations from different surveys, repeated and collaborative annotation of shared datasets, and the browsing and querying of data. Progress in automated annotation has mostly been in post-processing, for stable platforms or still images. Integration into available MIAS is currently limited to semi-automated pixel-recognition processes based on computer-vision modules that compile expert knowledge. Important topics that aid the choice of a specific software package are outlined, the ideal software is discussed, and future trends are presented.
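The time-stamped, geo-referenced event logging that the abstract describes can be sketched minimally as follows. This is an illustrative data structure only, with hypothetical field names; it is not drawn from any specific MIAS package reviewed in the article.

```python
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Annotation:
    """One logged event, keyed to a video time code and platform position."""
    time_code: str                   # e.g. "HH:MM:SS:FF" from the video player
    label: str                       # semantic label for the object or event
    latitude: float                  # position of the video recording platform
    longitude: float
    depth_m: Optional[float] = None  # optional datum from an additional sensor
    logged_at: str = field(          # wall-clock timestamp of the annotation
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# A dive log is simply an ordered list of such events.
dive_log = [
    Annotation("00:12:03:00", "cold-water coral colony", 38.52, -28.63, 750.0),
    Annotation("00:14:41:12", "fishing line", 38.52, -28.64, 762.5),
]

# Exporting the events as plain dictionaries illustrates how a database-backed
# MIAS could integrate annotations from different surveys.
records = [asdict(a) for a in dive_log]
print(len(records), records[0]["label"])
```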

This record has no associated files available for download.

More information

Accepted/In Press date: 11 July 2016
Published date: 1 December 2016
Keywords: Underwater visual surveys, Image analysis, Image annotation, Data collection, Data storage, Monitoring, Marine imaging
Organisations: Ocean and Earth Science

Identifiers

Local EPrints ID: 404134
URI: http://eprints.soton.ac.uk/id/eprint/404134
ISSN: 0079-6611
PURE UUID: 03b2b30d-aedc-4c02-9734-3f2332695e3d

Catalogue record

Date deposited: 21 Dec 2016 14:50
Last modified: 27 Apr 2022 05:20

Contributors

Author: Jose Nuno Gomes-Pereira
Author: Vincent Auger
Author: Kolja Beisiegel
Author: Robert Benjamin
Author: Melanie Bergmann
Author: David Bowden
Author: Pal Buhl-Mortensen
Author: Fabio C. De Leo
Author: Gisela Dionísio
Author: Jennifer Durden
Author: Luke Edwards
Author: Ariell Friedman
Author: Jens Greinert
Author: Nancy Jacobsen-Stout
Author: Steve Lerner
Author: Murray Leslie
Author: Tim W. Nattkemper
Author: Jessica A. Sameoto
Author: Timm Schoening
Author: Ronald Schouten
Author: James Seager
Author: Hanumant Singh
Author: Olivier Soubigou
Author: Inês Tojeira
Author: Inge van den Beld
Author: Frederico Dias
Author: Fernando Tempera
Author: Ricardo S. Santos

