University of Southampton Institutional Repository

Explainable Deep Learning to Classify Royal Navy Ships

Baesens, Bart, Adams, Amy, Pacheco-Ruiz, Rodrigo, Baesens, Ann-Sophie and Vanden Broucke, Seppe (2024) Explainable Deep Learning to Classify Royal Navy Ships. IEEE Access, 12, 1774-1785. (doi:10.1109/ACCESS.2023.3346061).

Record type: Article

Abstract

We research how deep learning convolutional neural networks can be used to automatically classify a unique data set of black-and-white photographs of naval ships from the Wright and Logan photographic collection held by the National Museum of the Royal Navy. We contrast various deep learning approaches: pretrained models such as ConvNeXt, ResNet and EfficientNet, as well as ConvMixer. We also thoroughly investigate the impact of data preprocessing and externally obtained images on model performance. Finally, we research how the estimated models can be made transparent using interpretability techniques such as Grad-CAM. We find that ConvNeXt performs best on our data set, achieving an accuracy of 79.62% for 0-notch classification and 94.86% for 1-notch classification. The results underline the importance of appropriate image preprocessing: image segmentation combined with soft augmentation contributes significantly to model performance. We consider this research original in several respects. It distinguishes itself through the uniqueness of the acquired data set and through its analytical modeling pipeline, which encompasses a comprehensive range of steps, including data preprocessing (incorporating external data, image segmentation, and image augmentation) and deep learning techniques such as ConvNeXt, ResNet, EfficientNet, and ConvMixer. Furthermore, the research employs explanatory tools such as Grad-CAM to enhance model interpretability and usability. We believe the proposed methodology offers considerable potential for documenting historic image collections.
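To make the pipeline described in the abstract concrete, the following is a minimal Python sketch of the kind of approach it outlines: fine-tuning an ImageNet-pretrained ConvNeXt on greyscale archive photographs and inspecting a prediction with Grad-CAM. The torchvision ConvNeXt-Tiny variant, the choice of the last feature stage as the Grad-CAM target, the class count and the file name are illustrative assumptions, not the authors' exact configuration.

# Hypothetical sketch: fine-tuning a pretrained ConvNeXt for ship-class
# prediction and inspecting it with Grad-CAM. Class count, target layer and
# file path are illustrative assumptions, not the paper's exact setup.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

NUM_CLASSES = 10  # placeholder: set to the number of ship classes in the data set

# Load an ImageNet-pretrained ConvNeXt and replace its classification head.
model = models.convnext_tiny(weights=models.ConvNeXt_Tiny_Weights.DEFAULT)
model.classifier[2] = nn.Linear(model.classifier[2].in_features, NUM_CLASSES)
model.eval()

# At train time, "soft" augmentation could add mild flips/crops; at inference
# we only resize, centre-crop and normalise the black-and-white photograph.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.Grayscale(num_output_channels=3),  # archive photos are monochrome
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Grad-CAM: capture activations and gradients at the last convolutional stage.
activations, gradients = {}, {}
target_layer = model.features[-1]  # assumption: final ConvNeXt feature stage

def fwd_hook(_, __, output):
    activations["value"] = output.detach()

def bwd_hook(_, grad_input, grad_output):
    gradients["value"] = grad_output[0].detach()

target_layer.register_forward_hook(fwd_hook)
target_layer.register_full_backward_hook(bwd_hook)

img = preprocess(Image.open("ship_photo.jpg")).unsqueeze(0)  # hypothetical image
logits = model(img)
logits[0, logits.argmax()].backward()  # gradient of the predicted class score

# Weight each activation map by its average gradient, ReLU the sum, normalise.
weights = gradients["value"].mean(dim=(2, 3), keepdim=True)
cam = torch.relu((weights * activations["value"]).sum(dim=1)).squeeze()
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)  # heat map in [0, 1]

The resulting low-resolution heat map would typically be upsampled to the input size and overlaid on the photograph to show which regions of the ship drove the prediction, which is the kind of interpretability output the abstract refers to.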

Text: Royal_Navy_IEEE_Access_Baesens - Accepted Manuscript (11MB)

More information

Accepted/In Press date: 15 December 2023
e-pub ahead of print date: 22 December 2023
Published date: 2024
Additional Information: Publisher Copyright: © 2013 IEEE.
Keywords: Convolutional neural networks, Deep learning, Digitised archives, Documentation, Explainability, Image classification, Labeling, Marine vehicles, Modeling, Museums, Royal Navy

Identifiers

Local EPrints ID: 485960
URI: http://eprints.soton.ac.uk/id/eprint/485960
ISSN: 2169-3536
PURE UUID: 1ca89786-6c80-453f-8d73-7666d85fa738
ORCID for Bart Baesens: orcid.org/0000-0002-5831-5668

Catalogue record

Date deposited: 04 Jan 2024 06:19
Last modified: 18 Mar 2024 02:59

Contributors

Author: Bart Baesens
Author: Amy Adams
Author: Rodrigo Pacheco-Ruiz
Author: Ann-Sophie Baesens
Author: Seppe Vanden Broucke


