University of Southampton Institutional Repository

Enhanced deep-joint segmentation with deep learning networks of glioma tumor for multi-grade classification using MR images

Divya, S., Padma Suresh, L. and John, A. (2022) Enhanced deep-joint segmentation with deep learning networks of glioma tumor for multi-grade classification using MR images. Pattern Analysis and Applications, 25 (4), 891-911. (doi:10.1007/s10044-022-01064-5).

Record type: Article

Abstract

Magnetic resonance imaging (MRI) is the crucial imaging modality employed in medical diagnostics to detect tumors, and it can provide detailed information on the anatomical structures of gliomas. However, the foremost problem in MRI classification is the semantic gap between the low-level visual information acquired from the MRI machine and the high-level information perceived by the clinician. In this research, a Tunicate-Exponential weighted moving average (TEWMA)-based deep convolutional neural network (TEWMA-deep CNN) is devised for multi-grade classification. In this method, preprocessing is employed to eradicate artifacts present in the image. Deep-joint segmentation is then modified with weighted Euclidean and Levenshtein distance measures, which are used to segment the tumor regions. Classification is performed on the segmented regions by a deep CNN, tuned by the developed TEWMA, to distinguish glioma, meningioma, pituitary, and other classes. The devised approach is evaluated on three datasets: BRATS 2015, figshare, and BRATS 2020. The developed TEWMA is designed by incorporating the Tunicate swarm algorithm (TSA) and the exponentially weighted moving average (EWMA) algorithm; on dataset-1 the classifier achieves the highest specificity of 99%, highest accuracy of 98.76%, highest sensitivity of 98.88%, maximal precision of 94.76%, maximal F1-measure of 98.46%, and minimal time of 7.24 s. The proposed method also attains average specificity, accuracy, sensitivity, precision, and F-measure of 91.09%, 93.79%, 95.46%, 92.33%, and 94.30%, respectively, with an average time of 6.23 s on dataset-1.
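The record itself gives no equations or code for the modified deep-joint segmentation; as a rough illustration only, the sketch below shows one way a weighted Euclidean distance over region feature vectors could be combined with a normalised Levenshtein distance over quantised-intensity strings, as the abstract describes. The function names, the weight vector w, and the mixing parameter alpha are all hypothetical and are not taken from the paper.

```python
import numpy as np

def weighted_euclidean(a: np.ndarray, b: np.ndarray, w: np.ndarray) -> float:
    """Weighted Euclidean distance between two region feature vectors."""
    return float(np.sqrt(np.sum(w * (a - b) ** 2)))

def levenshtein(s: str, t: str) -> int:
    """Classic dynamic-programming Levenshtein (edit) distance."""
    m, n = len(s), len(t)
    d = np.zeros((m + 1, n + 1), dtype=int)
    d[:, 0] = np.arange(m + 1)
    d[0, :] = np.arange(n + 1)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            d[i, j] = min(d[i - 1, j] + 1,          # deletion
                          d[i, j - 1] + 1,          # insertion
                          d[i - 1, j - 1] + cost)   # substitution
    return int(d[m, n])

def hybrid_distance(a, b, w, sa, sb, alpha=0.5):
    """Convex combination of the two measures; alpha is a hypothetical
    mixing weight, not taken from the paper."""
    lev = levenshtein(sa, sb) / max(len(sa), len(sb), 1)  # normalise to [0, 1]
    return alpha * weighted_euclidean(a, b, w) + (1 - alpha) * lev

# Example: two 3-pixel regions with illustrative weights and intensity strings
a, b = np.array([0.9, 0.2, 0.4]), np.array([0.8, 0.3, 0.5])
w = np.array([1.0, 0.5, 0.5])
print(hybrid_distance(a, b, w, "ACB", "ABB", alpha=0.6))
```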
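Similarly, the abstract states that TEWMA hybridises TSA with EWMA but gives no update rule. The following is a minimal sketch under the assumption that a standard TSA position update is smoothed by a standard EWMA recursion; lam, p_min, and p_max are illustrative parameter choices, not values from the paper.

```python
import numpy as np

def tewma_step(P, P_prev, FS, lam=0.3, p_min=1.0, p_max=4.0, rng=None):
    """One hypothetical TEWMA step for a single search agent.

    P, P_prev : current and previous positions (1-D numpy arrays)
    FS        : best solution (food source) found so far
    lam       : EWMA smoothing factor (illustrative value)
    Returns the agent's new, EWMA-smoothed position.
    """
    rng = rng or np.random.default_rng()
    c1, c2, c3 = rng.random(3)
    M = p_min + c1 * (p_max - p_min)        # social-force magnitude (TSA)
    A = (c2 + c3 - 2.0 * c1) / M            # jet-propulsion coefficient
    PD = np.abs(FS - rng.random() * P)      # distance to the food source
    cand = FS + A * PD if rng.random() >= 0.5 else FS - A * PD
    cand = (cand + P_prev) / (2.0 + c1)     # TSA swarm behaviour
    return lam * cand + (1.0 - lam) * P     # EWMA-smoothed move
```

In a full optimiser, a step like this would be applied to every agent per iteration, with FS updated whenever an agent improves the deep CNN's classification fitness.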

Text
s10044-022-01064-5 - Version of Record
Restricted to Repository staff only
Available under License Other.

More information

Accepted/In Press date: 11 March 2022
e-pub ahead of print date: 1 July 2022
Published date: November 2022
Additional Information: Publisher Copyright: © 2022, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.
Keywords: Deep convolutional neural network, Deep-joint segmentation, Exponentially weighted moving average, Magnetic resonance images, Tunicate swarm optimization

Identifiers

Local EPrints ID: 501227
URI: http://eprints.soton.ac.uk/id/eprint/501227
ISSN: 1433-7541
PURE UUID: d9dcd716-5cec-4c9f-979d-c5ed4759c688
ORCID for S. Divya: orcid.org/0000-0002-7302-7146

Catalogue record

Date deposited: 27 May 2025 17:58
Last modified: 28 May 2025 02:18

Contributors

Author: S. Divya
Author: L. Padma Suresh
Author: A. John
