University of Southampton Institutional Repository

Non-invasive detection of anemia using lip mucosa images transfer learning convolutional neural networks

Mahmud, Shekhar
158ca15c-3bae-472c-8822-6cdf0ea3b616
Mansour, Mohammed
23071427-e171-4a2f-9bb7-4aa6db8b5c65
Donmez, Turker Berk
cb0f9d27-63ba-4d8b-9c68-350a47531249
Kutlu, Mustafa
f0592223-1cb3-493b-8034-a2e07e3e9f3b
Freeman, Chris
ccdd1272-cdc7-43fb-a1bb-b1ef0bdf5815

Mahmud, Shekhar, Mansour, Mohammed, Donmez, Turker Berk, Kutlu, Mustafa and Freeman, Chris (2023) Non-invasive detection of anemia using lip mucosa images transfer learning convolutional neural networks. Frontiers in Big Data, 6, [1291329]. (doi:10.3389/fdata.2023.1291329).

Record type: Article

Abstract

Anemia is defined as a drop in the number of erythrocytes or in hemoglobin concentration below the normal levels found in healthy people. The degree of skin pallor varies with skin color, and there is currently no quantifiable measure of it. Pallor is most visible in locations where the cuticle is thin, such as the interior of the mouth, the lips, or the conjunctiva. This work focuses on anemia-related pallor and its relationship to blood count values and artificial intelligence. In this study, a deep learning approach using transfer learning and Convolutional Neural Networks (CNN) was implemented, in which pre-trained VGG16, Xception, MobileNet, and ResNet50 architectures were used to predict anemia from lip mucosa images. A total of 138 volunteers (100 women and 38 men) participated in the work to develop a dataset containing two image classes: healthy and anemic. Image processing was first performed on a single frame so that only the mouth area was visible, data augmentation was performed, and CNN models were then applied to classify the lip images in the dataset. Statistical metrics were employed to compare the performance of the models in terms of Accuracy, Precision, Recall, and F1 Score. Among the CNN architectures used, Xception categorized the lip images with 99.28% accuracy, providing the best results. The other architectures achieved accuracies of 96.38% for MobileNet, 95.65% for ResNet50, and 92.39% for VGG16. Our findings show that anemia may be diagnosed from a single lip image using deep learning approaches. The dataset will be enhanced in the future to allow for real-time classification.
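
The transfer-learning setup described in the abstract can be illustrated with a short Keras sketch. This is not the authors' code: the directory layout (a hypothetical lip_dataset/ folder with healthy/ and anemic/ subfolders), the image size, batch size, augmentation transforms, and optimizer settings are all assumptions; it simply shows a frozen ImageNet-pretrained Xception backbone with a small binary classification head, as one plausible instance of the approach.

# Minimal sketch (assumptions noted above), TensorFlow/Keras.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import Xception

IMG_SIZE = (299, 299)   # Xception's default input resolution (assumed here)
BATCH_SIZE = 16

# Hypothetical dataset layout: lip_dataset/{healthy,anemic}/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "lip_dataset", validation_split=0.2, subset="training", seed=42,
    image_size=IMG_SIZE, batch_size=BATCH_SIZE)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "lip_dataset", validation_split=0.2, subset="validation", seed=42,
    image_size=IMG_SIZE, batch_size=BATCH_SIZE)

# Generic data augmentation; the exact transforms used in the paper are not specified.
augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
])

# Frozen ImageNet-pretrained backbone plus a small classification head.
base = Xception(weights="imagenet", include_top=False, pooling="avg",
                input_shape=IMG_SIZE + (3,))
base.trainable = False

inputs = layers.Input(shape=IMG_SIZE + (3,))
x = augment(inputs)
x = tf.keras.applications.xception.preprocess_input(x)
x = base(x, training=False)
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)   # healthy vs. anemic
model = models.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall()])

model.fit(train_ds, validation_data=val_ds, epochs=10)

The same pattern applies to the other backbones compared in the paper (VGG16, MobileNet, ResNet50) by swapping the imported application model and its matching preprocess_input function.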

Text
fdata-06-1291329 - Version of Record
Available under License Creative Commons Attribution.
Download (1MB)
Text
Correction notice
Available under License Creative Commons Attribution.
Download (62kB)

More information

e-pub ahead of print date: 3 November 2023
Published date: 2023
Additional Information: A correction has been attached to this output, located at https://www.frontiersin.org/articles/10.3389/fdata.2023.1338363/full and https://doi.org/10.3389/fdata.2023.1338363. Funding Information: The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article. Publisher Copyright: Copyright © 2023 Mansour, Donmez, Kutlu and Mahmud.
Keywords: anemia, classification, convolutional neural network (CNN), deep learning, image processing

Identifiers

Local EPrints ID: 485851
URI: http://eprints.soton.ac.uk/id/eprint/485851
ISSN: 2624-909X
PURE UUID: 89122da2-13a0-4916-821d-f3b63e5d494d
ORCID for Chris Freeman: orcid.org/0000-0003-0305-9246

Catalogue record

Date deposited: 20 Dec 2023 17:40
Last modified: 11 Dec 2024 02:39

Contributors

Author: Shekhar Mahmud
Author: Mohammed Mansour
Author: Turker Berk Donmez
Author: Mustafa Kutlu
Author: Chris Freeman
