University of Southampton Institutional Repository

Fully-channel regional attention network for disease-location recognition with tongue images

Yang, Huanjia
0f90e934-3d0c-4be7-9b3a-b676373d15e3
Wen, Guihua
411fd94f-89bd-4ad7-908d-9c876afd7564
Luo, Mingnan
43faccbb-eead-4787-af0f-d3fbe7f2538b
Yang, Peixin
15edaeff-c775-41be-9cc2-dc2a6db9c0b0
Dai, Dan
85b7cbb9-cd58-46e1-b7ff-c264e9f46908
Yu, Zhiwen
1421a503-983b-460b-8372-e2df423f6890
Wang, Changjun
4cf32501-c45f-4c08-8435-8f9bf6842539
Hall, Wendy
11f7f8db-854c-4481-b1ae-721a51d8790c

Yang, Huanjia, Wen, Guihua, Luo, Mingnan, Yang, Peixin, Dai, Dan, Yu, Zhiwen, Wang, Changjun and Hall, Wendy (2021) Fully-channel regional attention network for disease-location recognition with tongue images. Artificial Intelligence in Medicine, 118, [102110]. (doi:10.1016/j.artmed.2021.102110).

Record type: Article

Abstract

Objective
To use deep learning to recognize disease locations from tongue images, focusing on two problems: (1) general convolutional networks are weak at modeling detailed regional tongue features; (2) ignoring the group relationships between convolution channels leads to high model redundancy.

Methods
To enhance convolutional neural networks, this paper proposes a stochastic region pooling method to obtain detailed regional features, together with an inner-imaging channel relationship modeling method that models multi-region relations across all channels. The method is further combined with a spatial attention mechanism.
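The stochastic region pooling idea described above can be sketched as follows. This is a minimal schematic only, assuming uniform sampling of one rectangular region per forward pass; the function name, the `region_size` parameter, and the exact sampling scheme are illustrative assumptions, not the paper's precise design:

```python
import numpy as np

def stochastic_region_pool(feature_map, region_size, rng=None):
    """Average-pool one randomly placed spatial region of a (C, H, W) map.

    Unlike global average pooling, the per-channel descriptor here reflects
    a single local region, so repeated forward passes expose the network to
    detailed features from different parts of the tongue image.
    Illustrative sketch; names and sampling scheme are assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    c, h, w = feature_map.shape
    rh, rw = region_size
    top = int(rng.integers(0, h - rh + 1))    # random top-left corner
    left = int(rng.integers(0, w - rw + 1))
    region = feature_map[:, top:top + rh, left:left + rw]
    return region.mean(axis=(1, 2))           # one descriptor per channel, shape (C,)
```

In a full model, such per-region channel descriptors would then feed a channel-attention stage (the paper's inner-imaging channel relationship modeling) rather than being used directly.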

Results
A tongue image dataset with clinical disease-location labels is established, and extensive experiments are carried out on it. The results show that the proposed method effectively models the regional details of tongue images and improves the performance of disease-location recognition.

Conclusion
We construct a tongue image dataset with disease-location labels to mine the relationship between tongue images and disease locations, and propose a novel fully-channel regional attention network that models detailed local tongue features and improves modeling efficiency.

Significance
The application of deep learning to tongue image disease-location recognition, together with the proposed model, provides an example of efficient modeling of detailed tongue features and offers guidance for other auxiliary diagnosis tasks.

Text
Fully-channel regional attention network for disease-location recognition with tongue images - Accepted Manuscript
Download (10MB)

More information

Accepted/In Press date: 11 May 2021
e-pub ahead of print date: 26 May 2021

Identifiers

Local EPrints ID: 450290
URI: http://eprints.soton.ac.uk/id/eprint/450290
ISSN: 0933-3657
PURE UUID: 25a10dbb-b54f-4ef5-970c-9c38a3611ad9
ORCID for Wendy Hall: orcid.org/0000-0003-4327-7811

Catalogue record

Date deposited: 20 Jul 2021 16:32
Last modified: 23 Jul 2021 01:32


Contributors

Author: Huanjia Yang
Author: Guihua Wen
Author: Mingnan Luo
Author: Peixin Yang
Author: Dan Dai
Author: Zhiwen Yu
Author: Changjun Wang
Author: Wendy Hall


