University of Southampton Institutional Repository

AAU-Net: an adaptive attention U-net for breast lesions segmentation in ultrasound images


Chen, Gongping, Li, Lei, Dai, Yu, et al. (2023) AAU-Net: an adaptive attention U-net for breast lesions segmentation in ultrasound images. IEEE Transactions on Medical Imaging, 42 (5), 1289-1300. (doi:10.1109/TMI.2022.3226268).

Record type: Article

Abstract

Various deep learning methods have been proposed to segment breast lesions from ultrasound images. However, similar intensity distributions, variable tumor morphologies and blurred boundaries present challenges for breast lesion segmentation, especially for malignant tumors with irregular shapes. Considering the complexity of ultrasound images, we develop an adaptive attention U-net (AAU-net) to segment breast lesions automatically and stably from ultrasound images. Specifically, we introduce a hybrid adaptive attention module (HAAM), which mainly consists of a channel self-attention block and a spatial self-attention block, to replace the traditional convolution operation. Compared with the conventional convolution operation, the hybrid adaptive attention module captures more features under different receptive fields. Unlike existing attention mechanisms, the HAAM module guides the network to adaptively select more robust representations in the channel and spatial dimensions to cope with more complex breast lesion segmentation. Extensive experiments against several state-of-the-art deep learning segmentation methods on three public breast ultrasound datasets show that our method performs better on breast lesion segmentation. Furthermore, robustness analysis and external experiments demonstrate that the proposed AAU-net generalizes better to breast lesion segmentation. Moreover, the HAAM module can be flexibly applied to existing network frameworks. The source code is available at https://github.com/CGPxy/AAU-net.
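To make the HAAM idea described above more concrete, the sketch below shows one way a hybrid channel-plus-spatial attention block over two receptive fields could be wired as a drop-in replacement for a convolution block. This is a minimal illustrative sketch, not the authors' implementation: the class name, branch choices and hyperparameters are assumptions, and the actual code is in the GitHub repository linked above.

```python
# Hypothetical HAAM-style block: multi-receptive-field convolutions followed by
# channel attention and spatial attention. Illustrative only; not the authors'
# code (see https://github.com/CGPxy/AAU-net for the real implementation).
import torch
import torch.nn as nn


class HAAMSketch(nn.Module):
    """Drop-in replacement for a conv block in a U-net-style encoder/decoder."""

    def __init__(self, in_ch: int, out_ch: int, reduction: int = 4):
        super().__init__()
        # Two branches with different receptive fields (plain 3x3 vs dilated 3x3).
        self.branch_small = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))
        self.branch_large = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=2, dilation=2),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True))
        # Channel attention: squeeze spatially, then re-weight channels.
        self.channel_attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_ch, out_ch // reduction, 1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch // reduction, out_ch, 1), nn.Sigmoid())
        # Spatial attention: squeeze channels (mean + max), then re-weight pixels.
        self.spatial_attn = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feat = self.branch_small(x) + self.branch_large(x)  # fuse receptive fields
        feat = feat * self.channel_attn(feat)                # channel re-weighting
        pooled = torch.cat([feat.mean(dim=1, keepdim=True),
                            feat.max(dim=1, keepdim=True).values], dim=1)
        return feat * self.spatial_attn(pooled)              # spatial re-weighting


if __name__ == "__main__":
    block = HAAMSketch(in_ch=64, out_ch=128)
    y = block(torch.randn(1, 64, 96, 96))
    print(y.shape)  # torch.Size([1, 128, 96, 96])
```

Because a block like this keeps the usual in-channels to out-channels interface of a convolution block, it can take the place of the plain convolutions in an encoder-decoder, which is consistent with the abstract's statement that the HAAM module can be flexibly applied to existing network frameworks.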

This record has no associated files available for download.

More information

e-pub ahead of print date: 1 December 2022
Published date: 1 May 2023
Keywords: adaptive learning, breast tumor segmentation, deep learning, hybrid attention, ultrasound images

Identifiers

Local EPrints ID: 488804
URI: http://eprints.soton.ac.uk/id/eprint/488804
ISSN: 0278-0062
PURE UUID: 4a52e235-b876-4a99-9993-17f48deb5716
ORCID for Lei Li: orcid.org/0000-0003-1281-6472

Catalogue record

Date deposited: 05 Apr 2024 16:44
Last modified: 10 Apr 2024 02:14

Contributors

Author: Gongping Chen
Author: Lei Li
Author: Yu Dai
Author: Jianxun Zhang
Author: Moi Hoon Yap
Corporate Author: et al.
