University of Southampton Institutional Repository

FG-UNet: fine-grained feature-guided UNet for segmentation of weeds and crops in UAV images

Lin, Jianwu, Zhang, Xin, Qin, Yongbin, Yang, Shengxian, Wen, Xingtian, Cernava, Tomislav and Chen, Xiaoyulong (2024) FG-UNet: fine-grained feature-guided UNet for segmentation of weeds and crops in UAV images. Pest Management Science. (doi:10.1002/ps.8489).

Record type: Article

Abstract

Background: Semantic segmentation of weed and crop images is a key component of, and prerequisite for, automated weed management. Weeds in unmanned aerial vehicle (UAV) images are usually small and easily confused with crops at early growth stages, so existing semantic segmentation models have difficulty extracting sufficiently fine features. This limits their performance in weed and crop segmentation of UAV images.

Results: We proposed a fine-grained feature-guided UNet, named FG-UNet, for weed and crop segmentation in UAV images. FG-UNet has two branches: a fine-grained feature branch and a UNet branch. In the fine-grained feature branch, a fine feature-aware (FFA) module was designed to mine fine features and thereby enhance the model's ability to segment small objects. In the UNet branch, an encoder–decoder structure extracts high-level semantic features from the images. In addition, a contextual feature fusion (CFF) module was designed to fuse the fine features with the high-level semantic features, enhancing the feature discrimination capability of the model. Experimental results showed that the proposed FG-UNet achieved state-of-the-art performance compared to other semantic segmentation models, with a mean intersection over union (MIOU) of 88.06% and a mean pixel accuracy (MPA) of 92.37%.

Conclusion: The method proposed in this study lays a solid foundation for accurate detection and intelligent management of weeds and will have a positive impact on the development of smart agriculture.
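The fusion idea described in the abstract — combining fine features from the fine-grained branch with high-level semantic features from the UNet branch — can be sketched roughly as follows. This is a minimal illustration only: the actual FFA and CFF layer designs are not given in this record, and all shapes, operations, and names below are assumptions.

```python
import numpy as np

def upsample2x(x):
    """Nearest-neighbour 2x upsampling of a (C, H, W) feature map."""
    return x.repeat(2, axis=1).repeat(2, axis=2)

def fuse(fine, semantic, weights):
    """Concatenate fine and upsampled semantic features along the channel
    axis, then mix channels with a 1x1 convolution (a per-pixel matmul).
    A stand-in for the CFF module's fusion step, not its published design."""
    sem_up = upsample2x(semantic)                 # match spatial resolution
    cat = np.concatenate([fine, sem_up], axis=0)  # (Cf + Cs, H, W)
    c, h, w = cat.shape
    return (weights @ cat.reshape(c, -1)).reshape(-1, h, w)

rng = np.random.default_rng(0)
fine = rng.random((16, 32, 32))      # fine-grained branch output (assumed shape)
semantic = rng.random((32, 16, 16))  # deeper UNet branch output (assumed shape)
weights = rng.random((16, 48))       # hypothetical 1x1 conv: 48 -> 16 channels
fused = fuse(fine, semantic, weights)
print(fused.shape)  # (16, 32, 32)
```

The sketch shows why a fusion module is needed at all: the two branches produce feature maps at different resolutions and channel widths, so they must be aligned spatially before they can be combined channel-wise.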

Text: Accepted Manuscript (restricted to Repository staff only until 17 October 2025).

More information

Accepted/In Press date: 6 October 2024
e-pub ahead of print date: 17 October 2024
Keywords: contextual feature fusion module, fine feature-aware module, image segmentation, weed and crop segmentation

Identifiers

Local EPrints ID: 496138
URI: http://eprints.soton.ac.uk/id/eprint/496138
ISSN: 1526-498X
PURE UUID: 63e63b76-e01c-4fe1-842d-70524c64aa83
ORCID for Tomislav Cernava: orcid.org/0000-0001-7772-4080

Catalogue record

Date deposited: 05 Dec 2024 17:31
Last modified: 06 Dec 2024 03:07

Contributors

Author: Jianwu Lin
Author: Xin Zhang
Author: Yongbin Qin
Author: Shengxian Yang
Author: Xingtian Wen
Author: Tomislav Cernava
Author: Xiaoyulong Chen
