Weakly-supervised butterfly detection based on saliency map
Zhang, Ting, Waqas, Muhammad, Fang, Yu, Liu, Zhaoying, Halim, Zahid, Li, Yujian and Chen, Sheng (2023) Weakly-supervised butterfly detection based on saliency map. Pattern Recognition, 138, 109313. (doi:10.1016/j.patcog.2023.109313).
Abstract
To meet the practical need of detecting butterflies in natural ecosystems and to avoid the cost of fine-grained annotation, this paper proposes a weakly-supervised butterfly detection model based on a saliency map (WBD-SM), which improves butterfly detection accuracy in ecological images. The model first extracts features at different scales using VGG16 without its fully connected layers as the backbone network. Next, saliency maps of butterfly images are extracted with the deeply-supervised network with short connections (DSS) for butterfly localization, and class activation maps are derived via the adversarial complementary learning (ACoL) network for butterfly recognition. The saliency and class activation maps are then post-processed with conditional random fields to obtain refined saliency maps of the butterfly objects, from which the butterfly locations are finally determined. Experimental results on the 20-category butterfly dataset collected in this paper show that WBD-SM achieves higher recognition accuracy than VGG16 under different training/test split ratios. With an 8:2 training/test split, WBD-SM attains 95.67% localization accuracy, 9.37% and 11.87% higher than DSS and ACoL, respectively. Compared with three state-of-the-art fully-supervised object detection networks, RefineDet, YOLOv3 and the single-shot detector (SSD), WBD-SM outperforms RefineDet and YOLOv3 and performs almost on par with SSD.
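The last step described above, obtaining butterfly locations from the refined saliency maps, can be illustrated with a minimal sketch. The snippet below is only an assumption of how such a step might look, not the authors' implementation: the 0.5 threshold, the single-box assumption and the helper name box_from_saliency are illustrative choices.

import numpy as np

def box_from_saliency(saliency, threshold=0.5):
    """Return (x_min, y_min, x_max, y_max) of the salient region, or None if the map is empty.
    saliency: 2-D array with values in [0, 1] (illustrative assumption)."""
    mask = saliency >= threshold        # binarise the refined saliency map
    ys, xs = np.nonzero(mask)           # row/column indices of salient pixels
    if xs.size == 0:
        return None                     # nothing salient found
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Toy usage: a synthetic 64x64 map with one bright rectangle standing in for a butterfly.
toy = np.zeros((64, 64), dtype=np.float32)
toy[20:40, 10:50] = 0.9
print(box_from_saliency(toy))           # prints (10, 20, 49, 39)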
Text: PR-2023-2 - Accepted Manuscript
More information
Accepted/In Press date: 8 January 2023
e-pub ahead of print date: 20 January 2023
Published date: 1 June 2023
Additional Information:
Funding Information:
This work is supported by the National Natural Science Foundation of China (61806013, 61876010, 61906005), the General Project of the Science and Technology Plan of Beijing Municipal Education Commission (KM202110005028), the Project of the Interdisciplinary Research Institute of Beijing University of Technology (2021020101) and the International Research Cooperation Seed Fund of Beijing University of Technology (2021A01).
Publisher Copyright:
© 2023 Elsevier Ltd
Keywords:
Butterfly detection, Class activation map, Saliency map, Weakly-supervised object detection
Identifiers
Local EPrints ID: 474275
URI: http://eprints.soton.ac.uk/id/eprint/474275
ISSN: 0031-3203
PURE UUID: f0e546b3-8efd-49a6-9a70-67e719dac7ec
Catalogue record
Date deposited: 17 Feb 2023 17:30
Last modified: 17 Mar 2024 07:38
Contributors
Author: Ting Zhang
Author: Muhammad Waqas
Author: Yu Fang
Author: Zhaoying Liu
Author: Zahid Halim
Author: Yujian Li
Author: Sheng Chen