University of Southampton Institutional Repository

VPRS-based regional decision fusion of CNN and MRF classifications for very fine resolution remotely sensed images

Zhang, Ce, Sargent, Isabel, Pan, Xin, Gardiner, Andy, Hare, Jonathon and Atkinson, Peter M. (2018) VPRS-based regional decision fusion of CNN and MRF classifications for very fine resolution remotely sensed images. IEEE Transactions on Geoscience and Remote Sensing, 56 (8), 1-15. (doi:10.1109/TGRS.2018.2822783).

Record type: Article

Abstract

Recent advances in computer vision and pattern recognition have demonstrated the superiority of deep neural networks using spatial feature representations, such as convolutional neural networks (CNN), for image classification. However, any classifier, regardless of its model structure (deep or shallow), involves prediction uncertainty when classifying spatially and spectrally complicated very fine spatial resolution (VFSR) imagery. We propose here to characterise the uncertainty distribution of CNN classification and integrate it into a regional decision fusion to increase classification accuracy. Specifically, a variable precision rough set (VPRS) model is proposed to quantify the uncertainty within CNN classifications of VFSR imagery and to partition it into positive regions (correct classifications) and non-positive regions (uncertain or incorrect classifications). The classification of these “more correct” areas was trusted to the CNN, whereas the uncertain areas were rectified by a Multi-Layer Perceptron (MLP)-based Markov random field (MLP-MRF) classifier to provide crisp and accurate boundary delineation. The proposed MRF-CNN regional decision fusion strategy exploited the complementary characteristics of the two classifiers through the VPRS uncertainty description and classification integration. The effectiveness of the MRF-CNN method was tested on both urban and rural areas of southern England, as well as on Semantic Labelling datasets. The MRF-CNN consistently outperformed the benchmark classifiers (MLP, SVM, MLP-MRF and CNN) and the baseline methods. This research provides a regional decision fusion framework within which to gain the advantages of the model-based CNN, while overcoming the problem of lost effective resolution and uncertain prediction at object boundaries, which is especially pertinent for complex VFSR image classification.
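
As a rough illustration of the fusion rule described in the abstract, the minimal sketch below keeps the CNN label wherever a simplified confidence measure (the maximum CNN class membership, standing in for the VPRS inclusion degree) reaches a precision threshold beta, and falls back to the MLP-MRF label elsewhere. All names (fuse_cnn_mrf, cnn_probs, mrf_labels, beta) are hypothetical and not taken from the paper; this is a sketch of the general idea under those assumptions, not the authors' implementation.

```python
import numpy as np

def fuse_cnn_mrf(cnn_probs, mrf_labels, beta=0.8):
    """Regional decision fusion sketch (hypothetical, not the paper's code).

    cnn_probs  : (H, W, C) CNN class-membership probabilities per pixel.
    mrf_labels : (H, W) labels from an MLP-MRF classifier.
    beta       : precision threshold in (0.5, 1]; pixels whose winning CNN
                 membership reaches beta form the trusted "positive region".
    """
    cnn_labels = np.argmax(cnn_probs, axis=-1)   # CNN decision per pixel
    confidence = np.max(cnn_probs, axis=-1)      # membership of the winning class
    positive_region = confidence >= beta         # simplified stand-in for the VPRS positive region
    # Trust the CNN inside the positive region; use the MLP-MRF labels elsewhere.
    fused = np.where(positive_region, cnn_labels, mrf_labels)
    return fused, positive_region

# Toy usage with random data standing in for real classifier outputs.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(6), size=(256, 256))   # fake CNN membership maps, 6 classes
mrf = rng.integers(0, 6, size=(256, 256))            # fake MLP-MRF label map
fused_map, trusted_mask = fuse_cnn_mrf(probs, mrf, beta=0.8)
```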

Text: FINAL_VERSION - Accepted Manuscript, download available (1MB)

More information

Accepted/In Press date: 27 March 2018
e-pub ahead of print date: 23 April 2018
Published date: 22 July 2018

Identifiers

Local EPrints ID: 419438
URI: http://eprints.soton.ac.uk/id/eprint/419438
ISSN: 0196-2892
PURE UUID: cd9d7228-0114-47b8-8417-a238ada59aaf
ORCID for Jonathon Hare: orcid.org/0000-0003-2921-4283

Catalogue record

Date deposited: 12 Apr 2018 16:30
Last modified: 16 Mar 2024 03:50

Contributors

Author: Ce Zhang
Author: Isabel Sargent
Author: Xin Pan
Author: Andy Gardiner
Author: Jonathon Hare
Author: Peter M. Atkinson
