University of Southampton Institutional Repository

Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition


Rehman, Naveed Ur, Ehsan, Shoaib, Abdullah, Syed Muhammad Umer, Technology, COMSATS, Mandic, Danilo P. and McDonald-Maier, Klaus D. (2015) Multi-Scale Pixel-Based Image Fusion Using Multivariate Empirical Mode Decomposition. Sensors, 15 (5), 10923-10947. (doi:10.3390/s150510923).

Record type: Article

Abstract

A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions about the input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment, characterized, respectively, by a single intrinsic mode function (IMF) containing multiple scales and by same-indexed IMFs from different input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales across multiple channels, thus enabling their comparison at the pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including principal component analysis (PCA), the discrete wavelet transform (DWT) and the non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically significant performance differences.
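To make the idea in the abstract concrete, the Python sketch below illustrates pixel-based multi-scale fusion with scale-aligned decompositions. It is only an illustration under stated assumptions: the toy_aligned_decomposition helper is a simple Gaussian band-split stand-in, not the MEMD algorithm used in the paper, and the max-local-variance selection rule is one plausible pixel-level fusion rule rather than necessarily the exact rule the authors employ.

import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def toy_aligned_decomposition(images, n_scales=3):
    # Stand-in for MEMD: split each image into n_scales detail layers plus a residue
    # using identical Gaussian filters, so layer k of every image covers a comparable
    # spatial scale (MEMD achieves this alignment in a data-adaptive way via its IMFs).
    layers = []
    for img in images:
        current = np.asarray(img, dtype=float)
        img_layers = []
        for s in range(n_scales):
            smooth = gaussian_filter(current, sigma=2 ** s)
            img_layers.append(current - smooth)  # detail (high-frequency) layer
            current = smooth
        img_layers.append(current)               # coarsest residue
        layers.append(np.stack(img_layers))
    return np.stack(layers)                      # shape: (n_images, n_scales + 1, H, W)

def fuse(images, n_scales=3, window=7):
    # Pixel-based multi-scale fusion: at each scale and each pixel, keep the
    # coefficient from whichever input image shows the largest local activity.
    dec = toy_aligned_decomposition(images, n_scales)
    n_lay = dec.shape[1]
    fused_layers = []
    for k in range(n_lay):
        coeffs = dec[:, k]                                        # (n_images, H, W)
        activity = np.stack([uniform_filter(c ** 2, size=window)  # local energy map
                             for c in coeffs])
        winner = np.argmax(activity, axis=0)                      # (H, W) winning image index
        fused_layers.append(np.take_along_axis(coeffs, winner[None], axis=0)[0])
    return np.sum(fused_layers, axis=0)                           # sum layers to reconstruct

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a, b = rng.random((64, 64)), rng.random((64, 64))
    print(fuse([a, b]).shape)  # (64, 64)

Replacing the stand-in with a true multivariate EMD would yield data-adaptive IMFs that are nevertheless aligned across the input images, which is the property the per-scale, per-pixel selection rule relies on.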

This record has no associated files available for download.

More information

Published date: May 2015
Keywords: multi-focus image fusion, multi-exposure image fusion, signal decomposition, multivariate empirical mode decomposition, multiresolution analysis, non-subsampled contourlet transform

Identifiers

Local EPrints ID: 478878
URI: http://eprints.soton.ac.uk/id/eprint/478878
ISSN: 1424-8220
PURE UUID: a78ce9b6-829c-433f-83c8-059f1b3f2e3c
ORCID for Shoaib Ehsan: orcid.org/0000-0001-9631-1898

Catalogue record

Date deposited: 12 Jul 2023 16:36
Last modified: 17 Mar 2024 04:16

Contributors

Author: Naveed Ur Rehman
Author: Shoaib Ehsan
Author: Syed Muhammad Umer Abdullah
Author: COMSATS Technology
Author: Danilo P. Mandic
Author: Klaus D. McDonald-Maier



