Pixel-level image fusion scheme based on linear algebra (Conference Paper)

abstract

  • Image fusion refers to the process of integrating complementary image sources from multiple imaging sensors such that the resulting fused image improves the performance of computational analysis tasks such as segmentation, feature extraction and object recognition. The paper introduces a pixel-level image fusion scheme based on linear algebra. The image fusion process begins by computing the discrete wavelet transform of the source images. The wavelet transforms of the images are then fused using a feature-based rule. A salient feature may extend over several pixels; therefore, a rule that considers a region of pixels containing that feature yields a more effective integration. The fusion rule is based on a measure of the linear dependency within a small window centered on the pixel under consideration. The linear dependency is measured with the Wronskian determinant, which provides a simple and rigorous test. The performance of the proposed method is assessed using mutual information as well as the Root Mean Square Error (RMSE) and Peak Signal-to-Noise Ratio (PSNR). The simulation results show that the proposed method is an efficient approach to image fusion. © 2007 IEEE.
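
  The abstract describes a wavelet-domain, window-based fusion pipeline. The sketch below illustrates that general flow under stated assumptions: two registered grayscale inputs, PyWavelets for the DWT, and a simple local-energy activity measure standing in for the paper's Wronskian-based linear dependency test, whose exact form is not given in the abstract. All function and parameter names are illustrative, not taken from the paper.

  ```python
  # Minimal sketch of wavelet-domain, window-based pixel-level fusion.
  # The window rule here is a placeholder activity measure, not the
  # paper's Wronskian-determinant criterion.
  import numpy as np
  import pywt
  from scipy.ndimage import uniform_filter

  def fuse_band(c1, c2, window=3):
      """Per pixel, keep the coefficient whose local window is more 'active'."""
      a1 = uniform_filter(c1 * c1, size=window)  # local energy, source 1
      a2 = uniform_filter(c2 * c2, size=window)  # local energy, source 2
      return np.where(a1 >= a2, c1, c2)

  def fuse_images(img1, img2, wavelet="db2", level=2, window=3):
      """Fuse two registered grayscale images in the wavelet domain."""
      w1 = pywt.wavedec2(img1, wavelet, level=level)
      w2 = pywt.wavedec2(img2, wavelet, level=level)
      # Average the approximation band; apply the window rule to detail bands.
      fused = [0.5 * (w1[0] + w2[0])]
      for d1, d2 in zip(w1[1:], w2[1:]):
          fused.append(tuple(fuse_band(a, b, window) for a, b in zip(d1, d2)))
      return pywt.waverec2(fused, wavelet)
  ```

  The reconstructed output can then be compared against a reference with standard measures such as RMSE, PSNR, or mutual information, as in the paper's evaluation.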

publication date

  • 2007-01-01