Image Fusion Based on Contrast Decomposition

Ioannis Andreadis and Odysseas Bouzos

Keywords

HDR Imaging, Exposure Fusion, Illumination Estimation

Abstract

This paper presents a novel method for fusing two or more differently exposed Low Dynamic Range (LDR) images of a High Dynamic Range (HDR) scene. Illumination estimation is applied at coarse and fine scales, driving the contrast decomposition process at both scales and leading to the definition of the "well-exposed" pixels in the original exposure sequence. Membership functions are then employed to ensure that well-exposed pixels from each contrast-decomposed image are selected, in order to maximize the visual information of the HDR scene. Finally, blending functions are used to create two fused images, one per scale, each containing visual information at a different scale; these are then blended together to form the final fused image. Comparative results demonstrate that the proposed algorithm successfully preserves the visual information of the HDR scene.
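The abstract describes selecting well-exposed pixels via membership functions and combining them with per-pixel blending. The sketch below illustrates that general idea with a common generic choice, a Gaussian membership function centered at mid-gray, and normalized per-pixel blending; it is not the authors' method (which additionally uses illumination estimation and two-scale contrast decomposition), and the function names and the sigma parameter are illustrative assumptions.

```python
import numpy as np

def well_exposedness_weight(img, sigma=0.2):
    # Generic membership function (an assumption, not the paper's):
    # pixels near mid-gray (0.5) are treated as "well-exposed".
    return np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2))

def blend_exposures(images, eps=1e-12):
    # images: list of float arrays in [0, 1] with identical shapes.
    weights = np.stack([well_exposedness_weight(im) for im in images])
    weights /= weights.sum(axis=0) + eps  # normalize weights per pixel
    stack = np.stack(images)
    # Per-pixel convex combination of the exposures.
    return (weights * stack).sum(axis=0)

# Two synthetic "exposures" of the same 1-D gradient scene
under = np.linspace(0.0, 0.5, 256)   # under-exposed version
over = np.linspace(0.5, 1.0, 256)    # over-exposed version
fused = blend_exposures([under, over])
```

Because the normalized weights form a convex combination at each pixel, the fused value always lies between the corresponding input values, so no new intensities are invented during blending.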
