Multi-Exposure Image Fusion based on Illumination Estimation

Vassilios Vonikakis, Odysseas Bouzos, and Ioannis Andreadis


Keywords: High Dynamic Range Imaging, Image Fusion, Illumination Estimation


This paper presents a new method for fusing two or more differently exposed images of a High Dynamic Range (HDR) scene. The method employs illumination estimation filtering in order to assess the degree of “well-exposedness” of the pixels in the original image sequence. These initial estimations are then modulated by membership functions, which assign specific weights to the pixels of the image sequence. The membership functions favor the well-exposed pixels of each exposure, so as to maximize the visual information in the final result. The fused image is derived by interpolating between all the pixels, using the weights assigned by the membership functions. The experimental results show that the proposed algorithm preserves the information from each exposure of the sequence well, resulting in an image with extended information in both the shadows and the highlights.
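The weighted-interpolation step described in the abstract can be sketched as follows. This is a minimal, generic illustration of weight-based multi-exposure fusion, not the authors' actual method: the illumination estimation filter and the exact membership functions are not specified here, so a Gaussian membership centered on mid-gray stands in as an assumed "well-exposedness" measure, and all function names are hypothetical.

```python
import numpy as np

def well_exposedness(img, sigma=0.2):
    # Assumed membership function: a Gaussian centered on mid-gray,
    # so pixels near 0.5 receive high weight and under-/over-exposed
    # pixels receive low weight. The paper's actual membership
    # functions are derived from illumination estimation instead.
    return np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2))

def fuse(exposures):
    # exposures: list of float images in [0, 1], all the same shape.
    stack = np.stack(exposures)                            # (N, H, W)
    weights = well_exposedness(stack)
    # Normalize the weights per pixel so they sum to one across
    # the exposure sequence.
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12
    # Interpolate between all pixels using the assigned weights.
    return (weights * stack).sum(axis=0)

# Toy example: a uniformly dark and a uniformly bright exposure.
dark = np.full((4, 4), 0.1)
bright = np.full((4, 4), 0.9)
fused = fuse([dark, bright])
```

Because 0.1 and 0.9 are symmetric around mid-gray, both exposures receive equal weight here and the fused result is mid-gray; with real images, well-exposed regions of each frame would dominate their respective areas.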
