The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume XXXIX-B3
https://doi.org/10.5194/isprsarchives-XXXIX-B3-485-2012
01 Aug 2012

MULTI-TEMPORAL AND MULTI-SENSOR IMAGE MATCHING BASED ON LOCAL FREQUENCY INFORMATION

X. Liu, Q. Yu, X. Zhang, Y. Shang, X. Zhu, and Z. Lei

Keywords: Image Matching, Local Average Phase, Local Weighted Amplitude, Local Best-Matching Point, Similarity Measurement, Local Frequency Information

Abstract. Image matching is often one of the first tasks in many photogrammetry and remote sensing applications. This paper presents an efficient approach to automated multi-temporal and multi-sensor image matching based on local frequency information. Two new independent image representations, Local Average Phase (LAP) and Local Weighted Amplitude (LWA), are presented to emphasize the common scene information while suppressing the non-common illumination- and sensor-dependent information. To obtain the two representations, local frequency information is first extracted with a Log-Gabor wavelet transform, whose response is similar to that of the human visual system; the outputs of the odd- and even-symmetric filters are then used to construct the LAP and LWA. The LAP and LWA emphasize the phase and amplitude information, respectively. As these two representations are both derivative-free and threshold-free, they are robust to noise and preserve as much image detail as possible. A new Compositional Similarity Measure (CSM) is also presented to combine the LAP and LWA with equal weight when measuring the similarity of multi-temporal and multi-sensor images. The CSM allows the LAP and LWA to compensate for each other and makes full use of both the amplitude and the phase of the local frequency information. In many image-matching applications, the template is selected without regard to its matching robustness and accuracy. To overcome this problem, a local best-matching point detection method is presented that employs self-similarity analysis to identify the template with the highest matching robustness and accuracy. Experimental results on real and simulated images demonstrate that the presented approach is effective for matching image pairs with significant scene and illumination changes, and that it has advantages over other state-of-the-art approaches, including Local Frequency Response Vectors (LFRV), Phase Congruence (PC), and the Four Directional-Derivative-Energy Image (FDDEI), especially at low signal-to-noise ratios (SNR). As few assumptions are made, the proposed method can foreseeably be used in a wide variety of image-matching applications.
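
The abstract does not give the formulas for the LAP, LWA, or CSM, so the sketch below is only an illustration of the general pipeline it describes: quadrature (even/odd) responses from oriented Log-Gabor wavelets, a phase-based and an amplitude-based representation built from them, and an equal-weight similarity measure. The filter parameters (wavelengths, n_orient, sigma_on_f) and the specific definitions of lap_lwa and csm used here are assumptions for illustration, not the authors' formulas.

```python
import numpy as np

def log_gabor_even_odd(img, wavelengths=(6, 12, 24), n_orient=4, sigma_on_f=0.55):
    """Even/odd (quadrature) responses of oriented Log-Gabor wavelets.

    Returns one complex response image per (orientation, scale): the real
    part is the even-symmetric filter output, the imaginary part the
    odd-symmetric one.  Parameter values are illustrative assumptions.
    """
    rows, cols = img.shape
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    radius = np.hypot(fx, fy)
    radius[0, 0] = 1.0                      # avoid log(0) at the DC term
    theta = np.arctan2(-fy, fx)
    spectrum = np.fft.fft2(img)
    theta_sigma = np.pi / n_orient / 1.5    # angular spread of each filter

    responses = []
    for o in range(n_orient):
        angle = o * np.pi / n_orient
        # Angular distance to the filter orientation, wrapped to [0, pi];
        # the one-sided angular window makes each response a quadrature pair.
        d_theta = np.abs(np.arctan2(np.sin(theta - angle), np.cos(theta - angle)))
        spread = np.exp(-d_theta**2 / (2 * theta_sigma**2))
        for wl in wavelengths:
            f0 = 1.0 / wl                   # centre frequency of this scale
            radial = np.exp(-np.log(radius / f0)**2 / (2 * np.log(sigma_on_f)**2))
            radial[0, 0] = 0.0              # Log-Gabor has no DC component
            responses.append(np.fft.ifft2(spectrum * radial * spread))
    return responses


def lap_lwa(img, eps=1e-6, **filter_kwargs):
    """Illustrative LAP and LWA maps (assumed definitions, not the paper's).

    LAP is taken as the phase of the summed quadrature responses; LWA as the
    summed amplitude normalised by its mean to reduce illumination dependence.
    """
    resp = log_gabor_even_odd(img, **filter_kwargs)
    even_sum = sum(r.real for r in resp)
    odd_sum = sum(r.imag for r in resp)
    amp_sum = sum(np.abs(r) for r in resp)
    lap = np.arctan2(odd_sum, even_sum)          # "local average phase"
    lwa = amp_sum / (amp_sum.mean() + eps)       # "local weighted amplitude"
    return lap, lwa


def csm(patch_a, patch_b, **filter_kwargs):
    """Assumed Compositional Similarity Measure: equal-weight combination of
    correlation scores computed on the LAP and LWA representations."""
    def ncc(a, b):
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float((a * b).mean())
    lap_a, lwa_a = lap_lwa(patch_a, **filter_kwargs)
    lap_b, lwa_b = lap_lwa(patch_b, **filter_kwargs)
    return 0.5 * ncc(lap_a, lap_b) + 0.5 * ncc(lwa_a, lwa_b)
```

In a template-matching setting, csm would be evaluated between the template and each candidate window of the search image; the local best-matching point detection described in the abstract would additionally apply a self-similarity analysis over candidate templates before matching, which is not sketched here.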