Night and Day Aerial Photogrammetry
Keywords: Day-night, Image matching, Deep Learning, Aerial triangulation, ALIKED, SuperPoint, LoFTR
Abstract. Recent advancements in aerial imaging, including high-resolution sensors and integrated GNSS/IMU systems, have significantly enhanced photogrammetric methods for geospatial data acquisition. While most aerial data is captured during daylight, night-time imaging is increasingly used in applications such as urban analysis and disaster assessment. However, automatic co-registration of day and night imagery remains challenging due to substantial radiometric differences. This study investigates deep learning-based feature matching techniques for aligning multi-temporal day-night aerial datasets. Experimental results show that feature extraction is highly sensitive to image scale, and that only a limited subset of deep learning (DL) methods, notably ALIKED with LightGlue and SuperPoint with SuperGlue, proves robust under low-illumination conditions. Additionally, a U-Net-like model was trained to pre-process night-time images so that their radiometric characteristics approximate those of daytime images, enabling consistent feature matching across all tested methods. Among them, ALIKED with LightGlue offered the best balance between match quantity and computational efficiency. Object-space evaluations confirm that the proposed pre-processing step significantly improves co-registration accuracy. The methodology offers a promising foundation for future multi-sensor and multi-modal image alignment tasks, including RGB-thermal and 2D-3D matching.
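As a rough illustration of the pre-processing idea mentioned in the abstract, the sketch below defines a minimal U-Net-like encoder-decoder in PyTorch that maps a night-time tile to a day-like RGB output. The depth, channel widths, and output activation here are illustrative assumptions; the paper's actual architecture and training setup are not specified in this excerpt.

```python
import torch
import torch.nn as nn

class NightToDayUNet(nn.Module):
    """Minimal U-Net-like sketch for night-to-day radiometric translation.
    Hypothetical layer sizes; not the architecture used in the study."""
    def __init__(self, ch=3, base=32):
        super().__init__()
        self.enc1 = self._block(ch, base)          # encoder level 1
        self.pool = nn.MaxPool2d(2)
        self.enc2 = self._block(base, base * 2)    # encoder level 2 (bottleneck)
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = self._block(base * 2, base)    # decoder with skip connection
        self.out = nn.Conv2d(base, ch, 1)          # predict day-like RGB

    @staticmethod
    def _block(cin, cout):
        return nn.Sequential(
            nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(inplace=True))

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # concat skip features
        return torch.sigmoid(self.out(d1))  # values in [0, 1], same size as input

night = torch.rand(1, 3, 64, 64)        # stand-in for a night-time image tile
day_like = NightToDayUNet()(night)      # "daytime-approximated" output
print(day_like.shape)
```

Such a model would typically be trained on paired or roughly co-located day/night imagery with a pixel-wise reconstruction loss, after which the translated night images can be fed to any of the tested feature matchers.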