The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume XLVIII-1/W6-2025
https://doi.org/10.5194/isprs-archives-XLVIII-1-W6-2025-235-2025
31 Dec 2025

Multi-Modal and Multi-Sensor Photogrammetric Data Fusion Exploiting a New Repository for Infrared Thermography Datasets

Neil Sutherland, Luca Morelli, Jon Mills, Paul Bryan, Stuart Marsh, and Fabio Remondino

Keywords: InfraRed Thermography (IRT), data fusion, multi-modal, architectural heritage, SuperPoint, LightGlue

Abstract. InfraRed Thermography 3D-Data Fusion (IRT-3DDF), an emerging field of research combining 2D thermal images with 3D models, has demonstrated its capability for visualising the performance of historic buildings and the behaviour of materials under varying environmental conditions. However, for 3D thermal models to become viable tools in the assessment of architectural heritage, fully-automatic IRT-3DDF methods capable of producing both geometrically- and radiometrically-accurate models require greater investigation. Therefore, using a new repository of multi-modal, multi-sensor and multi-platform datasets, this paper presents a fully-automatic IRT-3DDF method that exploits deep learning-based multi-modal image matching to fuse multiple aerial and terrestrial sensors in combined bundle block adjustments. Results demonstrate the successful orientation of several multi-sensor datasets using pre-trained neural networks, achieving sub-centimetre geometric accuracy and radiometrically consistent thermal models suitable for building diagnostics. Future work will assess the generalisability of the proposed pipeline across additional datasets, expanding its application to broader conservation, repair and maintenance (CRM) practices.
