Uncertainty Estimation for Photogrammetric Point Clouds of UAV Imagery
Keywords: Uncertainty Estimation, Error Propagation, Dense Matching, Point Cloud, Unmanned Aerial Vehicle, Photogrammetry
Abstract. Nowadays, unmanned aerial vehicles (UAVs) are widely used in photogrammetric applications to collect high-resolution images for 3D reconstruction. Modern photogrammetric pipelines typically employ Structure-from-Motion (SfM) and Multi-View Stereo (MVS) to generate dense 3D point clouds from unordered image sets. Estimating the uncertainty of these point clouds is crucial: the predicted error covariance matrices indicate the reliability of the reconstruction. Despite its importance, little effort has been made to model uncertainty, particularly during the MVS stage, and to rigorously propagate it through the photogrammetric pipeline to the final 3D point clouds, which leads to improper interpretation of their quality. Recent works on disparity uncertainty estimation also focus solely on stereo matching, ignoring the rich information provided by the MVS framework. In this work, we propose a novel method for estimating metric uncertainty in 3D point clouds derived from UAV imagery using error propagation. Specifically, we leverage multi-ray points from the MVS framework to map dense matching costs to metric disparity uncertainty. Our method requires no training data, making it generalizable to diverse UAV datasets. We evaluate it on public and self-collected UAV datasets, and the results demonstrate that it outperforms existing approaches in terms of bounding rate.
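The error propagation the abstract refers to is, in its standard first-order form, the propagation of an input covariance through the Jacobian of the reconstruction function (cov_y ≈ J · cov_x · Jᵀ). The sketch below is not the paper's method; it is a minimal, generic illustration of that principle, here applied to the textbook depth-from-disparity relation Z = f·B/d with hypothetical focal length, baseline, and disparity-variance values:

```python
import numpy as np

def propagate_covariance(f, x, cov_x, eps=1e-6):
    """First-order error propagation: cov_y ~= J cov_x J^T,
    with the Jacobian J of f at x estimated by forward finite differences."""
    y0 = np.asarray(f(x), dtype=float)
    n, m = x.size, y0.size
    J = np.zeros((m, n))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        J[:, i] = (np.asarray(f(x + dx), dtype=float) - y0) / eps
    return J @ cov_x @ J.T

# Toy example (hypothetical values, not from the paper):
# depth from disparity, Z = focal * baseline / d
focal, baseline = 1000.0, 0.3
depth_fn = lambda v: np.array([focal * baseline / v[0]])

d = np.array([50.0])          # disparity (pixels)
cov_d = np.array([[0.25]])    # disparity variance (pixels^2)

cov_Z = propagate_covariance(depth_fn, d, cov_d)
```

Analytically, dZ/dd = -f·B/d² = -0.12 m/px here, so the propagated depth variance is 0.12² · 0.25 = 0.0036 m², which the numerical Jacobian reproduces closely. The paper's contribution lies upstream of this step: obtaining the metric disparity uncertainty (the input covariance) from MVS matching costs via multi-ray points.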