The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume XLIII-B2-2021
https://doi.org/10.5194/isprs-archives-XLIII-B2-2021-361-2021
28 Jun 2021

REAL-TIME DENSE 3D RECONSTRUCTION FROM MONOCULAR VIDEO DATA CAPTURED BY LOW-COST UAVS

M. Hermann, B. Ruf, and M. Weinmann

Keywords: 3D Reconstruction, Real-time, SLAM, Multi-View-Stereo, Depth Map Fusion, Oblique Aerial Imagery, UAVs

Abstract. Real-time 3D reconstruction enables fast dense mapping of the environment, which benefits numerous applications such as navigation or the live assessment of an emergency situation. In contrast to most real-time capable approaches, our method does not require an explicit depth sensor. Instead, we rely solely on the video stream of a camera and its intrinsic calibration. By exploiting the self-motion of an unmanned aerial vehicle (UAV) flying around buildings with an oblique view, we estimate both the camera trajectory and depth maps for selected images that contain sufficient novel content. To create a 3D model of the scene, we rely on a three-stage processing chain. First, we estimate a rough camera trajectory using a simultaneous localization and mapping (SLAM) algorithm. Once a suitable constellation of views is found, we estimate depth for a local bundle of images using a multi-view stereo (MVS) approach and then fuse these depth estimates into a global surfel-based model. For our evaluation, we use 55 video sequences with diverse settings, comprising both synthetic and real scenes. We evaluate not only the final reconstruction but also the intermediate products, achieving competitive results both qualitatively and quantitatively. At the same time, our method keeps up with a 30 fps video stream at a resolution of 768 × 448 pixels.
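The three-stage chain sketched in the abstract (keyframe selection from camera motion, depth estimation on a local image bundle, and weighted fusion into a global model) can be illustrated with a minimal toy pipeline. All function names, thresholds, and data below are illustrative assumptions on synthetic inputs, not the authors' implementation:

```python
import numpy as np

def is_keyframe(last_kf_pose, pose, min_baseline=0.5):
    """Stage 1 stand-in: select a new keyframe once the camera has
    translated far enough to provide novel content (baseline as proxy)."""
    return np.linalg.norm(pose - last_kf_pose) >= min_baseline

def mvs_depth(bundle):
    """Stage 2 stand-in for multi-view stereo: reduce a local bundle of
    per-view depth hypotheses to a single depth map via the median."""
    return np.median(bundle, axis=0)

def fuse(model, depth, weight=1.0):
    """Stage 3 stand-in: per-pixel weighted running average, in the
    spirit of a per-surfel confidence-weighted update."""
    fused, w = model
    new_w = w + weight
    return (fused * w + depth * weight) / new_w, new_w

# Synthetic trajectory: camera translating along x in 0.3 m steps.
poses = [np.array([0.3 * i, 0.0, 0.0]) for i in range(7)]

H, W = 4, 6                            # tiny "image" resolution
rng = np.random.default_rng(0)

keyframes = [0]                        # first frame bootstraps the map
model = (np.zeros((H, W)), 0.0)        # (fused depth, accumulated weight)
model = fuse(model, np.full((H, W), 10.0))   # initial depth guess: 10 m

for i in range(1, len(poses)):
    if is_keyframe(poses[keyframes[-1]], poses[i]):
        keyframes.append(i)
        # Local bundle: a few noisy depth hypotheses around 10 m.
        bundle = 10.0 + 0.1 * rng.standard_normal((3, H, W))
        model = fuse(model, mvs_depth(bundle))

fused_depth, total_weight = model
print(len(keyframes), fused_depth.shape)
```

With a 0.5 m baseline threshold and 0.3 m steps, every second frame becomes a keyframe; the global model converges toward the (synthetic) 10 m scene depth as more bundles are fused in.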