Research on 3D Virtual Scene Reconstruction and Application Based on Multi-source Data Fusion
Keywords: 3D virtual campus, UAV data processing, 3D reconstruction, Mobile mapping technology, Point cloud processing
Abstract. As digital cities evolve, the demand for 3D reconstruction grows, yet challenges in accuracy and completeness remain. This study proposes a 3D virtual scene reconstruction and application framework that overcomes the limitations of single-source data by fusing multi-source data, improving accuracy, completeness, and interactivity. The method combines Unmanned Aerial Vehicle (UAV) photogrammetry, UAV oblique photogrammetry, and Backpack Laser Scanning (BLS) to generate high-resolution mapping products, including the Digital Orthophoto Map (DOM), the Digital Surface Model (DSM), and dense point clouds. UAV images are used for the 3D reconstruction of large-scale scenes, while for smaller scenes and complex individual buildings two distinct data sources, oblique photogrammetry and BLS, are employed for modeling. The fusion of multi-source data addresses issues such as image blind spots and deformations, yielding models with improved geometric accuracy and richer textural detail. System development integrates a high-resolution image viewer optimized with tiling technology and a 3D virtual reality interactive system built on Unreal Engine 5, enabling immersive exploration and real-time interaction. This study offers a scalable solution for urban 3D reconstruction and provides tools for campus management and virtual tours, with potential applications in the development of smart cities.
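A central step in the workflow summarized above is the fusion of the photogrammetric dense point cloud with the BLS point cloud. The following sketch illustrates one way such a fusion could be implemented with the open-source Open3D library, aligning the BLS cloud to the UAV-derived cloud with point-to-plane ICP before merging; the file names, voxel size, and choice of Open3D are illustrative assumptions and not the authors' actual processing chain.

```python
import numpy as np
import open3d as o3d

# Load the two clouds (hypothetical file names).
uav_cloud = o3d.io.read_point_cloud("uav_oblique_dense_cloud.ply")
bls_cloud = o3d.io.read_point_cloud("backpack_laser_scan.ply")

# Downsample to a common resolution to speed up registration.
voxel_size = 0.05  # metres; tune to the survey scale
uav_down = uav_cloud.voxel_down_sample(voxel_size)
bls_down = bls_cloud.voxel_down_sample(voxel_size)

# Estimate normals, required by point-to-plane ICP.
for pc in (uav_down, bls_down):
    pc.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=voxel_size * 2, max_nn=30))

# Refine the alignment of the BLS cloud onto the UAV cloud,
# starting from identity (assumes both clouds are roughly georeferenced).
result = o3d.pipelines.registration.registration_icp(
    bls_down, uav_down,
    max_correspondence_distance=voxel_size * 4,
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane())

# Apply the refined transform to the full-resolution BLS cloud and merge.
bls_cloud.transform(result.transformation)
fused = uav_cloud + bls_cloud
o3d.io.write_point_cloud("fused_scene_cloud.ply", fused)
```

In a real pipeline the initial transform would likely come from the GNSS/IMU georeferencing of both surveys, and the merged cloud would then feed the meshing and texturing stage of the reconstruction.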
