The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Publications Copernicus
Volume XLVIII-1/W2-2023
13 Dec 2023


C. Ye, Z. Kang, and X. Guo

Keywords: multi-sensor fusion, BaySAC, calibration, vanishing point detection, indoor 3D

Abstract. The calibration of the camera and LiDAR is one of the foundations for constructing a multi-sensor fusion mapping system. Planar features of walls and floors in indoor environments provide effective constraints for multi-sensor calibration. In this paper, we propose a new camera-LiDAR calibration method constrained by indoor spatial structure. Using the image and point cloud data collected by the sensors, visual odometry and LiDAR odometry are constructed to compute the transformation between the sensors. Building on the visual and LiDAR odometry, structural parameters of the indoor environment are extracted from the images and point clouds to constrain the rotation estimation between the sensors. In the proposed method, lines are extracted from the camera images and used to estimate and track vanishing points. The directions estimated from the vanishing points serve as a global constraint to optimize the camera's rotation parameter estimation. Planes fitted to the LiDAR point clouds are used to compute a set of orthogonal normal vectors corresponding to the ground and wall surfaces, which serve as global constraints to optimize the LiDAR's rotation parameter estimation. The proposed calibration method is targetless and constrained only by the indoor spatial structure. The results show that the proposed method can calibrate the LiDAR and camera without accumulating errors during rotation estimation.
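The core geometric step described in the abstract is aligning two sets of structural directions: vanishing-point directions in the camera frame and orthogonal wall/ground normals in the LiDAR frame. A minimal sketch of how such a rotation could be recovered, assuming the direction correspondences are already established (the function name and the toy data below are illustrative, not part of the paper), is the SVD-based orthogonal Procrustes solution:

```python
import numpy as np

def rotation_from_direction_pairs(cam_dirs, lidar_dirs):
    """Estimate a rotation R such that cam_dirs ~ R @ lidar_dirs for
    corresponding unit direction vectors stored as columns, using the
    SVD-based Kabsch / orthogonal Procrustes solution."""
    A = np.asarray(cam_dirs)    # 3xN directions in the camera frame
    B = np.asarray(lidar_dirs)  # 3xN directions in the LiDAR frame
    H = A @ B.T                 # cross-covariance of the direction sets
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so that det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

# Toy example: LiDAR-frame orthogonal normals (ground plus two walls)
lidar_dirs = np.eye(3)
# Camera-frame vanishing directions: the same axes under a known rotation
angle = np.deg2rad(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
cam_dirs = R_true @ lidar_dirs

R_est = rotation_from_direction_pairs(cam_dirs, lidar_dirs)
print(np.allclose(R_est, R_true))  # True
```

Because both direction sets are derived from global scene structure (vanishing points and plane normals) rather than from frame-to-frame tracking, a closed-form alignment of this kind does not accumulate drift, which is consistent with the abstract's claim about rotation estimation.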