The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume XL-1/W2
https://doi.org/10.5194/isprsarchives-XL-1-W2-223-2013
16 Aug 2013

IMPROVED UAV-BORNE 3D MAPPING BY FUSING OPTICAL AND LASERSCANNER DATA

B. Jutzi, M. Weinmann, and J. Meidow

Keywords: UAV, multi-pulse laserscanning, sensor calibration, self-localization, data fusion

Abstract. In this paper, a new method for fusing optical and laserscanner data is presented for improved UAV-borne 3D mapping. We propose to equip an unmanned aerial vehicle (UAV) with a small platform that carries two sensors: a standard low-cost digital camera and a lightweight Hokuyo UTM-30LX-EW laserscanning device (210 g without cable). Initially, a calibration is carried out for the utilized devices. This involves a geometric camera calibration and the estimation of the position and orientation offset between the two sensors by lever-arm and bore-sight calibration. Subsequently, feature tracking is performed through the image sequence by considering extracted interest points as well as the projected 3D laser points. These 2D results are fused with the measured laser distances and fed into a bundle adjustment in order to achieve Simultaneous Localization and Mapping (SLAM). It is demonstrated that fusing optical and laserscanner data improves the precision of the pose estimation.
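
To make the projection step concrete, the following is a minimal sketch, not the authors' implementation, of how 3D laser points could be mapped into the camera image once the lever-arm and bore-sight offset between the two sensors is known. The function name, the intrinsic matrix K, and all numerical values are illustrative assumptions.

```python
# Minimal sketch: project laserscanner points into the camera image using an
# assumed bore-sight rotation R and lever-arm translation t (laser -> camera)
# together with a pinhole camera matrix K from geometric calibration.
# All numerical values are placeholders, not calibration results from the paper.
import numpy as np

def project_laser_points(points_laser, R, t, K):
    """Transform laser points into the camera frame and project them to pixels.

    points_laser : (N, 3) array of 3D points in the laserscanner frame
    R            : (3, 3) bore-sight rotation (laser -> camera)
    t            : (3,)   lever-arm translation (laser -> camera), in metres
    K            : (3, 3) intrinsic camera matrix
    """
    # Rigid-body transform into the camera coordinate frame
    points_cam = points_laser @ R.T + t
    # Keep only points in front of the camera
    in_front = points_cam[:, 2] > 0
    points_cam = points_cam[in_front]
    # Pinhole projection to pixel coordinates
    uvw = points_cam @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]
    return uv, in_front

if __name__ == "__main__":
    # Placeholder calibration values for illustration only
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    R = np.eye(3)                     # assumed bore-sight rotation
    t = np.array([0.05, 0.0, 0.02])   # assumed lever-arm offset [m]
    pts = np.random.uniform([-2, -2, 1], [2, 2, 10], size=(100, 3))
    uv, mask = project_laser_points(pts, R, t, K)
    print(uv.shape)
```

Such projected laser points could then be tracked alongside image interest points and, together with the measured laser distances, passed to a bundle adjustment, in line with the pipeline described in the abstract.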