INVESTIGATIONS ON A COMBINED RGB / TIME-OF-FLIGHT APPROACH FOR CLOSE RANGE APPLICATIONS
Keywords: Close Range, Metrology, Vision, Image, Multisensor, Calibration, Fusion
Abstract. 3D surface and scene reconstruction for close range applications relies mainly on high-resolution, accurate sensing devices and powerful algorithms. Camera systems based on the time-of-flight principle allow for real-time 3D distance measurements. Unfortunately, these devices are limited in resolution and accuracy. However, applying calibration models and combining the range data with high-resolution image data is a promising approach to forming a multisensor system for close range applications. This article presents investigations of such a multisensor system. Different options for fusing distance information and high-resolution color information in order to generate dense 2.5D and 3D point clouds are discussed. The multisensor system is calibrated with respect to its interior and exterior orientation. The time-of-flight distance information is optimized by extracting the best information from multiple captures acquired with a set of integration times, following the principle of high dynamic range imaging. The high-resolution RGB image information is projected into object space and intersected with the object surface derived from the time-of-flight camera. First results of this dense monoplotting solution and its verification are presented.
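As a brief illustration of the monoplotting step summarized above, the following is a sketch of the standard collinearity-based formulation; the symbols are generic and not taken from this paper. Each RGB pixel defines a viewing ray in object space through the camera's interior and exterior orientation, and the object point is obtained by intersecting this ray with the surface reconstructed from the time-of-flight data:

\mathbf{X}(\lambda) = \mathbf{X}_0 + \lambda \, \mathbf{R}^{\mathsf{T}} \begin{pmatrix} x' - x_0 \\ y' - y_0 \\ -c \end{pmatrix}, \qquad \lambda > 0,

where \mathbf{X}_0 and \mathbf{R} denote the projection centre and rotation matrix of the RGB camera, (x', y') the measured image coordinates, (x_0, y_0, c) the interior orientation parameters, and \lambda is chosen such that \mathbf{X}(\lambda) lies on the time-of-flight-derived surface (e.g., the first intersection with the meshed range data), assigning the pixel's color to that 3D point.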