OPTO-ACOUSTIC DATA FUSION FOR SUPPORTING THE GUIDANCE OF REMOTELY OPERATED UNDERWATER VEHICLES (ROVs)
Keywords: 3D opto-acoustic camera, optical and acoustic data fusion, stereovision system, system calibration, ROV guidance
Abstract. Remotely Operated underwater Vehicles (ROVs) play an important role in many shallow- and deep-water operations (e.g., exploration, survey, and intervention) across application fields such as marine science, offshore construction, and underwater archaeology. ROVs are usually equipped with several imaging devices, both optical and acoustic. Optical sensors generate better images at close range and in clear water, while acoustic systems are typically employed for long-range acquisition and are unaffected by turbidity, a well-known cause of coarser resolution and harder data extraction in optical imagery. In this work we describe the preliminary steps in the development of an opto-acoustic camera able to provide an on-line 3D reconstruction of the acquired scene. By taking full advantage of opto-acoustic data fusion techniques, the system is conceived as a support tool for ROV operators during navigation in turbid water and during operations conducted by means of mechanical manipulators.
The paper presents an overview of the device, an ad hoc methodology for the extrinsic calibration of the system, and custom software developed to control the opto-acoustic camera and supply the operator with visual information.