The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume XLII-2/W13
https://doi.org/10.5194/isprs-archives-XLII-2-W13-665-2019
04 Jun 2019

UAVS ENHANCED NAVIGATION IN OUTDOOR GNSS DENIED ENVIRONMENT USING UWB AND MONOCULAR CAMERA SYSTEMS

S. Zahran, A. Masiero, M. M. Mostafa, A. M. Moussa, A. Vettore, and N. El-Sheimy

Keywords: Optical Flow, Ultra-Wideband, Multi-Sensor Fusion, Inertial Navigation Systems, Global Navigation Satellite System Denied Environment

Abstract. The demand for small Unmanned Aerial Vehicles (UAVs) is increasing massively these days, due to the wide variety of applications that use such vehicles to perform tasks that may be dangerous, or simply to save time, effort, or cost. The navigation system of small UAVs mainly depends on the integration of Global Navigation Satellite Systems (GNSS) and an Inertial Navigation System (INS) to estimate the Position, Velocity, and Attitude (PVA) of the vehicle. Without GNSS, such UAVs cannot navigate for long periods of time relying on the INS alone, as a low-cost INS typically exhibits a massive accumulation of errors during GNSS absence. Given the importance of ensuring full operability of UAVs even when GNSS signals are unavailable, other sensors must be used to bound the INS errors and enhance the navigation system performance. This paper proposes an enhanced UAV navigation system based on the integration of a monocular camera, an Ultra-Wideband (UWB) system, and an INS, in addition to a variable Extended Kalman Filter (EKF) weighting scheme. The paper also investigates this integration in the case of a low density of UWB anchors, in order to reduce the cost of the UWB system infrastructure. A GoPro camera and a UWB rover were attached to the belly of a quadcopter, an off-the-shelf commercial drone (3DR Solo), during the experimental flight. The velocity of the vehicle is estimated with Optical Flow (OF) from successive camera images, while the range measurements between the UWB rover and the stationary UWB anchors distributed in the field are used to estimate the UAV position.
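To make the fusion architecture described above more concrete, the following is a minimal sketch of an EKF that predicts with a constant-velocity model and fuses optical-flow velocity and UWB range measurements, with a simple measurement re-weighting rule based on the number of visible anchors. This is not the authors' implementation: the 2-D state layout, the class name FusionEKF, all noise values, and the anchor-count weighting rule are illustrative assumptions only.

```python
# Minimal sketch (not the paper's implementation) of EKF fusion of
# optical-flow velocity and UWB ranges with a variable weighting rule.
# All numbers, the state layout, and the weighting rule are assumptions.
import numpy as np

class FusionEKF:
    def __init__(self, dt=0.1):
        self.dt = dt
        self.x = np.zeros(4)        # state: [px, py, vx, vy] (2-D for brevity)
        self.P = np.eye(4)          # state covariance
        self.Q = np.eye(4) * 0.01   # process noise (assumed)

    def predict(self):
        # Constant-velocity prediction step.
        dt = self.dt
        F = np.array([[1, 0, dt, 0],
                      [0, 1, 0, dt],
                      [0, 0, 1,  0],
                      [0, 0, 0,  1]], dtype=float)
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update_of_velocity(self, v_meas, sigma_v=0.2):
        # Optical-flow velocity is a linear observation of [vx, vy].
        H = np.array([[0, 0, 1, 0],
                      [0, 0, 0, 1]], dtype=float)
        R = np.eye(2) * sigma_v**2
        self._update(np.asarray(v_meas, dtype=float), H @ self.x, H, R)

    def update_uwb_ranges(self, ranges, anchors, sigma_r=0.3):
        # Each UWB range is a nonlinear function of position; linearize per anchor.
        # Variable weighting (illustrative): inflate R when few anchors are visible.
        weight = 1.0 + max(0, 3 - len(anchors))
        for r, a in zip(ranges, anchors):
            a = np.asarray(a, dtype=float)
            d = np.linalg.norm(self.x[:2] - a)
            if d < 1e-6:
                continue
            H = np.zeros((1, 4))
            H[0, :2] = (self.x[:2] - a) / d      # Jacobian of range w.r.t. position
            R = np.array([[(sigma_r * weight) ** 2]])
            self._update(np.array([r], dtype=float), np.array([d]), H, R)

    def _update(self, z, z_pred, H, R):
        # Standard EKF measurement update.
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - z_pred)
        self.P = (np.eye(4) - K @ H) @ self.P
```

In this sketch the OF velocity bounds the INS-style drift between UWB updates, while the per-anchor range updates anchor the absolute position; inflating the range noise when few anchors are in view is one plausible way to realize a variable weighting scheme under a sparse-anchor deployment.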