BENCHMARKING COLLABORATIVE POSITIONING AND NAVIGATION BETWEEN GROUND AND UAS PLATFORMS
Keywords: collaborative positioning, UWB, vision, LiDAR, UAS, Kalman filter, GNSS, SLAM
Abstract. The availability of the Global Positioning System (GPS), or more generally of Global Navigation Satellite Systems (GNSS), and the development of smart mobile devices able to exploit the geospatial information provided by GPS/GNSS within many applications have had a dramatic impact on the everyday life of most of the world's population. While GNSS allows for real-time positioning in a wide range of scenarios, there are many challenging environments, such as tunnels and urban canyons, where GNSS-based solutions are inaccurate, unreliable, or even unavailable. The enormous interest in applications requiring ubiquitous positioning (e.g., self-driving vehicles) has motivated the development of alternative positioning systems to support or substitute GNSS when operating in challenging scenarios. Such alternatives are usually developed by combining several sensors, such as radio-based systems, vision, LiDAR (Light Detection and Ranging), and RADAR (Radio Detection and Ranging). Furthermore, a collaborative approach can also be adopted to increase the robustness of the navigation solution of inter-connected vehicles. To support research in this area, we present the CONTEST (Collaborative pOsitioning and NavigaTion bEtween ground and uaS plaTforms) dataset, which provides multiple data streams for testing collaborative positioning approaches involving both terrestrial and aerial platforms and relying on several sensors: Ultra-Wide Band (UWB) transceivers, cameras, LiDARs, and GNSS receivers. The data are described and some initial results are presented.