A TEST ON COLLABORATIVE VISION AND UWB-BASED POSITIONING
Keywords: Collaborative positioning, SLAM, Vision, UWB
Abstract. Although GNSS (Global Navigation Satellite Systems) enable positioning, navigation and timing (PNT) almost everywhere, the development of applications such as self-driving vehicles and indoor navigation requires extending accurate positioning to scenarios where GNSS is either unreliable or does not guarantee a sufficiently precise solution. Integrating information provided by different sensors is widely accepted as a viable way to achieve such an extension. In particular, this work is part of a project investigating the positioning performance that can be obtained by integrating vision with radio-based systems and inertial sensors, which are commonly installed on many smart devices, such as smartphones. Furthermore, this work considers positioning in a collaborative scenario, where different interconnected platforms, i.e. unmanned aerial vehicles and pedestrians equipped with smartphones, move in the same area. The results obtained in the considered tests show good potential (sub-metric 2D positioning error) for the implemented strategies, where the integration of different technologies can ensure reliable performance over a wider range of operating conditions.