The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume XLVIII-G-2025
https://doi.org/10.5194/isprs-archives-XLVIII-G-2025-47-2025
28 Jul 2025

Enhancing VINS with Smart Feature Grading: Overcoming Cautious and Excessive Removal of Dynamic Features for Robust Urban Localization

Mahmoud Adham, Wu Chen, Ahmed Mansour, Mostafa Mahmoud, and Yaxin Li

Keywords: Visual-inertial systems (VINS), dynamic features, visual feature grading, urban localization, autonomous vehicles

Abstract. Visual-inertial navigation systems (VINS) have emerged as a popular and effective solution for autonomous navigation due to their accuracy, real-time capability, and cost-effectiveness. However, while traditional VINS methods excel in static environments with well-distributed features, they struggle in highly dynamic urban environments, where moving objects distort feature tracking and lead to pose estimation errors and localization inaccuracies. Recent approaches, such as methods based on image geometric constraints, aim to address these challenges but break down when moving objects dominate the scene. Deep learning (DL)-based methods, which directly remove potentially dynamic objects, often degrade accuracy in low-texture scenes and overlook the resulting uneven feature distribution, further impairing state estimation. To address these issues, we propose a novel VINS method that combines visual and inertial information with a smart feature grading module, overcoming both cautious and excessive dynamic feature removal and handling dominant and ambiguous dynamic objects beyond the limits of traditional DL- and vision-based methods. The module effectively identifies and filters dynamic features while preserving static ones, and tests on multiple datasets in dynamic urban environments demonstrate the method's enhanced accuracy and robustness.
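
The abstract does not spell out how the grading module decides between cautious and excessive removal. As a minimal sketch of one plausible scheme, the snippet below grades tracked features into static, ambiguous, and dynamic classes by combining an epipolar geometric residual with a semantic movable-object mask, then down-weights ambiguous features instead of discarding them. The three-level grading, the pixel thresholds, and the weight values are all illustrative assumptions, not the authors' published design.

```python
import numpy as np

def epipolar_residual(pts_prev, pts_curr, F):
    """Symmetric point-to-epipolar-line distance (pixels) per feature pair.

    pts_prev, pts_curr: (N, 2) pixel coordinates in consecutive frames.
    F: 3x3 fundamental matrix estimated from the feature set (e.g. via RANSAC).
    """
    ones = np.ones((pts_prev.shape[0], 1))
    p1 = np.hstack([pts_prev, ones])  # homogeneous coordinates, (N, 3)
    p2 = np.hstack([pts_curr, ones])
    l2 = p1 @ F.T                     # epipolar lines in the current frame
    l1 = p2 @ F                       # epipolar lines in the previous frame
    d2 = np.abs(np.sum(p2 * l2, axis=1)) / np.linalg.norm(l2[:, :2], axis=1)
    d1 = np.abs(np.sum(p1 * l1, axis=1)) / np.linalg.norm(l1[:, :2], axis=1)
    return 0.5 * (d1 + d2)

def grade_features(pts_prev, pts_curr, F, on_movable_mask,
                   static_thresh=1.0, dynamic_thresh=3.0):
    """Grade features as static (2), ambiguous (1), or dynamic (0).

    on_movable_mask: (N,) bool, True where a feature falls on a detected
    movable-object region (semantic prior). Thresholds are hypothetical.
    """
    resid = epipolar_residual(pts_prev, pts_curr, F)
    grades = np.ones(pts_prev.shape[0], dtype=int)           # default: ambiguous
    grades[(resid < static_thresh) & ~on_movable_mask] = 2   # clearly static
    grades[(resid > dynamic_thresh) & on_movable_mask] = 0   # clearly dynamic
    return grades

def feature_weights(grades):
    """Down-weight ambiguous features rather than removing them, avoiding
    excessive removal (all masked features dropped) as well as cautious
    removal (only gross geometric outliers dropped)."""
    return np.where(grades == 2, 1.0, np.where(grades == 1, 0.3, 0.0))
```

In a tightly coupled VINS back end, such weights would presumably scale the visual reprojection residuals, so clearly static features dominate the pose estimate while grade-0 features are excluded outright.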
