The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume XLVIII-2/W11-2025
https://doi.org/10.5194/isprs-archives-XLVIII-2-W11-2025-87-2025
30 Oct 2025

Autonomous Drone Navigation in Forest Environments Using Deep Learning

Guglielmo Del Col, Väinö Karjalainen, Teemu Hakala, and Eija Honkavaara

Keywords: Autonomous navigation, deep learning, UAV, forest environments, obstacle avoidance, trajectory planning

Abstract. Autonomous drone navigation in dense forests remains challenging due to unreliable GNSS signals, difficulty detecting thin branches, and cumulative drift in Visual-Inertial Odometry (VIO). This work investigates a deep learning-based navigation solution using a simulation-to-reality approach, focusing on boreal forests where fine obstacles and dense foliage are prevalent. A vision-based system was deployed, combining a semantically enhanced depth autoencoder for small-branch detection with a Collision Prediction Network (CPN) based on motion primitive evaluation. The system, trained on RotorS and Aerial Gym simulation data, was implemented on a custom drone featuring a sensor suite of a RealSense D435i and a RealSense T265, with an NVIDIA Orin NX for onboard processing. Real-world tests in open, lightly vegetated, and dense forests revealed robust performance against larger obstacles but highlighted limitations in thin-branch avoidance and odometry drift in highly cluttered environments. While simulation results were satisfactory, real-world trials achieved moderate success (60 m flights), demonstrating the potential of the framework for forestry applications. As future directions, integrating higher-resolution sensors, RGB-depth fusion, y-velocity integration, and possibly a small lidar is proposed to address current gaps. The findings underscore the need for real-world validation beyond simulation to bridge the perception-action gap in complex environments.
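To illustrate the motion-primitive evaluation idea behind a CPN, the following is a minimal sketch, not the authors' implementation: candidate constant-velocity arcs are generated over a short horizon and scored against a depth image, with a simple depth-based heuristic standing in for the learned collision predictor. All parameters (field of view, velocity, horizon, number of primitives) are hypothetical.

```python
import numpy as np

def motion_primitives(v_max=1.5, n_yaw=7, horizon=2.0, n_steps=20):
    """Generate candidate forward arcs at evenly spaced yaw angles
    (hypothetical parameterization; a real planner would use richer primitives)."""
    yaws = np.linspace(-np.pi / 4, np.pi / 4, n_yaw)
    t = np.linspace(0.0, horizon, n_steps)
    prims = [np.stack([v_max * t * np.cos(y), v_max * t * np.sin(y)], axis=1)
             for y in yaws]
    return yaws, prims

def collision_scores(depth_image, prims):
    """Stand-in for the CPN: score each primitive by the nearest depth
    along its heading; a trained network would replace this heuristic."""
    h, w = depth_image.shape
    scores = []
    for prim in prims:
        yaw = np.arctan2(prim[-1, 1], prim[-1, 0])
        # Map heading to an image column, assuming ~90 deg horizontal FOV.
        col = int(np.clip((yaw / (np.pi / 2) + 0.5) * (w - 1), 0, w - 1))
        min_depth = depth_image[:, col].min()
        scores.append(1.0 / (min_depth + 1e-3))  # nearer obstacle -> higher risk
    return np.array(scores)

def select_primitive(depth_image):
    """Pick the lowest-risk primitive for the current depth frame."""
    yaws, prims = motion_primitives()
    scores = collision_scores(depth_image, prims)
    best = int(np.argmin(scores))
    return yaws[best], prims[best]
```

In the paper's pipeline the heuristic scorer would be replaced by the learned CPN, which predicts collision probability per primitive from the autoencoder's semantically enhanced depth representation; the selection step remains an argmin over predicted risk.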
