Deep Learning for Palm Tree Health Assessment: UAV-Based Segmentation in the Figuig Region of Morocco
Keywords: UAV, Deep Learning, Palm Tree, Segmentation
Abstract. This study addresses the challenge of classifying healthy and unhealthy date palm trees in the Figuig oasis region of Morocco using high-resolution Unmanned Aerial Vehicle (UAV) imagery and deep learning. Traditional methods such as ground surveys are often time-consuming, costly, and subjective over large areas, while satellite-based remote sensing may lack the spatial resolution needed to assess individual tree health accurately. Our UAV-based deep learning approach aims to overcome these limitations by providing improved scalability, objectivity, and spatial precision. Recognizing that the 'unhealthy' status observable in top-down UAV imagery represents a visually complex aggregation of symptoms, potentially caused by stressors prevalent in the Figuig oasis such as Bayoud disease (caused by Fusarium oxysporum f. sp. albedinis) and drought stress, our annotations classified trees exhibiting general visual signs of poor health rather than specific causal agents. We therefore adopted semantic segmentation for pixel-level classification. High-resolution RGB orthomosaics were acquired via UAV, processed into tiles, and manually annotated to create a dataset distinguishing three classes: healthy palm, unhealthy palm, and background. This dataset of 296 tiles was split into training (70%), validation (20%), and test (10%) subsets and used to train and evaluate U-Net and DeepLabV3+ models implemented from scratch. Quantitative evaluation on the unseen test set demonstrated promising performance: the DeepLabV3+ model achieved a macro-average F1-score of 82.06%, slightly outperforming the U-Net model's 81.28%. Both models identified background and healthy palms reliably; however, accurately segmenting the diverse 'unhealthy' class remained the most significant challenge. This highlights the inherent difficulty of differentiating subtle or varied stress symptoms from aerial RGB data alone, and may also reflect limitations of these architectures in capturing fine-grained textural variations or distinguishing visually similar stress responses without additional inputs (e.g., multispectral data). Despite these challenges, the findings underscore the potential of integrating UAV technology with custom deep learning models for practical, large-scale assessment of palm health status in precision agriculture, offering a marked improvement over less scalable or lower-resolution traditional techniques. Although limitations remain regarding dataset diversity and the inability to distinguish specific stressors from RGB data, this research provides a foundation for AI-driven tools that support timely crop management decisions and sustainable date palm cultivation. Future work may focus on enhancing model robustness, incorporating complementary data sources (such as thermal or multispectral imagery), and investigating architectures better suited to subtle feature extraction.
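
As a rough illustration of the evaluation metric reported above, the sketch below computes per-class and macro-averaged F1-scores over flattened segmentation masks for the three classes (background, healthy palm, unhealthy palm). It is a minimal example using scikit-learn, not the authors' actual pipeline; the integer label encoding (0 = background, 1 = healthy, 2 = unhealthy), array names, and the toy masks are illustrative assumptions.

```python
# Minimal sketch: per-class and macro-average F1 over flattened segmentation masks.
# Assumptions (not taken from the paper): masks are integer-encoded 2D arrays
# with 0 = background, 1 = healthy palm, 2 = unhealthy palm.
import numpy as np
from sklearn.metrics import f1_score

CLASSES = {0: "background", 1: "healthy palm", 2: "unhealthy palm"}

def macro_f1(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Flatten ground-truth and predicted masks, report per-class F1, return the macro average."""
    t, p = y_true.ravel(), y_pred.ravel()
    per_class = f1_score(t, p, labels=list(CLASSES), average=None)
    for name, score in zip(CLASSES.values(), per_class):
        print(f"{name:>15s} F1: {score:.4f}")
    return f1_score(t, p, labels=list(CLASSES), average="macro")

if __name__ == "__main__":
    # Toy 256x256 masks standing in for annotated test tiles.
    rng = np.random.default_rng(0)
    gt = rng.integers(0, 3, size=(256, 256))
    # Simulated prediction that agrees with the ground truth ~80% of the time.
    pred = np.where(rng.random((256, 256)) < 0.8, gt, rng.integers(0, 3, size=(256, 256)))
    print(f"Macro-average F1: {macro_f1(gt, pred):.4f}")
```

In practice, the same computation would be accumulated over all test tiles (e.g., by concatenating flattened masks) rather than per tile, so that rare classes such as unhealthy palm are not over- or under-weighted by tile-level averaging.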
