FS_YOLOv8: A Deep Learning Network for Ground Fissures Instance Segmentation in UAV Images of the Coal Mining Area
Keywords: Ground fissure identification, UAV image processing, Instance segmentation, Improved YOLOv8, Coal mining area
Abstract. Ground fissures caused by coal mining seriously damage the ecological environment of the land. Timely, accurate identification and landfill treatment of ground fissures can prevent secondary geological disasters in coal mining areas. Existing deep-learning-based fissure identification methods perform well on surfaces such as roads and walls. Nevertheless, automatic and reliable segmentation of ground fissures in remote sensing images remains a challenge for deep learning networks, owing to the diverse and complex textures of both the fissures and the background in mining areas.
To overcome these challenges, we propose FS_YOLOv8, an improved YOLOv8 instance segmentation network that automatically and efficiently segments ground fissures in coal mining areas. Specifically, a DSPP (Dynamic Snake convolutional Pyramid Pooling) module is incorporated into FS_YOLOv8 to build a multi-scale dynamic snake convolution feature aggregation structure. It replaces the conventional convolution in the SPPF module of YOLOv8 and strengthens the model's ability to extract features of fissures with tubular structures. Furthermore, a D-LKA (Deformable Large Kernel Attention) module is employed to adaptively capture fissure context information. To improve the detection of hard samples in remote sensing images with intricate background and fissure textures, we adopt the Slide Loss function. Finally, experiments are conducted on a ground fissure dataset of unmanned aerial vehicle (UAV) images from coal mining areas. The results demonstrate that FS_YOLOv8 segments ground fissures effectively in complex, large-scale mining areas.
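To make the DSPP idea concrete, the sketch below shows one plausible reading of the module, assuming it keeps the SPPF layout of YOLOv8 (1x1 reduce, three cascaded 5x5 max-pools, concatenation, 1x1 projection) and routes the reduced features through a snake-style convolution. The class name `DSPPSketch`, the channel widths, and the plain 3x3 convolution standing in for a true dynamic snake convolution are assumptions of this sketch, not the paper's implementation.

```python
import torch
import torch.nn as nn

class DSPPSketch(nn.Module):
    """Sketch of a DSPP-style block: SPPF layout with a snake-style
    convolution inserted after the channel reduction. `self.snake` is a
    plain 3x3 conv standing in for a real dynamic snake convolution,
    which would additionally learn per-tap offsets letting the kernel
    bend along thin tubular fissures."""

    def __init__(self, c_in: int, c_out: int, k: int = 5):
        super().__init__()
        c_mid = c_in // 2
        self.reduce = nn.Conv2d(c_in, c_mid, 1)
        self.snake = nn.Conv2d(c_mid, c_mid, 3, padding=1)  # stand-in for dynamic snake conv
        self.pool = nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2)
        self.project = nn.Conv2d(4 * c_mid, c_out, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.snake(self.reduce(x))
        y1 = self.pool(x)
        y2 = self.pool(y1)
        y3 = self.pool(y2)
        # Multi-scale aggregation: concatenate features pooled at growing
        # effective receptive fields, then fuse with a 1x1 projection.
        return self.project(torch.cat((x, y1, y2, y3), dim=1))
```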
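The D-LKA module can likewise be sketched under stated assumptions: the decomposed large-kernel attention of VAN (a depthwise convolution, a dilated depthwise convolution, and a pointwise convolution, used as a multiplicative gate), with the first depthwise stage swapped for torchvision's `DeformConv2d` so the receptive field can follow curved fissures. Making only the first stage deformable, and the kernel sizes chosen here, are simplifications of this sketch rather than the paper's exact design.

```python
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class DeformableLKASketch(nn.Module):
    """Sketch of a D-LKA-style attention gate: VAN-style large-kernel
    decomposition with a deformable first stage."""

    def __init__(self, dim: int, k: int = 5):
        super().__init__()
        self.offset = nn.Conv2d(dim, 2 * k * k, 3, padding=1)  # 2D offset per kernel tap
        self.deform_dw = DeformConv2d(dim, dim, k, padding=k // 2, groups=dim)
        self.dilated_dw = nn.Conv2d(dim, dim, 7, padding=9, groups=dim, dilation=3)
        self.pointwise = nn.Conv2d(dim, dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = self.deform_dw(x, self.offset(x))
        attn = self.pointwise(self.dilated_dw(attn))
        return x * attn  # gate the input with the learned attention map
```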
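Finally, Slide Loss, introduced in YOLO-FaceV2, re-weights samples by their IoU using a piecewise function centered on the batch-mean IoU mu, so that hard samples just below the mean receive the largest weights. A minimal sketch follows, applied to a binary classification loss purely for illustration; the variable names and the toy usage are assumptions of this sketch.

```python
import math
import torch
import torch.nn.functional as F

def slide_weight(iou: torch.Tensor, mu: float) -> torch.Tensor:
    """Piecewise Slide weighting: weight 1 for easy samples with
    IoU <= mu - 0.1, e^(1 - mu) in the hard band just below the mean,
    and e^(1 - x) for samples at or above the mean."""
    w = torch.exp(1.0 - iou)                                                # x >= mu
    w = torch.where(iou < mu, torch.full_like(iou, math.exp(1.0 - mu)), w)  # mu - 0.1 < x < mu
    w = torch.where(iou <= mu - 0.1, torch.ones_like(iou), w)               # x <= mu - 0.1
    return w

# Usage: re-weight a per-sample BCE classification loss. The IoU is
# detached so the weighting itself receives no gradient.
logits = torch.randn(16)
targets = torch.randint(0, 2, (16,)).float()
iou = torch.rand(16)
mu = float(iou.mean())
bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
loss = (slide_weight(iou.detach(), mu) * bce).mean()
```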