UAV image sequence reveals feature damage during hurricane evolution
Keywords: UAV, Hurricane, Damage assessment, Semantic segmentation, Spatial analysis
Abstract. In recent years, hurricanes have struck frequently, causing severe losses to human society. UAV technology, with its high mobility and low cost, has been widely applied to post-disaster damage assessment. However, existing UAV-image assessment methods still fall short in model accuracy and in the rapid analysis of the spatial distribution of disaster impacts. To address these issues, this study proposes H4DNet, a semantic segmentation model that combines the global feature extraction capability of SegFormer with the information reconstruction capability of U-Net to efficiently extract damaged objects from UAV images. Experimental results show that H4DNet achieves 93.71% average accuracy, 87.80% mean class accuracy, and 78.01% mean Intersection over Union (mIoU) on the RescueNet dataset, outperforming the comparison models. Furthermore, this paper introduces the Disaster Damage Index (DDI), which generates a spatial distribution map of disaster impacts by computing the area proportion of damaged objects in each image and applying an Adjusted Inverse Distance Weighting (AIDW) spatial interpolation algorithm. The results indicate that the DDI accurately reflects the severity of affected areas. The study also verifies the energy attenuation process after hurricane landfall through the spatial distribution of objects at different damage levels, providing valuable insights for disaster assessment and emergency response.
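To make the measurement behind the DDI concrete, the sketch below gives one plausible reading of the abstract in Python with NumPy: the DDI of a single UAV image is taken as the pixel-area proportion of damaged-object classes in its segmentation mask, and the per-image values are then spread over a grid with inverse distance weighting. The class indices, function names, and the use of plain IDW in place of the paper's adjusted variant (AIDW) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical indices of damaged-object classes in the predicted mask;
# the actual RescueNet label mapping is not given in the abstract.
DAMAGED_CLASS_IDS = [3, 5, 7]

def disaster_damage_index(mask: np.ndarray) -> float:
    """DDI sketch: area proportion of damaged-object pixels in one image."""
    damaged = np.isin(mask, DAMAGED_CLASS_IDS)
    return float(damaged.sum()) / mask.size

def idw_surface(sample_xy: np.ndarray, sample_ddi: np.ndarray,
                grid_xy: np.ndarray, power: float = 2.0,
                eps: float = 1e-12) -> np.ndarray:
    """Interpolate per-image DDI values onto grid points.

    Plain inverse distance weighting is used here as a placeholder; the
    paper's Adjusted IDW (AIDW) modifies the weighting in a way the
    abstract does not specify.
    """
    # Pairwise distances between grid points (G, 2) and sample locations (S, 2).
    d = np.linalg.norm(grid_xy[:, None, :] - sample_xy[None, :, :], axis=-1)
    w = 1.0 / (d ** power + eps)  # closer samples receive larger weights
    return (w @ sample_ddi) / w.sum(axis=1)
```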
