A COMPARISON BETWEEN CYCLE-GAN BASED FEATURE TRANSLATION AND OPTICAL-SAR VEGETATION INDICES
Keywords: Cycle-GAN, Deep Learning, Vegetation Indices, Feature Translation, Change Detection
Abstract. Optical and microwave remote sensing technologies have become key tools for local and global change detection applications. Optical data has generally been the focus of remote sensing for change detection because its varied spatial and temporal resolutions allow for reliable information. However, the dependence of optical data on weather conditions prevents continuous and up-to-date information. On the other hand, Synthetic Aperture Radar (SAR) data can record polarization information in all weather conditions and at all times, which is critical for change detection under poor weather; nonetheless, SAR is not as precise as optical data for forestry change detection applications because it cannot provide the spectral features of interest. The combined processing of optical and SAR images allows for the retrieval of information of interest with a precision that neither could achieve alone. In this context, a Cycle-Consistent Generative Adversarial Network (CycleGAN)-based deep feature translation method is proposed in this study for change detection. The CycleGAN transfers images from one domain (optical) to another (SAR) within a shared feature space using a cyclic structure. Thus, it can provide continuous and up-to-date information for change detection while preserving the spectral features. The accuracy of the fake images generated by the CycleGAN was evaluated by correlating them with spectral indices (e.g., Normalized Difference Vegetation Index (NDVI), Modified Radar Vegetation Index (mRVI), and Modified Radar Forest Degradation Index (mRFDI)) obtained directly from optical and SAR data. As a result, the best correlation coefficients (R) were found between the real NDVI (optical data) and the fake NDVI (CycleGAN), at 0.98 and 0.97 for two datasets acquired on different dates.
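The evaluation described above rests on two ingredients: a spectral index map computed per pixel and a correlation coefficient between the real and generated maps. A minimal sketch of that pipeline in Python/NumPy is shown below; the band arrays, noise model, and function names are hypothetical stand-ins, not the study's actual data or implementation. Only the standard NDVI definition, (NIR - Red) / (NIR + Red), is used, since the modified radar indices are not specified here.

```python
import numpy as np

def ndvi(nir, red, eps=1e-8):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

def pearson_r(a, b):
    """Pearson correlation coefficient (R) between two flattened index maps."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

# Hypothetical example: a "real" optical NDVI map versus one computed from
# CycleGAN-generated ("fake") bands, simulated here as the real bands plus noise.
rng = np.random.default_rng(0)
real_nir = rng.uniform(0.2, 0.8, (64, 64))
real_red = rng.uniform(0.05, 0.3, (64, 64))
fake_nir = real_nir + rng.normal(0.0, 0.02, (64, 64))  # stand-in for a generated band
fake_red = real_red + rng.normal(0.0, 0.02, (64, 64))

r = pearson_r(ndvi(real_nir, real_red), ndvi(fake_nir, fake_red))
print(f"R between real and fake NDVI: {r:.3f}")
```

In the study's setting, the fake bands would come from the trained CycleGAN generator rather than a noise model; the correlation step itself is unchanged.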