LEARNING HARMONISED PLEIADES AND SENTINEL-2 SURFACE REFLECTANCES
Keywords: Surface reflectance, cross-calibration, machine learning, Pleiades, Sentinel-2
Abstract. In this paper, we investigate the use of machine-learning techniques to produce harmonised surface reflectances between Sentinel-2 and Pleiades images, reducing the impact of differences in sensors, viewing conditions, and atmospheric correction between them. We show that, while a simple linear regression taking Sentinel-2 surface reflectances as the target domain is sufficient when both images are calibrated to Top of Canopy reflectances, the non-linearity provided by a simple Multi-Layer Perceptron (MLP) becomes useful when Pleiades is only corrected to Top of Atmosphere level and the contribution of the atmosphere must be learned. We also show that a Convolutional Neural Network (CNN), unlike a plain MLP, can learn undesired spatial effects such as mis-registration or differences in spatial frequency content, which degrade the image quality of the corrected Pleiades product. We overcome this issue by proposing an ad hoc input convolutional layer that captures those effects and can be unplugged at inference time. Finally, we propose an architecture and loss function that are robust to unmasked clouds and produce a confidence prediction at inference.
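As an illustration of the simplest approach mentioned in the abstract, a per-band affine regression mapping Pleiades Top of Canopy reflectances to the Sentinel-2 target domain could be sketched as follows. This is a minimal sketch, not the paper's implementation: the function names and the synthetic data (a hypothetical gain/offset plus noise) are illustrative assumptions.

```python
import numpy as np

def fit_band_regression(pleiades_band, sentinel2_band):
    """Fit a per-band affine model: sentinel2 ~ gain * pleiades + offset.

    Inputs are 1-D arrays of co-located surface reflectances for one
    spectral band. Returns the fitted (gain, offset) pair.
    """
    gain, offset = np.polyfit(pleiades_band, sentinel2_band, deg=1)
    return gain, offset

def harmonise(pleiades_band, gain, offset):
    """Apply the fitted affine correction to Pleiades reflectances."""
    return gain * pleiades_band + offset

# Synthetic demo (illustrative, not real data): Sentinel-2 reflectances
# differ from Pleiades by a known gain/offset plus small noise, which
# the regression should recover.
rng = np.random.default_rng(0)
plei = rng.uniform(0.0, 0.5, size=1000)                # Pleiades band
s2 = 0.9 * plei + 0.02 + rng.normal(0.0, 1e-3, 1000)   # Sentinel-2 band

gain, offset = fit_band_regression(plei, s2)
corrected = harmonise(plei, gain, offset)
```

Such an affine fit, done independently per spectral band, captures sensor gain/offset differences but, as the abstract notes, cannot model the non-linear atmospheric contribution, which motivates the MLP and CNN variants.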