The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume XLIII-B3-2022
https://doi.org/10.5194/isprs-archives-XLIII-B3-2022-405-2022
30 May 2022

TRANSFER LEARNING WITH LIMITED SAMPLES FOR THE SAME SOURCE HYPERSPECTRAL REMOTE SENSING IMAGES CLASSIFICATION

W. Li, Q. Liu, Y. Wang, and H. Li

Keywords: Hyperspectral image (HSI), Transfer learning, Few-shot learning, Convolutional neural network (CNN), Same source

Abstract. A classification method for hyperspectral datasets with a limited number of samples, based on a transferred convolutional neural network (CNN), is proposed. CNN models require a large number of labeled samples to classify hyperspectral images, yet annotating such images demands considerable time and labor. In our work, a CNN model and transfer learning are combined to address this problem. By pre-training the model on another hyperspectral dataset, the classification results on the target hyperspectral images can be effectively improved when the number of trainable samples is limited. Three transfer approaches are chosen for classifying hyperspectral images, and their performance is compared and analyzed. As the number of samples decreases, transfer learning has an increasing impact on the classification results. Among the three transferred models, freezing the convolutional-layer weights and retraining the fully connected layers yields the best classification performance, reaching an accuracy of 77.23% when the number of samples per class is set to 10. When the number of training samples is 5, the gain in classification accuracy reaches a maximum of 33%. The results indicate that relatively high classification accuracy can be obtained by training on only a limited number of samples when the transferred parameters come from the same domain.
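
The best-performing strategy described in the abstract, freezing the convolutional layers of a source-pretrained CNN and retraining only the fully connected layers on a few target samples, can be sketched as follows. This is a minimal illustration assuming a simple spectral 1-D CNN in PyTorch; the architecture, layer sizes, checkpoint name, and class counts are illustrative assumptions rather than the authors' exact configuration.

```python
# Minimal sketch: freeze convolutional layers, retrain fully connected layers
# on a small labelled target set (architecture and names are assumptions).
import torch
import torch.nn as nn

class HSICNN(nn.Module):
    def __init__(self, n_bands: int, n_classes: int):
        super().__init__()
        # 1-D convolutions over the spectral dimension of each pixel vector.
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):  # x: (batch, 1, n_bands)
        return self.classifier(self.features(x))

# 1) Load weights pre-trained on the source hyperspectral dataset
#    ("source_pretrained.pth" is a hypothetical checkpoint file).
model = HSICNN(n_bands=200, n_classes=9)
state = torch.load("source_pretrained.pth")
model.load_state_dict(state, strict=False)  # classifier head may differ

# 2) Freeze the convolutional feature extractor.
for p in model.features.parameters():
    p.requires_grad = False

# 3) Retrain only the fully connected layers on the few target samples.
optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
```

Because only the small fully connected head is updated, the number of trainable parameters matches the few available target labels, which is the rationale behind the freeze-and-retrain transfer approach compared here.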