The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Copernicus Publications
Volume XLII-3/W10
07 Feb 2020


M. Zhu, B. Wu, Y. N. He, and Y. Q. He

Keywords: Deep Learning, Convolutional Neural Networks, Land Cover Classification, High Resolution Satellite Image, Semantic Segmentation

Abstract. In the coming era of big data, high resolution satellite imagery plays an important role as a rich source of information for a variety of applications. Land cover classification is a major field of remote sensing application; its main task is to divide the pixels or regions in remote sensing imagery into several categories according to application requirements. Recently, machine interpretation methods, including artificial neural networks and decision trees, have developed rapidly and achieved notable results. Compared with traditional methods, deep learning is completely data-driven and can automatically learn the best ways to extract land cover features from high resolution satellite imagery. This study presents a detailed investigation of convolutional neural networks for the classification of complex land cover classes using high resolution satellite imagery. The main contributions of this paper are as follows: (1) To address the uneven spatial distribution of surface coverage, we study the training errors caused by this imbalance. An improved SMOTE algorithm is designed to automate the task of sample augmentation; experimental verification shows that the improved algorithm increases classification accuracy by 2–5% with the same network structure. (2) The main representations of the network are shared between the edge-loss-reinforced structures and semantic segmentation, so the CNN achieves semantic segmentation and edge detection simultaneously. (3) We train and evaluate an Integrated Model on Beijing-2 (BJ-2) satellite remote sensing data, and the total accuracy reaches 89.6%.
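The abstract does not spell out the improved SMOTE variant, but the baseline SMOTE idea it builds on, oversampling a minority class by interpolating between a sample and one of its nearest neighbours, can be sketched as follows. This is a minimal NumPy sketch under assumed defaults (Euclidean distance, uniform interpolation); the function and parameter names are illustrative, not from the paper:

```python
import numpy as np

def smote(X, n_synthetic, k=5, rng=None):
    """Generate synthetic minority-class samples (basic SMOTE).

    X            : (n, d) array of minority-class feature vectors
    n_synthetic  : number of synthetic samples to create
    k            : number of nearest neighbours to interpolate toward
    """
    rng = np.random.default_rng(rng)
    n = len(X)
    # Pairwise Euclidean distances within the minority class.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude each sample itself
    nn = np.argsort(d, axis=1)[:, :k]      # indices of k nearest neighbours

    out = np.empty((n_synthetic, X.shape[1]))
    for i in range(n_synthetic):
        j = rng.integers(n)                # pick a random minority sample
        nb = X[rng.choice(nn[j])]          # pick one of its k neighbours
        gap = rng.random()                 # interpolation factor in [0, 1)
        out[i] = X[j] + gap * (nb - X[j])  # point on the connecting segment
    return out
```

Because each synthetic point lies on a segment between two real minority samples, the augmented set stays inside the convex hull of the original class, which is what makes SMOTE safer than naive duplication for the class-imbalance problem the paper targets.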