Abstract
Crop type mapping is an essential part of effective agricultural management. Automated crop type mapping using remote sensing images is preferred for consistent monitoring of crop types. However, the main obstacle to generating annual crop type maps is the collection of sufficient training data for supervised classification. Classification based on unsupervised domain adaptation, which transfers prior information from the source domain to classify the target domain, can circumvent the impractical task of collecting sufficient training data in the target domain. This study presents self-training with domain adversarial network (STDAN), a novel unsupervised domain adaptation framework for crop type classification. The core purpose of STDAN is to combine adversarial training, which alleviates the spectral discrepancy between domains, with self-training, which automatically generates new training data in the target domain from an existing thematic map or ground truth data. STDAN consists of three analysis stages: (1) initial classification using domain adversarial neural networks; (2) self-training-based updating of training candidates using constraints specific to crop classification; and (3) refinement of the training candidates through iterative classification, followed by final classification. The potential of STDAN was evaluated in six experiments reflecting various domain discrepancy conditions, using unmanned aerial vehicle images acquired in different regions and at different times. In most cases, the classification performance of STDAN was comparable to that of classification using training data collected from the target domain. In particular, the advantage of STDAN was most prominent when the domain discrepancy was substantial. Based on these results, STDAN can be effectively applied to automated cross-domain crop type mapping without analyst intervention when prior information is available in the target domain.
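To make the three-stage procedure concrete, the following is a minimal sketch of the loop the abstract describes. It assumes a nearest-centroid classifier as a stand-in for the DANN model and a simple confidence-plus-prior-map rule as the crop-specific constraint; the function names, threshold, and toy data are illustrative, not the authors' implementation.

```python
# Minimal sketch of the STDAN loop described in the abstract. The DANN
# classifier is replaced by a nearest-centroid stand-in, and the
# "crop-specific constraints" are approximated by (a) agreement with a
# prior thematic map and (b) a confidence threshold -- all assumptions,
# since the abstract does not give the exact rules.
import numpy as np

def nearest_centroid_fit(X, y):
    """Stand-in for the DANN classifier: one centroid per class."""
    classes = np.unique(y)
    centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_predict(X, classes, centroids):
    """Return labels and a crude confidence (distance margin)."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    order = np.argsort(d, axis=1)
    labels = classes[order[:, 0]]
    # Margin between second-best and best distance as a confidence proxy.
    conf = d[np.arange(len(X)), order[:, 1]] - d[np.arange(len(X)), order[:, 0]]
    return labels, conf

def stdan_sketch(X_src, y_src, X_tgt, prior_map, n_iter=3, conf_thresh=0.5):
    """Iterative self-training: stages (1)-(3) of the abstract, simplified."""
    # Stage 1: initial classification trained on the source domain only.
    classes, centroids = nearest_centroid_fit(X_src, y_src)
    for _ in range(n_iter):
        labels, conf = nearest_centroid_predict(X_tgt, classes, centroids)
        # Stage 2: keep target pixels that are confident AND agree with
        # the prior thematic map (our assumed crop-specific constraint).
        keep = (conf > conf_thresh) & (labels == prior_map)
        if not keep.any():
            break
        # Stage 3: retrain on source data plus the new target candidates.
        X_train = np.vstack([X_src, X_tgt[keep]])
        y_train = np.concatenate([y_src, labels[keep]])
        classes, centroids = nearest_centroid_fit(X_train, y_train)
    return nearest_centroid_predict(X_tgt, classes, centroids)[0]

# Toy usage with random two-class "pixels".
rng = np.random.default_rng(0)
X_src = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
y_src = np.array([0] * 50 + [1] * 50)
X_tgt = X_src + 0.8          # simulate a spectral shift between domains
prior_map = y_src.copy()     # pretend an existing crop map is available
print(stdan_sketch(X_src, y_src, X_tgt, prior_map)[:10])
```

In the actual framework, stage 1 would train a domain adversarial neural network on labeled source pixels and unlabeled target pixels, and the stage-2 filter would encode the crop-specific constraints the paper defines; only the overall loop structure follows the abstract's stages (1)-(3).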
Subject
General Earth and Planetary Sciences