Affiliation:
1. Korea Ocean Satellite Center, Korea Institute of Ocean Science and Technology, Busan 49111, Republic of Korea
2. Ocean Climate Response & Ecosystem Report Department, Korea Institute of Ocean Science and Technology, Busan 49111, Republic of Korea
Abstract
Recent advances in deep learning (DL) and unmanned aerial vehicle (UAV) technologies have made it possible to monitor salt marshes more efficiently and precisely. However, few studies have compared the classification performance of DL with that of pixel-based methods for coastal wetland monitoring using UAV data. In particular, most studies have been conducted at the landscape level, and little is known about the performance of species discrimination in very small patches and in mixed vegetation. We constructed a dataset based on UAV-RGB data and compared the performance of pixel-based and DL methods for five scenarios (combinations of annotation type and patch size) in the classification of salt marsh vegetation. Maximum likelihood, a pixel-based classification method, showed the lowest overall accuracy of 73%, whereas the U-Net classification method achieved over 90% accuracy in all classification scenarios. As expected, in the comparison of pixel-based and DL methods, the DL approach achieved the most accurate classification results. Unexpectedly, there was no significant difference in overall accuracy between the two annotation types or the labeling-data (patch) sizes in this study. However, a detailed comparison of the classification results confirmed that polygon-type annotation was more effective than bounding-box annotation for classifying mixed vegetation, and that the smaller labeling-data size was more effective for detecting small vegetation patches. Our results suggest that a combination of UAV-RGB data and DL can facilitate the accurate mapping of coastal salt marsh vegetation at the local scale.
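The comparison described above rests on per-pixel semantic segmentation of UAV-RGB tiles with a U-Net and scoring by overall accuracy (the fraction of correctly classified pixels). As a rough illustration only, the following is a minimal PyTorch sketch of a U-Net-style encoder-decoder and an overall-accuracy metric; it is not the authors' implementation, and the class count, tile size, and layer widths are illustrative assumptions.

```python
# Minimal sketch (assumption, not the authors' code) of a U-Net-style
# segmentation network for UAV-RGB tiles, plus overall-accuracy scoring.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, the standard U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class MiniUNet(nn.Module):
    def __init__(self, n_classes=6):  # e.g. 5 vegetation classes + background (assumption)
        super().__init__()
        self.enc1 = conv_block(3, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, n_classes, 1)

    def forward(self, x):
        # Encoder with skip connections, then decoder with upsampling.
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)  # per-pixel class logits


def overall_accuracy(pred_labels, true_labels):
    # Fraction of pixels whose predicted class matches the reference label.
    return (pred_labels == true_labels).float().mean().item()


if __name__ == "__main__":
    model = MiniUNet(n_classes=6)
    tiles = torch.randn(2, 3, 128, 128)       # dummy UAV-RGB tiles
    preds = model(tiles).argmax(dim=1)         # (2, 128, 128) class map
    truth = torch.randint(0, 6, preds.shape)   # dummy reference labels
    print("overall accuracy:", overall_accuracy(preds, truth))
```

In this sketch, the same overall-accuracy metric could be applied to a per-pixel maximum-likelihood classifier's output map, which is how the pixel-based and DL results are put on a common footing.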
Funder
Korea Institute of Ocean Science and Technology
Subject
General Earth and Planetary Sciences
Cited by
3 articles.