Self-Adaptive-Filling Deep Convolutional Neural Network Classification Method for Mountain Vegetation Type Based on High Spatial Resolution Aerial Images

Author:

Li Shiou 1, Fei Xianyun 1, Chen Peilong 1, Wang Zhen 2, Gao Yajun 2, Cheng Kai 1, Wang Huilong 1, Zhang Yuanzhi 3,4

Affiliation:

1. School of Geomatics and Marine Information, Jiangsu Ocean University, Lianyungang 222002, China

2. Lianyungang Forestry Technical Guidance Station, Lianyungang 222005, China

3. Key Laboratory of Lunar and Deep Space Exploration, National Astronomical Observatory, Chinese Academy of Sciences, Beijing 100101, China

4. School of Astronomy and Space Science, University of Chinese Academy of Sciences, Beijing 100049, China

Abstract

The composition and structure of mountain vegetation are complex and changeable, which makes the integration of Object-Based Image Analysis (OBIA) and Deep Convolutional Neural Networks (DCNNs) urgently needed. However, although studies of such integration continue to increase, few have applied the combination of OBIA and DCNNs to mountain vegetation classification, because it is difficult to obtain enough samples to realize the potential of DCNNs for mountain vegetation type classification, especially with high-spatial-resolution remote sensing images. To address this issue, we propose a self-adaptive-filling (SAF) method that incorporates OBIA to improve the performance of DCNNs in mountain vegetation type classification using high-spatial-resolution aerial images. SAF produces enough regular sample data for DCNNs by filling the irregular objects created by image segmentation with interior adaptive pixel blocks. Likewise, non-sample segmented image objects are shaped into regular rectangular blocks via SAF, and the final class of each object is determined by voting over the DCNN predictions for its blocks. Compared with traditional OBIA methods, SAF generates more samples for the DCNN and makes full use of every pixel of the DCNN input. We designed experiments comparing SAF-DCNN with traditional OBIA and with semantic segmentation methods such as U-Net, MACU-Net, and SegNeXt. The results show that SAF-DCNN outperforms traditional OBIA in accuracy and matches the accuracy of the best-performing semantic segmentation method, while reducing the salt-and-pepper effect (black-and-white noise in the classification results) common in semantic segmentation. Overall, the SAF-based OBIA using DCNNs proposed in this paper is superior to other commonly used methods for vegetation classification in mountainous areas.
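To make the idea in the abstract concrete, the sketch below illustrates one possible reading of the SAF workflow: the pixels outside an irregular segmented object (within its bounding box) are filled with pixels drawn from the object's interior, the filled crop is resampled to a fixed square patch for the DCNN, and an object's label is decided by majority vote over several such patches. This is a minimal, hypothetical illustration, not the authors' implementation; the patch size, the random-sampling fill strategy, the nearest-neighbour resize, and the `model_predict` callable are all assumptions introduced here.

```python
import numpy as np


def saf_fill(image, mask, patch_size=64, rng=None):
    """Turn an irregular segmented object into a regular square patch (sketch).

    image: (H, W, C) array; mask: (H, W) boolean array marking the object.
    Pixels inside the bounding box but outside the object are replaced with
    pixels drawn from the object's interior, so every input pixel fed to the
    DCNN carries object information (assumed fill strategy).
    """
    rng = np.random.default_rng() if rng is None else rng
    ys, xs = np.nonzero(mask)
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    crop = image[y0:y1, x0:x1].copy()
    crop_mask = mask[y0:y1, x0:x1]

    # Fill the non-object background of the bounding box with interior pixels.
    interior = image[ys, xs]                              # (N, C) object pixels
    n_fill = int((~crop_mask).sum())
    crop[~crop_mask] = interior[rng.integers(0, interior.shape[0], size=n_fill)]

    # Resample the filled crop to the fixed DCNN input size (nearest neighbour).
    h, w = crop.shape[:2]
    yy = np.arange(patch_size) * h // patch_size
    xx = np.arange(patch_size) * w // patch_size
    return crop[yy][:, xx]


def classify_object(image, mask, model_predict, patch_size=64, n_votes=5, rng=None):
    """Label one segmented object by majority vote over several SAF patches.

    model_predict: callable mapping a (patch_size, patch_size, C) array to a
    class label; it stands in for the trained DCNN (assumed interface).
    """
    rng = np.random.default_rng() if rng is None else rng
    votes = [model_predict(saf_fill(image, mask, patch_size, rng))
             for _ in range(n_votes)]
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]
```

Under these assumptions, a segmented object of any shape yields as many fixed-size training or inference patches as needed, which is how SAF can both enlarge the sample pool and avoid wasting input pixels on empty background.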

Funder

National Natural Science Foundation of China

Key Laboratory of Coastal Salt Marsh Ecology and Resources, Ministry of Natural Resources

Key subject of “Surveying and Mapping Science and Technology” of Jiangsu Ocean University

Postgraduate Research & Practice Innovation Program of Jiangsu Ocean University

Publisher

MDPI AG

Subject

General Earth and Planetary Sciences

