Research on Grape-Planting Structure Perception Method Based on Unmanned Aerial Vehicle Multispectral Images in the Field
Published: 2022-11-10
Volume: 12, Issue: 11, Page: 1894
ISSN: 2077-0472
Container-title: Agriculture
Short-container-title: Agriculture
Language: en
Authors: Qu Aili, Yan Zhipeng, Wei Haiyan, Ma Liefei, Gu Ruipeng, Li Qianfeng, Zhang Weiwei, Wang Yutan
Abstract
In order to accurately obtain the distribution of large-field grape-planting sites and their planting information in complex environments, a UAV multispectral image semantic segmentation model based on an improved DeepLabV3+ is used. It addresses the problem that large-field grapes in complex environments are affected by scattered planting sites and complex background environments, which make identification of the planting areas less accurate and management more difficult. In this paper, the standard deviation (SD) and inter-band correlation of the UAV multispectral images were first calculated to obtain the best band combinations for large-field grape images, and five preferred texture features and two preferred vegetation indices were screened using color-space transformation and the gray-level co-occurrence matrix. Then, supervised classification methods (maximum likelihood (ML), random forest (RF), and support vector machine (SVM)), an unsupervised classification method (the Iterative Self-Organizing Data Analysis Technique Algorithm, ISODATA), and the improved DeepLabV3+ model were each evaluated against field visual-interpretation results to obtain the best classification model. Finally, the contribution of the classification features to the best model was verified. The results showed that, among the four machine learning methods, SVM obtained the best overall classification accuracy. The DeepLabV3+ deep learning scheme based on spectral information + texture + vegetation index + digital surface model (DSM) achieved the best overall accuracy (OA) and frequency-weighted intersection over union (FW-IOU), at 87.48% and 83.23%, respectively, and the relative error of the extracted grape plantation area was 1.9%. This scheme provides a research basis for accurate interpretation of the planting structure of large-field grapes.
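The band-preference step described above, ranking band combinations by standard deviation and inter-band correlation, is commonly formalized as the Optimum Index Factor (OIF): the sum of the three bands' standard deviations divided by the sum of the absolute pairwise correlation coefficients. The abstract does not name OIF explicitly, so the sketch below is an illustration of that standard formulation, not the paper's exact procedure; the synthetic 5-band cube and band layout are assumptions for demonstration.

```python
import itertools
import numpy as np

def best_band_combination(cube):
    """Rank all 3-band combinations of a multispectral cube (H, W, B)
    by OIF = sum of band SDs / sum of |pairwise correlations|.
    Higher OIF means high per-band variance and low redundancy."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b).astype(float)          # one row per pixel
    sd = flat.std(axis=0)                             # per-band standard deviation
    corr = np.corrcoef(flat, rowvar=False)            # B x B correlation matrix
    scores = {}
    for i, j, k in itertools.combinations(range(b), 3):
        denom = abs(corr[i, j]) + abs(corr[i, k]) + abs(corr[j, k])
        scores[(i, j, k)] = (sd[i] + sd[j] + sd[k]) / denom
    best = max(scores, key=scores.get)
    return best, scores

# Synthetic cube: bands 0 and 1 nearly identical (redundant),
# band 3 has double the variance, bands 2 and 4 are independent.
rng = np.random.default_rng(0)
base = rng.normal(size=(64, 64))
cube = np.stack(
    [base,
     base + 0.01 * rng.normal(size=(64, 64)),
     rng.normal(size=(64, 64)),
     2.0 * rng.normal(size=(64, 64)),
     rng.normal(size=(64, 64))],
    axis=-1,
)
best, scores = best_band_combination(cube)
```

Because bands 0 and 1 are almost perfectly correlated, any combination containing both is heavily penalized by the denominator, so the selected triple never pairs them, which is exactly the redundancy-avoidance behavior the SD/correlation screening is meant to provide.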
Funder: Ningxia Key Research and Development Program
Subjects: Plant Science, Agronomy and Crop Science, Food Science