Abstract
Miniaturised hyperspectral cameras are becoming smaller and more easily accessible, enabling efficient monitoring of agricultural crops using unoccupied aerial systems (UAS). This study’s objectives were to develop and assess the performance of UAS-based hyperspectral cameras in the estimation of quantity and quality parameters of grass sward, including the fresh and dry matter yield, the nitrogen concentration (Ncont) in dry matter (DM), the digestibility of organic matter in DM (the D-value), neutral detergent fibre (NDF), and water-soluble carbohydrates (WSC). Next-generation hyperspectral cameras in visible-near-infrared (VNIR, 400–1000 nm; 224 bands) and shortwave-infrared (SWIR, 900–1700 nm; 224 bands) spectral ranges were used, and they were compared with commonly used RGB and VNIR multispectral cameras. The implemented machine-learning framework identified the most informative predictors of the various parameters, and estimation models were then built using a random forest (RF) algorithm for each camera and its combinations. The results indicated accurate estimations; the best normalised root-mean-square error (NRMSE) was 8.40% for the quantity parameters, and the best NRMSEs for the quality parameters were 7.44% for Ncont, 1% for the D-value, 1.24% for NDF, and 12.02% for WSC. The hyperspectral datasets provided the best results, whereas the worst accuracies were obtained using the crop height model and RGB data. The integration of the VNIR and SWIR hyperspectral cameras generally provided the highest accuracies. This study showed for the first time the performance of novel SWIR-range hyperspectral UAS cameras in an agricultural application.
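The estimation framework the abstract describes (identify the most informative spectral predictors, fit a random forest regressor, score with NRMSE) can be sketched as follows. This is a minimal illustration, not the authors' code: the synthetic spectra, the univariate F-test band selection, and the range-based NRMSE normalisation are all assumptions made for the example.

```python
# Minimal sketch of a band-selection + random-forest estimation pipeline,
# scored with normalised RMSE (NRMSE). Synthetic data stand in for the
# 224-band VNIR reflectance spectra and a sward quality target.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 200 plots x 224 spectral bands; the target (e.g.
# nitrogen concentration) depends on a few "informative" bands plus noise.
n_samples, n_bands = 200, 224
X = rng.random((n_samples, n_bands))
y = 2.0 * X[:, 50] + X[:, 120] + rng.normal(0.0, 0.05, n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Identify the most informative predictors (here via a univariate F-test,
# an assumption for this sketch), then fit the RF on the selected bands.
selector = SelectKBest(f_regression, k=20).fit(X_tr, y_tr)
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(selector.transform(X_tr), y_tr)

# NRMSE as a percentage; here the RMSE is normalised by the target range
# (normalising by the mean is another common convention).
pred = rf.predict(selector.transform(X_te))
rmse = np.sqrt(np.mean((pred - y_te) ** 2))
nrmse = 100.0 * rmse / (y_te.max() - y_te.min())
print(f"NRMSE: {nrmse:.2f}%")
```

In the study, separate models of this kind were built per camera and per camera combination, so the selected bands and the resulting NRMSE differ across sensor configurations.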
Funder
Academy of Finland
European Agricultural Fund for Rural Development
Interreg
Ministry of Agriculture and Forestry of Finland
National Land Survey of Finland
Publisher
Springer Science and Business Media LLC
Subject
General Agricultural and Biological Sciences
Cited by
7 articles.