Abstract
Purpose
This paper proposes a low-cost, low-effort solution for determining the area of corn crops damaged by wildlife, utilising field images collected by an unmanned aerial vehicle (UAV). The proposed solution allows for determining both the percentage of damaged crops and their location.
Methods
The method utilises image segmentation models based on deep convolutional neural networks (the UNet family) and transformers (SegFormer), trained on imagery covering over 300 hectares of diverse corn fields in western Poland. A range of neural network architectures was tested to select the most accurate final solution.
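As a minimal sketch (not the authors' released code), the following shows how a SegFormer model could be configured for the two-class (healthy vs. damaged crop) segmentation task on RGB UAV tiles; the checkpoint name "nvidia/mit-b0", the tile size, and the label assignment are illustrative assumptions.

```python
import torch
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

# Image processor resizes and normalises RGB tiles for the model
processor = SegformerImageProcessor()
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",  # assumed backbone checkpoint, not the paper's trained weights
    num_labels=2,     # assumed labels: 0 = healthy crop, 1 = damaged crop
)

# Dummy RGB tile standing in for a UAV orthophoto patch
image = torch.randint(0, 256, (512, 512, 3), dtype=torch.uint8).numpy()
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2, H/4, W/4)
mask = logits.argmax(dim=1)          # per-pixel class prediction
```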
Results
The tests show that, despite using only easily accessible RGB data from inexpensive, consumer-grade UAVs, the method achieves sufficient accuracy for practical agricultural applications: the Intersection over Union (IoU) metric for segmentation of healthy and damaged crops reaches 0.88.
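For reference, IoU is the ratio of intersecting to unioned pixels between the predicted and ground-truth masks; a minimal sketch of the computation for binary masks (variable names are illustrative, not from the paper):

```python
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over Union for binary masks (1 = damaged crop)."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return intersection / union if union > 0 else 1.0

# Toy example: prediction covers twice the ground-truth area
pred = np.array([[1, 1, 0, 0]] * 4)
target = np.array([[1, 0, 0, 0]] * 4)
print(iou(pred, target))  # 0.5
```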
Conclusion
The proposed method allows for easy calculation of the total percentage of corn crop damage and its visualisation. The processing code and trained model are shared publicly.