Soybean Canopy Stress Classification Using 3D Point Cloud Data
Author:
Therin J. Young 1, Shivani Chiranjeevi 1, Dinakaran Elango 2, Soumik Sarkar 1,3, Asheesh K. Singh 2, Arti Singh 2, Baskar Ganapathysubramanian 1,3, Talukder Z. Jubery 3
Affiliation:
1. Department of Mechanical Engineering, Iowa State University, Ames, IA 50011, USA
2. Department of Agronomy, Iowa State University, Ames, IA 50011, USA
3. Translational AI Research and Education Center, Iowa State University, Ames, IA 50011, USA
Abstract
Automated canopy stress classification for field crops has traditionally relied on single-perspective, two-dimensional (2D) photographs, usually obtained through top-view imaging with unmanned aerial vehicles (UAVs). However, this approach may fail to capture the full extent of plant stress symptoms, which can manifest throughout the canopy. Recent advancements in LiDAR technologies have enabled the acquisition of high-resolution 3D point cloud data for the entire canopy, offering new possibilities for more accurate plant stress identification and rating. This study explores the potential of leveraging 3D point cloud data for improved plant stress assessment. We utilized a dataset of RGB 3D point clouds of 700 soybean plants from a diversity panel exposed to iron deficiency chlorosis (IDC) stress. From this unique set of 700 canopies exhibiting varying levels of IDC, we extracted several representations, including (a) handcrafted IDC symptom-specific features, (b) canopy fingerprints, and (c) latent features. We then trained several classification models to predict plant stress severity from these representations, exhaustively investigating combinations of stress representations and models for the 3D data. We also compared the performance of these classification models against similar models trained using only the associated top-view 2D RGB image of each plant. Among the feature-model combinations tested, the 3D canopy fingerprint features trained with a support vector machine yielded the best performance, achieving higher classification accuracy than the best-performing 2D model, which was built using convolutional neural networks. Our findings demonstrate the utility of color canopy fingerprinting and underscore the importance of considering 3D data when assessing plant stress in agricultural applications.
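As an illustration of the classification step described above, the sketch below trains a support vector machine on precomputed point-cloud features (e.g., canopy fingerprints) and reports held-out accuracy. This is a minimal, hypothetical example, not the authors' pipeline: the feature arrays, the 64-dimensional feature size, and the SVM hyperparameters are placeholder assumptions, and the feature extraction itself is assumed to have been done beforehand.

```python
# Hedged sketch: severity classification from precomputed canopy features.
# X and y below are synthetic placeholders standing in for canopy
# fingerprint features and IDC severity ratings; they are NOT study data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder data: 700 canopies, 64 features each, severity scores 1-5.
X = rng.normal(size=(700, 64))
y = rng.integers(1, 6, size=700)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# Scale features before an RBF-kernel SVM; hyperparameters are
# illustrative defaults, not values reported in the study.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In practice, the placeholder arrays would be replaced by the actual per-canopy feature vectors and severity labels, and the hyperparameters would be tuned (e.g., via cross-validation) rather than left at defaults.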
Funder
Iowa Soybean Association; R.F. Baker Center for Plant Breeding; Plant Sciences Institute; AI Institute for Resilient Agriculture; COALESCE: COntext Aware LEarning for Sustainable CybEr-Agricultural Systems; FACT: A Scalable Cyber Ecosystem for Acquisition, Curation, and Analysis of Multispectral UAV Image Data; Smart Integrated Farm Network for Rural Agricultural Communities; USDA CRIS Project