Affiliation:
1. Tokyo Metropolitan University, 6-6 Asahigaoka, Hino, Tokyo 191-0065, Japan
Abstract
During the period from sowing and planting to harvesting, outdoor crops are directly exposed to the natural environment, including wind, rain, frost, and sunlight. Under such conditions, vegetables change their growth state, shape, and flexibility daily. We aim to develop an agricultural work-support robot that automates monitoring, cultivation, disease detection, and treatment. In recent years, many researchers and venture companies have developed agricultural harvesting robots. In this study, instead of focusing on intensive harvesting operations, we focus on the daily farm operations performed from the beginning of cultivation until just before harvest. Gripping and cutting are therefore regarded as basic functions common to several routine agricultural tasks. To find the target objects in a camera image with a low computational load, this study focuses on branch points, which can be detected and identified even when the stems, lateral branches, and axillary buds are swaying in the wind. A branch point is a characteristic feature that remains close to the working position even when the wind blows. We therefore propose a method that simultaneously detects the candidate branch points and classifies the parts at each branch point into the main stem, lateral branch, and axillary bud. The effectiveness of this method is demonstrated through experimental evaluations on three types of vegetables, regardless of whether their stems are swaying.
Publisher
Fuji Technology Press Ltd.