Affiliation:
1. Henan Key Laboratory of Grain Photoelectric Detection and Control, Henan University of Technology, Zhengzhou 450001, China
2. School of International Education, Kaifeng University, Kaifeng 475001, China
Abstract
RFID technology has opened a new field of wireless sensing, with applications in posture recognition, object localization, and other sensing domains. Because Fresnel zones form around the propagation path between the reader antenna and the tag when an RFID system operates, the signal changes markedly when an object crosses two or more Fresnel zones. Moving objects can therefore be sensed relatively easily, but most existing applications require a tag to be attached to the moving object, which significantly limits their applicability. Existing technologies for detecting static objects in agricultural settings rely mainly on X-ray imaging or high-power radar, which are costly and bulky and thus difficult to deploy at scale. Sensing a static, tag-free target under non-line-of-sight (NLOS) conditions at low cost therefore remains challenging. We utilized RFID technology to sense static foreign objects in agricultural products, taking metal, rock, rubber, and clods, all common in agriculture, as sensing targets. By deploying tag matrices to create a sensing region, we observed the signal variations before and after a target appeared in this region, and determined the targets' positions and types. Specifically, we buried the targets in media of seedless cotton and wheat and detected them with a non-contact method. The results show that, with appropriate tag matrices and a suitably angled single RFID antenna, the matrices' signals are sensitive to a static target's position and physical properties; that is, the signals vary with different targets and positions. We achieved a 100% success rate in locating metallic targets, while the success rate for clods was the lowest at 86%. The recognition rate for the types of all four objects was 100%.
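The localization idea described above, comparing each tag's signal before and after a target appears and attributing the target to the most-affected tag, can be illustrated with a minimal sketch. This is not the authors' implementation; the 3x3 grid, the baseline RSSI values, and the synthetic readings are assumptions for demonstration only.

```python
# Illustrative sketch (assumed, not the paper's actual code): locate a static
# target from per-tag RSSI deviations across a tag matrix.
import numpy as np

def locate_target(baseline, measured):
    """Return the (row, col) of the tag whose RSSI deviates most from its
    empty-scene baseline; that tag's position is taken as the target's."""
    deviation = np.abs(measured - baseline)
    return tuple(int(i) for i in np.unravel_index(np.argmax(deviation),
                                                  deviation.shape))

# Hypothetical 3x3 tag matrix: baseline RSSI (dBm) with no target present.
baseline = np.full((3, 3), -55.0)

# Simulated readings after a metal target is buried near tag (1, 2):
measured = baseline.copy()
measured[1, 2] -= 8.0   # strong attenuation at the nearest tag
measured[1, 1] -= 2.5   # weaker effect on a neighbouring tag

print(locate_target(baseline, measured))  # -> (1, 2)
```

In practice, the deviation pattern (magnitude and sign across neighbouring tags) would also carry information about the target's material, which is what allows type recognition in addition to localization.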