Abstract
Manually extracted agricultural phenotype information is highly subjective and often inaccurate, while image-based extraction is easily disturbed by haze. Moreover, existing agricultural image dehazing methods are ineffective, frequently producing images with blurred texture and distorted color. To address these shortcomings, we propose unpaired image dehazing via a cycle-consistent generative adversarial network for agricultural plant phenotyping (AgriGAN). The algorithm improves dehazing performance by incorporating the atmospheric scattering model into the network and strengthens the discriminator with a whole-detail consistent discrimination method, which increases the discriminator's efficiency and accelerates convergence of the adversarial network to the Nash equilibrium. Finally, dehazed images are obtained by training the network with a combined adversarial loss and cycle-consistency loss. Experiments and a comparative analysis show that AgriGAN improves the dehazing accuracy of agricultural images, retains detailed texture information, and mitigates color deviation. From the dehazed images, useful phenotype information can be obtained, such as crop height, chlorophyll and nitrogen content, and the presence and extent of disease. The algorithm's object identification and information extraction can therefore support crop growth monitoring and yield and quality estimation.
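For reference, a minimal sketch of the training objective implied above, assuming the standard CycleGAN formulation with generators $G$, $F$, discriminators $D_X$, $D_Y$, and a cycle-consistency weight $\lambda$ (this notation is ours, not necessarily the paper's exact loss):

$$
\mathcal{L}(G, F, D_X, D_Y) \;=\; \mathcal{L}_{\mathrm{GAN}}(G, D_Y, X, Y) \;+\; \mathcal{L}_{\mathrm{GAN}}(F, D_X, Y, X) \;+\; \lambda\, \mathcal{L}_{\mathrm{cyc}}(G, F),
$$

where $X$ and $Y$ denote the hazy and clear image domains, respectively.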