L-DIG: A GAN-Based Method for LiDAR Point Cloud Processing under Snow Driving Conditions
Author:
Zhang Yuxiao 1, Ding Ming 1,2, Yang Hanting 1, Niu Yingjie 1, Feng Yan 1, Ohtani Kento 1, Takeda Kazuya 1,3
Affiliation:
1. Graduate School of Informatics, Nagoya University, Furo-cho, Chikusa-Ward, Nagoya 464-8601, Japan
2. Institutes of Innovation for Future Society, Nagoya University, Furo-cho, Chikusa-Ward, Nagoya 464-8601, Japan
3. Tier IV Inc., Nagoya University Open Innovation Center, 1-3, Mei-eki 1-chome, Nakamura-Ward, Nagoya 450-6610, Japan
Abstract
LiDAR point clouds are significantly impacted by snow in driving scenarios, which introduces scattered noise points and phantom objects and thereby compromises the perception capabilities of autonomous driving systems. Current methods for removing snow from point clouds largely rely on outlier filters, which mechanically eliminate isolated points. This research proposes a novel translation model for LiDAR point clouds, ‘L-DIG’ (LiDAR depth images GAN), built upon refined generative adversarial networks (GANs). The model can not only reduce snow noise in point clouds but also synthesize snow points onto clear data. It is trained on depth image representations of point clouds derived from unpaired datasets, complemented by customized loss functions for depth images that enforce scale and structural consistency. To improve snow capture, particularly in the region surrounding the ego vehicle, we have developed a pixel-attention discriminator that operates without downsampling convolutional layers. A second discriminator, equipped with two-step downsampling convolutional layers, is designed to handle snow clusters. This dual-discriminator approach ensures robust and comprehensive performance across diverse snow conditions. The proposed model displays a superior ability to capture snow and object features within LiDAR point clouds. A 3D clustering algorithm is employed to adaptively evaluate different levels of snow conditions, including scattered snowfall and snow swirls. Experimental results demonstrate a clear de-snowing effect and the ability to synthesize snow effects.
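The model operates on depth image (range image) representations of LiDAR point clouds. As a rough illustration of that representation only, the sketch below shows a generic spherical projection of an (N, 3) point cloud onto a 2D depth image; the image size (64×1024) and vertical field of view are assumed, typical values for a rotating LiDAR, and are not taken from the paper.

```python
import numpy as np

def points_to_depth_image(points, h=64, w=1024,
                          fov_up_deg=3.0, fov_down_deg=-25.0):
    """Project an (N, 3) LiDAR point cloud onto an (h, w) depth image.

    Rows correspond to elevation, columns to azimuth, and pixel values
    store range in meters (0 where there is no return). Resolution and
    field-of-view defaults are illustrative assumptions.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    depth = np.linalg.norm(points[:, :3], axis=1)
    valid = depth > 0
    x, y, z, depth = x[valid], y[valid], z[valid], depth[valid]

    fov_up = np.radians(fov_up_deg)
    fov_down = np.radians(fov_down_deg)
    fov = fov_up - fov_down

    yaw = np.arctan2(y, x)        # azimuth in [-pi, pi]
    pitch = np.arcsin(z / depth)  # elevation

    # Normalize angles to pixel coordinates.
    u = 0.5 * (1.0 - yaw / np.pi) * w         # column index
    v = (1.0 - (pitch - fov_down) / fov) * h  # row index
    u = np.clip(np.floor(u), 0, w - 1).astype(np.int32)
    v = np.clip(np.floor(v), 0, h - 1).astype(np.int32)

    # Keep the nearest return when several points fall into one pixel:
    # write farthest points first so closer ones overwrite them.
    order = np.argsort(depth)[::-1]
    img = np.zeros((h, w), dtype=np.float32)
    img[v[order], u[order]] = depth[order]
    return img
```

Under this kind of projection, depth images rendered from snowy and clear-weather sequences would form the two unpaired image domains used for GAN training and, after translation, the images can be back-projected to de-snowed or snow-augmented point clouds.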
Funder
Nagoya University; JSPS KAKENHI
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry
Cited by: 2 articles.