Affiliation:
1. Institute of Computer Science, University of Tartu, Narva Maantee 18, 51009 Tartu, Estonia
Abstract
For autonomous driving, perception is a primary and essential component: it provides insight into the ego vehicle's environment through sensors. Perception is challenging because it must cope with dynamic objects and continuous environmental changes, and the problem worsens when adverse weather such as snow, rain, fog, night light, sandstorms, or strong daylight degrades sensing quality. In this work, we aim to improve the accuracy of camera-based perception, specifically object detection for autonomous driving, in adverse weather. We propose improving YOLOv8-based object detection in adverse weather through transfer learning on data merged from several harsh-weather datasets. Two prominent open-source datasets (ACDC and DAWN) and their merged version were used to detect the primary objects on the road in harsh weather. A set of training weights was obtained by training on the individual datasets, their merged version, and several subsets of those datasets grouped by their characteristics. The resulting weights were then compared by evaluating detection performance on the datasets and subsets mentioned above. The evaluation revealed that training on the custom datasets significantly improved detection performance compared with the YOLOv8 base weights, and that adding more images through the feature-related data-merging technique steadily increased object detection performance.
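Merging YOLO-format datasets such as ACDC and DAWN requires reconciling their class taxonomies before training, since each dataset assigns its own integer IDs to classes. The sketch below illustrates that step only; the class names shown are illustrative assumptions, not the paper's actual label sets.

```python
# Hedged sketch: unify class lists from two YOLO-format datasets and
# remap the class IDs in their label files accordingly. This is one way
# the data merging described in the abstract could be implemented; it is
# not the authors' published code.

def build_unified_classes(*class_lists):
    """Return a unified class list plus one old-id -> new-id map per dataset."""
    unified = []
    maps = []
    for names in class_lists:
        id_map = {}
        for old_id, name in enumerate(names):
            if name not in unified:
                unified.append(name)
            id_map[old_id] = unified.index(name)
        maps.append(id_map)
    return unified, maps

def remap_label_line(line, id_map):
    """Rewrite the class id of one YOLO label line: 'cls cx cy w h'."""
    parts = line.split()
    parts[0] = str(id_map[int(parts[0])])
    return " ".join(parts)

# Illustrative (assumed) class subsets for the two datasets:
acdc_names = ["person", "car", "bicycle"]
dawn_names = ["car", "truck", "person"]
unified, (acdc_map, dawn_map) = build_unified_classes(acdc_names, dawn_names)
# unified -> ["person", "car", "bicycle", "truck"]
print(remap_label_line("0 0.5 0.5 0.2 0.3", dawn_map))  # DAWN "car" -> id 1
```

After remapping, the merged images and labels can be listed in a single dataset configuration and fine-tuned from pretrained YOLOv8 weights in the usual transfer-learning fashion.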
Funder
This research has been financed by the European Social Fund via “ICT programme” measure.
Subject
Electrical and Electronic Engineering, Biochemistry, Instrumentation, Atomic and Molecular Physics, and Optics, Analytical Chemistry
Cited by 18 articles.