Author:
Jörg Gamerdinger, Georg Volk, Sven Teufel, Alexander von Bernuth, Stefan Müller, Dennis Hospach, Oliver Bringmann
Abstract
Robust perception of the environment under a variety of ambient conditions is crucial for autonomous driving. Convolutional Neural Networks (CNNs) achieve high accuracy for vision-based object detection, but are strongly affected by adverse weather conditions such as rain, snow, and fog, as well as soiled sensors. We propose physically correct simulations of these conditions for vision-based systems, since publicly available data sets lack scenarios with different environmental conditions. In addition, we provide a data set of real images containing adverse weather for evaluation. By training CNNs with augmented data, we achieve a significant improvement in robustness for object detection. Furthermore, we present the advantages of cooperative perception to compensate for limited sensor ranges of local perception. A key aspect of autonomous driving is safety; therefore, a robustness evaluation of the perception system is necessary, which requires an appropriate safety metric. In contrast to existing approaches, our safety metric focuses on scene semantics and the relevance of surrounding objects. The performance of our approaches is evaluated using real-world data as well as augmented and virtual reality scenarios.
Publisher:
Springer International Publishing