Affiliation:
1. Valeo Detection Systems GmbH
2. Valeo
Abstract
The rapid evolution of software-defined and autonomous driving systems creates a pressing demand for extensive validation and accelerated development. Large volumes of data are needed to develop and train the neural network algorithms used by autonomous vehicles, whose sensor suites rely on specialized perception algorithms for object detection, classification, and tracking. Sensor data fusion is essential for building a robust system: fusing the raw output of each sensor in the suite, such as images, polygons, and point clouds, either at the individual sensor level or at the object level, provides redundancy, robustness, and safety. One way to secure an ample supply of data is to simulate the physical behavior of each sensor within a simulation framework; however, building a physical sensor simulation is an extensive and intricate task that demands substantial computational power. An alternative is to model the sensor statistically and phenomenologically by reproducing the behavior of its perception stack, which enables faster-than-real-time simulation and thereby accelerates development. This paper describes the development and validation of a phenomenological LIDAR sensor model and its use in the development of sensor fusion algorithms. With this approach, researchers can simulate sensor behavior efficiently, shorten development cycles, and advance algorithms for autonomous systems.
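
To make the phenomenological approach concrete, the sketch below illustrates the general idea of an object-level LIDAR model: ground-truth objects in the sensor frame are filtered by field of view and range, thinned by a range-dependent detection probability, and perturbed with Gaussian measurement noise. All class names, parameters (e.g., max_range, p_detect_at_max), and the specific probability and noise models are illustrative assumptions for exposition only; they are not the model developed and validated in this paper.

```python
import math
import random
from dataclasses import dataclass

# Hypothetical containers for ground truth and detections; fields are
# illustrative assumptions, not the paper's actual interfaces.
@dataclass
class GroundTruthObject:
    x: float  # longitudinal position in sensor frame [m]
    y: float  # lateral position in sensor frame [m]

@dataclass
class LidarDetection:
    x: float
    y: float
    existence_probability: float

def phenomenological_lidar(objects, max_range=150.0, fov_deg=120.0,
                           range_noise_std=0.10, p_detect_at_max=0.5):
    """Toy object-level LIDAR model: field-of-view/range gating, a
    range-dependent detection probability, and Gaussian range noise."""
    detections = []
    for obj in objects:
        r = math.hypot(obj.x, obj.y)
        azimuth = math.degrees(math.atan2(obj.y, obj.x))
        # Reject targets outside the assumed field of view or maximum range.
        if r > max_range or abs(azimuth) > fov_deg / 2.0:
            continue
        # Assumed model: detection probability decays linearly with range.
        p_detect = 1.0 - (1.0 - p_detect_at_max) * (r / max_range)
        if random.random() > p_detect:
            continue
        # Perturb the measured position with range-proportional Gaussian noise.
        noise = random.gauss(0.0, range_noise_std)
        scale = (r + noise) / r if r > 0 else 1.0
        detections.append(LidarDetection(obj.x * scale, obj.y * scale, p_detect))
    return detections

if __name__ == "__main__":
    truth = [GroundTruthObject(30.0, 2.0), GroundTruthObject(180.0, 0.0)]
    for det in phenomenological_lidar(truth):
        print(det)
```

Because a model of this kind operates on object lists rather than ray-traced point clouds, it can run faster than real time, which is the property the abstract highlights for accelerating sensor fusion development.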