Multi-Sensor Data Fusion for Real-Time Multi-Object Tracking
Published: 2023-02-07
Issue: 2
Volume: 11
Page: 501
ISSN: 2227-9717
Container-title: Processes
Language: en
Short-container-title: Processes
Author:
Senel Numan 1, Kefferpütz Klaus 1,2, Doycheva Kristina 2, Elger Gordon 1,2
Affiliation:
1. Technische Hochschule Ingolstadt, Esplanade 10, 85049 Ingolstadt, Germany
2. Fraunhofer-Anwendungszentrum Vernetzte Mobilität und Infrastruktur, Stauffenbergstrasse 2a, 85051 Ingolstadt, Germany
Abstract
Sensor data fusion is essential for environmental perception within smart traffic applications. By using multiple sensors cooperatively, the accuracy and the probability of correct perception are increased, which is crucial in critical traffic scenarios and under bad weather conditions. In this paper, a modular, real-time-capable multi-sensor fusion framework is presented and tested that fuses data at the object-list level from distributed automotive sensors (camera, radar, and LiDAR). The architecture receives an object list of untracked objects from each sensor and combines classical data-fusion algorithms: a coordinate transformation module, an object association module (Hungarian algorithm), an object tracking module (unscented Kalman filter), and a movement compensation module. Owing to this modular design, the framework is adaptable and does not depend on the number or type of sensors; for the same reason, it continues to operate in case of an individual sensor failure, an essential feature for safety-critical applications. The architecture targets environmental perception in challenging, time-critical applications. The framework is tested using simulation and public-domain experimental data; sensor fusion completes in well under 10 milliseconds of computing time on an AMD Ryzen 7 5800H mobile processor using the Python programming language. Furthermore, the object-level multi-sensor approach enables the detection of changes in the extrinsic calibration of the sensors and of potential sensor failures, and a concept was developed to use the multi-sensor framework to identify sensor malfunctions. This feature will be essential for ensuring the functional safety of the sensors in autonomous driving.
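To make the pipeline concrete, the following is a minimal Python sketch of one fusion cycle at the object-list level (coordinate transformation, Hungarian association, UKF tracking; movement compensation is omitted). It is illustrative only and not the authors' implementation: the constant-velocity motion model, the noise and gating values, and helper names such as fuse_cycle and to_common_frame are assumptions. The Hungarian step uses SciPy's linear_sum_assignment and the tracker uses FilterPy's UnscentedKalmanFilter.

```python
# Sketch of one object-list-level fusion cycle:
# transform -> associate (Hungarian) -> track (UKF).
# All models, thresholds, and noise values below are assumptions.
import numpy as np
from scipy.optimize import linear_sum_assignment
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

DT = 0.1    # assumed sensor cycle time [s]
GATE = 4.0  # assumed association gate [m]

def fx(x, dt):
    """Constant-velocity motion model, state x = [px, py, vx, vy]."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    return F @ x

def hx(x):
    """Measurement model: each sensor reports 2D position only."""
    return x[:2]

def make_track(z):
    """Create a UKF track initialised at measurement z = [px, py]."""
    pts = MerweScaledSigmaPoints(n=4, alpha=0.1, beta=2.0, kappa=0.0)
    ukf = UnscentedKalmanFilter(dim_x=4, dim_z=2, dt=DT,
                                fx=fx, hx=hx, points=pts)
    ukf.x = np.array([z[0], z[1], 0.0, 0.0])
    ukf.P *= 10.0             # loose initial uncertainty
    ukf.R = np.eye(2) * 0.5   # assumed measurement noise
    ukf.Q = np.eye(4) * 0.01  # assumed process noise
    return ukf

def to_common_frame(detections, R, t):
    """Rotate/translate one sensor's object list into the fusion frame."""
    return [(R @ d) + t for d in detections]

def fuse_cycle(tracks, detections):
    """Associate detections to tracks (Hungarian) and update the UKFs."""
    for trk in tracks:
        trk.predict()
    if tracks and detections:
        # Cost matrix: Euclidean distance between predictions and detections.
        cost = np.array([[np.linalg.norm(trk.x[:2] - z) for z in detections]
                         for trk in tracks])
        rows, cols = linear_sum_assignment(cost)
        matched = set()
        for r, c in zip(rows, cols):
            if cost[r, c] < GATE:  # reject implausibly distant pairings
                tracks[r].update(detections[c])
                matched.add(c)
        detections = [z for j, z in enumerate(detections) if j not in matched]
    tracks.extend(make_track(z) for z in detections)  # spawn new tracks
    return tracks
```

In use, each sensor's object list would first be mapped into the common frame with to_common_frame and then passed, one list per sensor cycle, to fuse_cycle. Because association and update operate on generic object lists, the cycle is independent of sensor count and type, which mirrors the modularity property described in the abstract.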
Funder
Bundesministerium für Wirtschaft und Energie; Bayerisches Staatsministerium für Wirtschaft, Energie und Landesentwicklung
Subject
Process Chemistry and Technology, Chemical Engineering (miscellaneous), Bioengineering
Cited by
16 articles.