Affiliation:
1. Research and Transfer Center CeMOS, Mannheim University of Applied Sciences, 68163 Mannheim, Germany
2. Department of Computer Science, RPTU Kaiserslautern-Landau, 67663 Kaiserslautern, Germany
Abstract
Detecting vulnerable road users is a major challenge for autonomous vehicles due to their small size. Various sensor modalities have been investigated, including mono or stereo cameras and 3D LiDAR sensors, which are limited by environmental conditions and hardware costs. Radar sensors are a low-cost and robust option, with high-resolution 4D radar sensors being suitable for advanced detection tasks. However, they pose challenges such as sparse, irregularly distributed measurement points and disturbing artifacts. Learning-based approaches utilizing pillar-based networks show potential in overcoming these challenges. However, the severe sparsity of radar data makes detecting small objects with only a few points difficult. We extend a pillar network with our novel Sparsity-Robust Feature Fusion (SRFF) neck, which combines high- and low-level multi-resolution features through a lightweight attention mechanism. While low-level features aid in better localization, high-level features allow for better classification. As sparse input data are propagated through a network, the increasing effective receptive field leads to feature maps of different sparsities. The combination of features with different sparsities improves the robustness of the network for classes with few points.
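The abstract does not specify the SRFF neck beyond "combining high- and low-level multi-resolution features through a lightweight attention mechanism." As a rough, hedged illustration of that general idea, the sketch below fuses a fine low-level map with an upsampled coarse high-level map using a single scalar attention weight per map; all function names and the simplified scalar attention are our own assumptions, not the paper's implementation:

```python
import math

def upsample_nearest(fmap, factor):
    """Nearest-neighbour upsampling of a 2D feature map (list of rows)."""
    out = []
    for row in fmap:
        wide = [v for v in row for _ in range(factor)]
        out.extend([list(wide) for _ in range(factor)])
    return out

def attention_weights(maps):
    """Lightweight attention: softmax over the global mean of each map."""
    means = [sum(sum(r) for r in m) / (len(m) * len(m[0])) for m in maps]
    exps = [math.exp(v) for v in means]
    total = sum(exps)
    return [e / total for e in exps]

def fuse(low, high, factor=2):
    """Fuse a fine low-level map with a coarse high-level map.

    The coarse map is upsampled to the fine resolution, then the two
    maps are blended with attention weights (a weighted element-wise sum).
    """
    high_up = upsample_nearest(high, factor)
    w_low, w_high = attention_weights([low, high_up])
    return [[w_low * a + w_high * b for a, b in zip(ra, rb)]
            for ra, rb in zip(low, high_up)]
```

In a real pillar-based detector the attention would be learned and applied per channel or per location, but the blending structure (upsample, weight, sum) is the same.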
Funder
Federal Ministry of Education and Research, Germany