Affiliation:
1. School of Computer Science and Engineering, Macau University of Science and Technology, China
Abstract
In intelligent transportation systems, various sensors, including radar and conventional frame cameras, are used to improve system robustness in challenging scenarios. The event camera is a novel bio-inspired sensor that has attracted the interest of many researchers. It provides a form of neuromorphic vision, capturing motion information asynchronously at high speed. It therefore offers advantages for intelligent transportation systems that conventional frame cameras cannot match, such as high temporal resolution, high dynamic range, sparse output, and minimal motion blur. This study proposes an E-detector based on event cameras that asynchronously detects moving objects. The main innovation of our framework is that the spatiotemporal domain of the event camera can be adjusted according to different velocities and scenarios. It overcomes the inherent challenges that traditional cameras face when detecting moving objects in complex environments, such as high speeds, complex lighting, and motion blur. Moreover, our approach adopts filter models and transfer learning to improve the performance of event-based object detection. Experiments show that our method detects high-speed moving objects better than conventional cameras paired with state-of-the-art detection algorithms. Our proposed approach is thus highly competitive and extensible, as it can be applied to other scenarios involving high-speed moving objects. The findings are expected to unlock the potential of event cameras in intelligent transportation system applications.
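The core idea of adjusting the spatiotemporal domain can be illustrated with a minimal sketch: events are accumulated over a time window into a 2D frame, and the window length is chosen per scenario (shorter for faster objects). This is an illustrative assumption, not the paper's actual E-detector; the `(t, x, y, polarity)` tuple format and the 240×180 resolution are common event-camera conventions assumed here.

```python
import numpy as np

def accumulate_events(events, t_start, window, shape=(180, 240)):
    """Accumulate events (t, x, y, polarity) arriving within
    [t_start, t_start + window) into a 2D event-count frame.
    A shorter window suits faster-moving objects."""
    frame = np.zeros(shape, dtype=np.int32)
    for t, x, y, p in events:
        if t_start <= t < t_start + window:
            frame[y, x] += 1
    return frame

# Toy stream: two events inside a 10 ms window, one outside it.
events = [(0.001, 5, 3, 1), (0.002, 5, 3, -1), (0.020, 5, 3, 1)]
frame = accumulate_events(events, t_start=0.0, window=0.010)
# frame[3, 5] == 2 — only the first two events fall inside the window
```

The resulting frame can then be fed to a conventional detector; tuning `window` per velocity is the adjustable spatiotemporal domain the abstract describes.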
Funder
Science and Technology Development Fund (FDCT) of Macau
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Networks and Communications, Hardware and Architecture
Cited by
2 articles.