Affiliation:
1. Key Laboratory of Photoelectronic Imaging Technology and System, Ministry of Education of China, Beijing Institute of Technology, Beijing 100081, China
Abstract
Event cameras are bio-inspired neuromorphic sensors that have emerged in recent years, offering high temporal resolution, high dynamic range, low latency, and low power consumption. They can be used to build event-based imaging polarimeters that overcome the limited frame rates and low dynamic ranges of existing systems. Because events do not provide absolute intensity at different angles of polarization (AoPs), recovering the degree of linear polarization (DoLP) with non-division-of-time (non-DoT) event-based imaging polarimeters is an ill-posed problem, and we therefore adopt a data-driven deep learning approach. Deep learning requires large amounts of training data, and constructing a dataset for event-based non-DoT imaging polarimeters demands substantial resources, scene diversity, and time. We propose a method for generating datasets from simulated polarization distributions derived from existing red–green–blue (RGB) images. Combined with the event simulator V2E, the proposed method can easily construct large datasets for network training. We also propose an end-to-end event-based DoLP recovery network to solve the problem of DoLP recovery with event-based non-DoT imaging polarimeters. Finally, we construct a division-of-time event-based imaging polarimeter to simulate an event-based four-channel non-DoT imaging polarimeter. Using real-world polarization events and DoLP ground truth, we demonstrate the effectiveness of the proposed simulation method and network.
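For context, the ill-posedness follows from the standard model of an ideal linear polarizer at angle θ, a relation not stated in the abstract but commonly used in imaging polarimetry:

I(\theta) = \tfrac{S_0}{2}\left(1 + \mathrm{DoLP}\,\cos 2(\theta - \mathrm{AoP})\right)

A non-DoT event sensor reports only per-pixel log-intensity changes within a single polarizer channel, so the absolute total intensity S_0 needed to normalize DoLP is never observed directly. The sketch below illustrates one plausible way to synthesize four polarizer-channel intensity maps from an ordinary RGB image plus a simulated DoLP/AoP field, and to compute the corresponding DoLP ground truth via the linear Stokes parameters. Function and variable names are illustrative assumptions, not the authors' implementation.

import numpy as np

def simulate_polarizer_channels(rgb, dolp, aop, angles_deg=(0, 45, 90, 135)):
    """Synthesize ideal-polarizer intensity channels (assumed sketch).

    rgb  : (H, W, 3) float array in [0, 1], an ordinary RGB frame
    dolp : (H, W) simulated degree of linear polarization in [0, 1]
    aop  : (H, W) simulated angle of polarization in radians
    Returns {angle_deg: (H, W) intensity map}.
    """
    # Treat the luminance of the RGB frame as the total intensity S0.
    s0 = rgb @ np.array([0.299, 0.587, 0.114])
    channels = {}
    for angle in angles_deg:
        theta = np.deg2rad(angle)
        # Ideal linear polarizer: I(theta) = S0/2 * (1 + DoLP * cos 2(theta - AoP))
        channels[angle] = 0.5 * s0 * (1.0 + dolp * np.cos(2.0 * (theta - aop)))
    return channels

def dolp_from_channels(ch):
    """Recover the DoLP ground truth from the four channels (Stokes parameters)."""
    s0 = 0.5 * (ch[0] + ch[45] + ch[90] + ch[135])
    s1 = ch[0] - ch[90]
    s2 = ch[45] - ch[135]
    return np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-8)

Rendering such channels for every frame of a video and passing each channel sequence to an event simulator such as V2E would yield synthetic polarization events paired with DoLP ground truth, in the spirit of the dataset-generation step described above.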
Funder
National Natural Science Foundation of China