Optimizing Camera Exposure Time for Automotive Applications
Authors:
Lin Hao 1,2, Mullins Darragh 1,2, Molloy Dara 1,2,3, Ward Enda 3, Collins Fiachra 3, Denny Patrick 1,4, Glavin Martin 1,2, Deegan Brian 1,2, Jones Edward 1,2
Affiliations:
1. School of Engineering, University of Galway, University Road, H91 TK33 Galway, Ireland
2. Ryan Institute, University of Galway, University Road, H91 TK33 Galway, Ireland
3. Valeo Vision Systems, Tuam, Co. Galway, H54 Y276, Ireland
4. Computer Science and Information Systems (CSIS), Faculty of Science and Engineering, University of Limerick, Castletroy, V94 T9PX Limerick, Ireland
Abstract
Camera-based object detection is integral to advanced driver assistance systems (ADAS) and autonomous vehicle research, and RGB cameras remain indispensable for their spatial resolution and color information. This study investigates exposure time optimization for such cameras, considering image quality in dynamic ADAS scenarios. Exposure time, the period during which the camera sensor is exposed to light, directly influences the amount of information captured. In dynamic scenes, such as those encountered in typical driving, optimizing exposure time is challenging due to the inherent trade-off between signal-to-noise ratio (SNR) and motion blur: extending exposure time to maximize information capture increases SNR, but also increases the risk of motion blur and overexposure, particularly in low-light conditions where objects may not be fully illuminated. The study introduces a comprehensive methodology for exposure time optimization under various lighting conditions, examining its impact on image quality and computer vision performance. Traditional image quality metrics correlate poorly with computer vision performance, highlighting the need for newer metrics that demonstrate improved correlation. The research presented in this paper offers guidance on enhancing single-exposure camera-based systems for automotive applications. By addressing the balance between exposure time, image quality, and computer vision performance, the findings provide a road map for optimizing camera settings for ADAS and autonomous driving technologies, contributing to safety and performance advancements in the automotive landscape.
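The SNR/motion-blur trade-off described in the abstract can be sketched with a simple shot-noise model: signal grows linearly with exposure time while shot noise grows as its square root, so SNR improves roughly 3 dB per doubling of exposure, whereas blur extent grows linearly. The sketch below is illustrative only; the sensor parameters (photon rate, read noise, dark current) and image-plane speed are assumed values, not figures from the study.

```python
import math

def snr_db(photon_rate, t_exp, read_noise=2.0, dark_current=0.1):
    """Shot-noise-limited SNR (in dB) for a single pixel.

    photon_rate:  photoelectrons/s reaching the pixel (scene-dependent; assumed)
    t_exp:        exposure time in seconds
    read_noise:   sensor read noise in electrons RMS (assumed)
    dark_current: dark current in electrons/s (assumed)
    """
    signal = photon_rate * t_exp
    # Total noise: shot noise + dark-current shot noise + read noise, in quadrature.
    noise = math.sqrt(signal + dark_current * t_exp + read_noise ** 2)
    return 20.0 * math.log10(signal / noise)

def motion_blur_px(image_plane_speed, t_exp):
    """Blur extent in pixels: image-plane speed (px/s) times exposure time (s)."""
    return image_plane_speed * t_exp

# Doubling exposure gains ~3 dB SNR but doubles the blur extent.
for t in (0.005, 0.01, 0.02):  # 5, 10, 20 ms
    print(f"t={t * 1e3:4.0f} ms  SNR={snr_db(5000, t):5.1f} dB  "
          f"blur={motion_blur_px(400, t):4.1f} px")
```

Under these assumed numbers, going from 10 ms to 20 ms buys about 3 dB of SNR while doubling blur from 4 px to 8 px, which is the balance the paper's methodology seeks to optimize per lighting condition.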
Funders:
Science Foundation Ireland; European Regional Development Fund