Affiliation:
1. School of Qilu Transportation, Shandong University, Jinan City, Shandong Province, China
Abstract
With the rapid development of convolutional neural networks (CNNs), real-time traffic monitoring, which benefits both traffic optimization and management, has been widely deployed through smart street cameras. However, compared with daytime traffic detection, nighttime traffic detection is limited by the scarcity of labeled annotations and is therefore unstable, inaccurate, and inefficient in practice. To address this issue, this study proposes using augmented nighttime traffic images generated by a cycle-consistent generative adversarial network (CycleGAN) to improve nighttime detection performance. With CycleGAN, transferred nighttime traffic images are generated from daytime traffic images, so the transferred images inherit the existing daytime annotations of traffic instances. For comparison, different learning rates and crop sizes are evaluated for day-to-night image transfer. The previously proposed dense traffic detection network (DTDNet) is then trained on the prepared image data set. Mean average precision (mAP), precision, and recall are used to evaluate and compare training performance. Based on the visualization results, CycleGAN with a learning rate of 2e-5 and a crop size of 64 performs best on day-to-night traffic image transfer with the proposed data set. In terms of training performance, DTDNet trained on a mixture of 60% transferred nighttime images and 40% original images achieves better accuracy across all four categories. Overall, this study offers a possible solution to the training data limitation in nighttime traffic detection and demonstrates the potential of GAN-based data augmentation in the transportation domain.
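The following minimal Python sketch illustrates the augmentation strategy summarized above, assuming PyTorch and an already-trained CycleGAN generator (here named G_day2night, a hypothetical identifier not taken from the paper): daytime images are passed through the generator to obtain transferred nighttime images, which reuse the original daytime annotations, and the detector's training set is mixed at the reported 60/40 ratio.

    # Hypothetical sketch of the GAN-based augmentation pipeline described in
    # the abstract. `G_day2night` is an assumed name for a trained CycleGAN
    # generator mapping daytime traffic images to synthetic nighttime images.
    import random
    import torch
    import torch.nn as nn

    def build_mixed_training_set(day_images, G_day2night: nn.Module,
                                 transfer_ratio: float = 0.6, seed: int = 0):
        """Return a training list containing `transfer_ratio` transferred
        nighttime images and the remainder kept as original daytime images."""
        rng = random.Random(seed)
        images = list(day_images)          # each image: a CxHxW tensor
        rng.shuffle(images)
        n_transfer = int(transfer_ratio * len(images))
        mixed = []
        with torch.no_grad():
            for i, img in enumerate(images):
                if i < n_transfer:
                    # Day-to-night transfer; the daytime annotations are
                    # reused unchanged, since CycleGAN preserves the spatial
                    # layout of the traffic instances.
                    mixed.append(G_day2night(img.unsqueeze(0)).squeeze(0))
                else:
                    mixed.append(img)
        return mixed

The mixed list would then be paired with the unchanged daytime labels and fed to the detection network (DTDNet in the paper); the 60/40 split matches the best-performing configuration reported in the abstract.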
Funder
Shanghai Key Laboratory of Rail Infrastructure Durability and System Safety
Subject
Mechanical Engineering, Civil and Structural Engineering
Cited by
1 article.