Using Deep Learning and Google Street View Imagery to Assess and Improve Cyclist Safety in London

Authors:

Rita Luís 1,2, Miguel Peliteiro 2, Tudor-Codrin Bostan 2, Tiago Tamagusko 3 (ORCID), Adelino Ferreira 3 (ORCID)

Affiliation:

1. Division of Cancer, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, London SW7 2AZ, UK

2. CycleAI, 1800-359 Lisbon, Portugal

3. Research Center for Territory, Transports and Environment (CITTA), Department of Civil Engineering, University of Coimbra, 3030-788 Coimbra, Portugal

Abstract

Cycling is a sustainable mode of transportation with significant benefits for society. The number of cyclists on the streets depends heavily on their perception of safety, which makes it essential to establish a common metric for determining and comparing risk factors related to road safety. This research addresses the identification of cyclists’ risk factors using deep learning techniques applied to a Google Street View (GSV) imagery dataset. The research utilizes a case study approach, focusing on London, and applies object detection and image segmentation models to extract cyclists’ risk factors from GSV images. Two state-of-the-art tools, You Only Look Once version 5 (YOLOv5) and the pyramid scene parsing network (PSPNet101), were used for object detection and image segmentation. This study analyzes the results and discusses the technology’s limitations and potential for improvements in assessing cyclist safety. Approximately 2 million objects were identified, and 250 billion pixels were labeled in the 500,000 images available in the dataset. On average, 108 images were analyzed per Lower Layer Super Output Area (LSOA) in London. The distribution of risk factors, including high vehicle speed, tram/train rails, truck circulation, parked cars and the presence of pedestrians, was identified at the LSOA level using YOLOv5. Statistically significant negative correlations were found between cars and buses, cars and cyclists, and cars and people. In contrast, positive correlations were observed between people and buses and between people and bicycles. Using PSPNet101, building (19%), sky (15%) and road (15%) pixels were the most common. The findings of this research have the potential to contribute to a better understanding of risk factors for cyclists in urban environments and provide insights for creating safer cities for cyclists by applying deep learning techniques.
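The abstract reports Pearson-style correlations between per-LSOA object counts (for example, the negative association between cars and cyclists). A minimal sketch of that kind of computation is shown below; the count vectors are illustrative placeholders, not the study's data.

```python
import numpy as np

# Hypothetical per-LSOA object counts (illustrative only; not the study's data).
# Each index corresponds to one Lower Layer Super Output Area (LSOA).
cars = np.array([120, 95, 140, 60, 180, 75, 110, 50])
cyclists = np.array([8, 15, 5, 22, 3, 18, 10, 25])

# Pearson correlation coefficient between the two count vectors.
r = np.corrcoef(cars, cyclists)[0, 1]
print(f"Pearson r (cars vs. cyclists): {r:.2f}")
```

With counts like these, where car-heavy areas have few detected cyclists, the coefficient comes out strongly negative, mirroring the car–cyclist relationship reported in the abstract; a significance test (p-value) would additionally require, e.g., `scipy.stats.pearsonr`.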

Funder

Research Center for Territory, Transports and Environment (CITTA)

Association for the Development of Civil Engineering

Publisher

MDPI AG

Subject

Management, Monitoring, Policy and Law; Renewable Energy, Sustainability and the Environment; Geography, Planning and Development; Building and Construction

References: 71 articles.

