Author:
Yang, Liu, Jiang, Xu, Sheng, Yang
Abstract
Accurate road information is important for applications such as road maintenance, intelligent transportation, and road network updates. Mobile laser scanning (MLS) can capture road information effectively. However, accurately extracting road edges from large-scale data under complex road conditions, including both structural and non-structural road types, remains difficult. In this study, a robust method is proposed to automatically extract structural and non-structural road edges based on a topological network of laser points between adjacent scan lines and auxiliary surfaces. Road and curb points are extracted mainly from the roughness of the extracted surface, without relying on traditional thresholds (e.g., height jump, slope, and density). Five large-scale road datasets, containing different types of road curbs and complex road scenes, were used to evaluate the practicality, stability, and validity of the proposed method via qualitative and quantitative analyses. The correctness, completeness, and quality of the extracted road edges exceeded 95.5%, 91.7%, and 90.9%, respectively. These results confirm that the proposed method can extract road edges from large-scale MLS datasets without auxiliary intensity, image, or geographic data. The method remains effective regardless of whether the road width is fixed, whether the road is regular, and whether pedestrians and vehicles are present. Most importantly, the proposed method provides a valuable solution for road edge extraction that is useful to road authorities developing intelligent transportation systems, such as those required by self-driving vehicles.
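The abstract states that road and curb points are separated using the roughness of the extracted surface rather than fixed height-jump, slope, or density thresholds. The sketch below is only an illustration of that general idea, not the authors' algorithm: it flags curb candidates along a single scan line by computing a per-point roughness (RMS residual of a local line fit) and marking points whose roughness is a robust outlier relative to the smooth road surface. All names and parameters (local_roughness, curb_candidates, window, k_mad) are hypothetical assumptions for this example.

```python
# Illustrative sketch only; the paper's topological network between adjacent
# scan lines and auxiliary surfaces is not reproduced here.
import numpy as np

def local_roughness(points, window=9):
    """Roughness per point: RMS residual of a line fitted to the heights of a
    sliding window of neighbouring scan-line points."""
    n = len(points)
    half = window // 2
    rough = np.zeros(n)
    # horizontal distance along the scan line, used as the fit abscissa
    d = np.cumsum(np.r_[0.0, np.linalg.norm(np.diff(points[:, :2], axis=0), axis=1)])
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        x, z = d[lo:hi], points[lo:hi, 2]
        # fit z = a*x + b and measure the residual spread (roughness)
        a, b = np.polyfit(x, z, 1)
        rough[i] = np.sqrt(np.mean((z - (a * x + b)) ** 2))
    return rough

def curb_candidates(points, window=9, k_mad=3.0):
    """Flag points whose roughness is an outlier relative to the scan line's
    typical (smooth-road) roughness, using a robust MAD-based cutoff instead
    of a fixed height-jump or slope threshold."""
    rough = local_roughness(points, window)
    med = np.median(rough)
    mad = np.median(np.abs(rough - med)) + 1e-9
    return rough > med + k_mad * mad

# Example: a flat road with a 0.15 m curb step near the 60th point of a scan line.
xs = np.linspace(0, 10, 100)
zs = np.where(xs < 6.0, 0.0, 0.15) + np.random.normal(0, 0.005, 100)
pts = np.c_[xs, np.zeros_like(xs), zs]
print(np.where(curb_candidates(pts))[0])  # indices clustered around the curb step
```

In this toy setup the roughness spikes only where the surface deviates from a locally planar road, which is why the flagged indices cluster around the simulated curb; the actual method additionally exploits the topology between adjacent scan lines, which this single-line sketch does not model.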
Funder
National Natural Science Foundation of China
the National Key Research and Development Program of China
Subject
Electrical and Electronic Engineering, Biochemistry, Instrumentation, Atomic and Molecular Physics, and Optics, Analytical Chemistry
Cited by
4 articles.