Abstract
Determining training samples is a precondition for deep-network training and learning, but at present samples are usually created manually, which limits the application of deep networks. This article therefore proposes an OpenStreetMap (OSM) data-driven method for creating road-positive samples. First, a line segment orientation histogram (LSOH) model is constructed from the OSM data to determine the local road direction. Second, a road homogeneity constraint rule and a road texture feature statistical model are constructed to extract the local road lines, and on the basis of local road lines with the same direction, a polar constraint rule is proposed to determine the local road line set. Then, an iterative interpolation algorithm connects the local road lines on both sides of the gaps between them. Finally, a local texture self-similarity (LTSS) model is implemented to determine the road width, and a center-point autocorrection model together with the random sample consensus (RANSAC) algorithm is used to extract the road centerline; the road width and the road centerline together complete the creation of the road-positive samples. Experiments are conducted on different scenes and different types of images to demonstrate the proposed method and compare it with other approaches. The results show that the proposed method has clear advantages in the accuracy and integrity of the created road-positive samples.
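The abstract's first step builds a line segment orientation histogram (LSOH) from OSM road segments to estimate the local road direction. The paper's exact formulation is not given here, but the idea can be sketched as a length-weighted histogram over segment orientations, with the peak bin taken as the dominant direction. The function name `lsoh_direction`, the bin count, and the input format are illustrative assumptions, not the authors' implementation.

```python
import math

def lsoh_direction(segments, n_bins=18):
    """Estimate the dominant local road direction from line segments.

    segments: list of ((x1, y1), (x2, y2)) endpoint pairs (e.g. from OSM ways).
    Orientations are folded into [0, 180) degrees, accumulated into a
    length-weighted histogram, and the peak bin's center is returned.
    NOTE: a hypothetical sketch of an LSOH, not the paper's exact model.
    """
    bins = [0.0] * n_bins
    width = 180.0 / n_bins
    for (x1, y1), (x2, y2) in segments:
        dx, dy = x2 - x1, y2 - y1
        length = math.hypot(dx, dy)
        if length == 0:
            continue  # degenerate segment carries no orientation
        angle = math.degrees(math.atan2(dy, dx)) % 180.0
        bins[int(angle // width) % n_bins] += length
    peak = max(range(n_bins), key=lambda i: bins[i])
    return (peak + 0.5) * width  # bin-center angle in degrees
```

With 18 bins (10 degrees each), two horizontal segments of total length 15 outweigh a short vertical one, so the returned direction falls in the first bin (center 5.0 degrees).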
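The final step uses the RANSAC algorithm on corrected center points to extract the road centerline. As a minimal sketch of that idea, the standard two-point RANSAC line fit below repeatedly samples a candidate line and keeps the one with the most inliers; the function name, iteration count, and inlier threshold are assumptions for illustration and stand in for the paper's combination with the center-point autocorrection model.

```python
import random

def ransac_line(points, n_iters=200, threshold=1.0, seed=0):
    """Fit a 2-D line to noisy center points with classic RANSAC.

    Samples two distinct points per iteration, forms the line through them,
    counts points within `threshold` of that line, and keeps the model with
    the most inliers. Returns the best sample pair and its inlier list.
    NOTE: a generic RANSAC sketch, not the paper's full centerline pipeline.
    """
    rng = random.Random(seed)
    best_pair, best_inliers = None, []
    for _ in range(n_iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        dx, dy = x2 - x1, y2 - y1
        norm = (dx * dx + dy * dy) ** 0.5
        if norm == 0:
            continue  # coincident sample, no line defined
        # point-to-line distance via the cross-product formula
        inliers = [p for p in points
                   if abs(dy * (p[0] - x1) - dx * (p[1] - y1)) / norm <= threshold]
        if len(inliers) > len(best_inliers):
            best_pair, best_inliers = ((x1, y1), (x2, y2)), inliers
    return best_pair, best_inliers
```

On ten collinear points plus one gross outlier, the winning model recovers exactly the ten collinear points as inliers, which is why RANSAC suits centerline fitting in the presence of mis-detected centers.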
Subject
General Earth and Planetary Sciences
Cited by
2 articles.