Affiliation:
1. School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
2. School of Software Engineering, Chengdu University of Information Technology, Chengdu 610225, China
Abstract
With the development of deep learning and remote sensing technologies in recent years, many semantic segmentation methods based on convolutional neural networks (CNNs) have been applied to road extraction. However, previous deep learning-based road extraction methods primarily used RGB imagery as input and did not exploit the spectral information contained in hyperspectral imagery. These methods can produce discontinuous road predictions where other objects share spectral signatures similar to roads. In addition, images obtained from different Earth remote sensing sensors may have different spatial resolutions, which increases the difficulty of joint analysis. This work proposes the Multiscale Fusion Attention Network (MSFANet) to overcome these problems. Compared to traditional road extraction frameworks, the proposed MSFANet fuses information from different spectra at multiple scales. In MSFANet, multispectral remote sensing data is used alongside RGB remote sensing data as a second network input, providing richer spectral information. The Cross-source Feature Fusion Module (CFFM) calibrates and fuses spectral features at different scales, reducing the impact of noise and redundant features from the different inputs. The Multiscale Semantic Aggregation Decoder (MSAD) fuses multiscale features and global context information layer by layer during upsampling, reducing information loss in the multiscale feature fusion. The proposed MSFANet was evaluated on the SpaceNet dataset and on self-annotated images of Chongzhou, a representative city in China. MSFANet outperforms the baseline HRNet by a large margin: +6.38 IoU and +5.11 F1-score on the SpaceNet dataset, and +3.61 IoU and +2.32 F1-score on the self-annotated Chongzhou dataset. The effectiveness of MSFANet is further demonstrated through comparative experiments with other published methods.
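As a rough illustration of the cross-source fusion idea described in the abstract, the PyTorch sketch below fuses same-scale RGB and multispectral feature maps using a channel-attention recalibration before projecting back to a single stream. The module name, layer layout, and hyperparameters are illustrative assumptions; the abstract does not specify the internals of the CFFM, so this is a minimal sketch of the general technique rather than the authors' implementation.

# Minimal sketch of cross-source feature fusion (assumed design, not the
# published CFFM). Channel attention recalibrates the concatenated RGB and
# multispectral features to suppress noisy or redundant spectral responses.
import torch
import torch.nn as nn

class CrossSourceFusion(nn.Module):
    """Calibrate and fuse same-scale features from an RGB branch and a
    multispectral branch (hypothetical layout)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                          # global context per channel
            nn.Conv2d(2 * channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, 2 * channels, 1),
            nn.Sigmoid(),                                     # per-channel weights in [0, 1]
        )
        self.project = nn.Conv2d(2 * channels, channels, 1)   # fuse back to one stream

    def forward(self, rgb_feat, ms_feat):
        x = torch.cat([rgb_feat, ms_feat], dim=1)  # stack both sources along channels
        x = x * self.attn(x)                       # recalibrate each channel
        return self.project(x)

# Usage: fuse quarter-resolution features from both branches (shapes assumed).
rgb = torch.randn(1, 64, 128, 128)
ms = torch.randn(1, 64, 128, 128)   # multispectral features resampled to match
fused = CrossSourceFusion(64)(rgb, ms)
print(fused.shape)                  # torch.Size([1, 64, 128, 128])

In a full multiscale design, one such fusion block would be applied at each resolution of the two branches, and a decoder such as the MSAD would then aggregate the fused features layer by layer during upsampling.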
Funder
Key Projects of Global Change and Response of Ministry of Science and Technology of China
Science and Technology Support Project of Sichuan Province
Fengyun Satellite Application Advance Plan
Natural Science Foundation of Sichuan Province
Subject
General Earth and Planetary Sciences
Cited by
7 articles.