Authors:
Xie Lijian, Feng Xiuli, Zhang Chi, Dong Yuyi, Huang Junjie, Liu Kaikai
Abstract
Urban functional areas are the basic spatial units of urban planning and management, so their distribution must be identified in a timely manner. Because of the complexity of urban land use, it is difficult to identify urban functional areas from remote sensing images alone. Social perception data can provide additional information for this task. However, remote sensing data and social perception data come from different sources and differ in form, and existing methods cannot comprehensively exploit the characteristics of both for functional area identification. Therefore, in this study we propose a multimodal deep learning method with an attention mechanism that fully utilizes the features of these two modalities, and we apply it to the recognition of urban functional areas. First, the pre-processed remote sensing images, points of interest, and building footprint data are divided by the road network into block-based target units. Next, the remote sensing image features and social perception data features of each target unit are extracted separately by a two-branch convolutional network. Finally, the fused features are refined sequentially along two separate dimensions, channel and spatial, to generate an attention weight map used for the identification and classification mapping of urban functional areas. The model framework was applied to a Ningbo dataset for testing, and the recognition accuracy was above 93%. The experimental results show that, overall, the prediction performance of the deep multimodal fusion framework with an attention mechanism is superior to that of traditional methods. It can provide a reference for the classification of urban land use and data support for urban planning and management.
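The sequential channel-then-spatial attention described above follows a CBAM-style design. Below is a minimal PyTorch sketch of that step: features from the two branches are fused and then reweighted along the channel dimension and the spatial dimension in turn. The concatenation-based fusion, layer sizes, and reduction ratio are illustrative assumptions, not the authors' published implementation.

```python
# Hypothetical sketch of channel + spatial attention over fused two-branch
# features (CBAM-style); all hyperparameters are assumptions for illustration.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        # Pool over the spatial dimensions, then weight each channel.
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        w = torch.sigmoid(avg + mx).unsqueeze(-1).unsqueeze(-1)
        return x * w

class SpatialAttention(nn.Module):
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        # Pool over the channel dimension, then weight each spatial location.
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.conv(s))

class FusedAttention(nn.Module):
    """Concatenate two-branch features, then apply channel and spatial attention."""
    def __init__(self, channels):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, rs_feat, social_feat):
        x = torch.cat([rs_feat, social_feat], dim=1)  # fuse along channels
        return self.sa(self.ca(x))                    # channel first, then spatial

# Example: 64-channel remote sensing features + 64-channel social features.
rs = torch.randn(2, 64, 32, 32)
social = torch.randn(2, 64, 32, 32)
out = FusedAttention(128)(rs, social)
print(out.shape)  # torch.Size([2, 128, 32, 32])
```

The refined feature map would then feed a classifier head that assigns each block-based target unit to a functional-area class.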
Subject
Building and Construction, Civil and Structural Engineering, Architecture
Cited by
11 articles.