Semantic-Enhanced Graph Convolutional Neural Networks for Multi-Scale Urban Functional-Feature Identification Based on Human Mobility

Author:

Chen Yuting 1,2, Zhao Pengjun 1,2,3, Lin Yi 4, Sun Yushi 4, Chen Rui 1,3, Yu Ling 1,3, Liu Yu 5

Affiliation:

1. Department of Urban Planning and Design, Shenzhen Graduate School, Peking University, Shenzhen 518055, China

2. Key Laboratory of Earth Surface System and Human-Earth Relations of Ministry of Natural Resources of China, Shenzhen 518055, China

3. School of Urban and Environmental Sciences, Peking University, Beijing 100091, China

4. Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong, China

5. Institute of Remote Sensing and Geographical Information System, School of Earth and Space Sciences, Peking University, Beijing 100091, China

Abstract

Precise identification of the functional features of urban spatial units is a precondition for urban planning and policy-making. However, inferring unknown attributes of urban spatial units by mining spatial-interaction data remains a challenge in geographic information science. Although neural-network approaches have been widely applied in this field, urban dynamics, spatial semantics, and their relationship with urban functional features have not been discussed in depth. To this end, we propose semantic-enhanced graph convolutional neural networks (GCNNs) to facilitate the multi-scale embedding of urban spatial units, on the basis of which urban land use is identified by leveraging characteristics of human mobility extracted from the largest mobile-phone datasets to date. Given the heterogeneity of multi-modal spatial data, we introduce the combination of a systematic data-alignment method and a generative feature-fusion method for the robust construction of heterogeneous graphs, providing an adaptive solution that improves GCNN performance in node-classification tasks. Our work explicitly examines, for the first time, the scale effect on GCNN backbones. The results show that large-scale tasks are more sensitive to the directionality of spatial interaction, whereas small-scale tasks are more sensitive to its adjacency. Quantitative experiments conducted in Shenzhen demonstrate the superior performance of the proposed framework compared to state-of-the-art methods: the best accuracy is achieved by the inductive GraphSAGE model at the 250 m scale, exceeding the baseline by 25.4%. Furthermore, we explain the role of spatial-interaction factors in the identification of urban land use through the deep learning method.
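The abstract's best-performing backbone, inductive GraphSAGE, classifies spatial units by aggregating the features of their neighbours in a mobility graph. A minimal sketch of one GraphSAGE layer with mean aggregation is shown below; the toy adjacency matrix, feature dimensions, and weight matrix are illustrative assumptions, not the paper's actual data or architecture.

```python
import numpy as np

def sage_layer(H, A, W):
    """One GraphSAGE layer with mean aggregation:
    h_v' = ReLU(W^T @ [h_v ; mean_{u in N(v)} h_u])."""
    deg = A.sum(axis=1, keepdims=True)
    neigh = (A @ H) / np.maximum(deg, 1)        # mean of neighbour features
    concat = np.concatenate([H, neigh], axis=1)  # self || neighbourhood
    return np.maximum(concat @ W, 0)             # ReLU

# Toy example: 4 spatial units linked by an undirected flow graph,
# each with a 3-dimensional mobility feature vector.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
H = rng.standard_normal((4, 3))
W = rng.standard_normal((6, 8))  # concat dim 2*3 -> hidden dim 8
H1 = sage_layer(H, A, W)
print(H1.shape)  # (4, 8)
```

Because the layer's weights are shared across nodes rather than tied to a fixed graph, the model is inductive: the same trained weights can embed spatial units unseen during training, which is what makes GraphSAGE suitable for multi-scale land-use identification.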

Funder

National Natural Science Foundation of China

Shenzhen Science and Technology Innovation Program

Shenzhen Science and Technology Program

Introduction Project of Postdoctoral International Exchange Program

Guangdong Basic and Applied Basic Research Foundation

Publisher

MDPI AG

Subject

Earth and Planetary Sciences (miscellaneous), Computers in Earth Sciences, Geography, Planning and Development

