Enhancing Graph Neural Networks via Memorized Global Information

Author:

Zeng Ruihong¹, Fang Jinyuan², Liu Siwei³, Meng Zaiqiao⁴, Liang Shangsong⁵

Affiliation:

1. School of Computer Science and Engineering, Sun Yat-Sen University, Guangzhou, China

2. Sun Yat-Sen University, Guangzhou, China

3. Department of Machine Learning, Mohamed bin Zayed University of Artificial Intelligence, Masdar City, United Arab Emirates

4. University of Glasgow, Glasgow, United Kingdom

5. School of Data and Computer Science, Sun Yat-Sen University, Guangzhou, China

Abstract

Graph neural networks (GNNs) have gained significant attention for their impressive results on various graph-based tasks. The essential mechanism of GNNs is the message-passing framework, whereby node representations are aggregated from local neighbourhoods. Recently, Transformer-based GNNs have been introduced to learn long-range dependencies, enhancing performance. However, their quadratic computational complexity, due to the attention computation, has constrained their applicability to large-scale graphs. To address this issue, we propose MGIGNN (Memorized Global Information Graph Neural Network), an innovative approach that leverages memorized global information to enhance existing GNNs in both transductive and inductive scenarios. Specifically, MGIGNN captures long-range dependencies by identifying and incorporating global similar nodes, which are defined as nodes exhibiting similar features, structural patterns and label information within a graph. To alleviate the computational overhead of computing embeddings for all nodes, we introduce an external memory module that facilitates the retrieval of embeddings and optimizes performance on large graphs. To enhance memory efficiency, MGIGNN selectively retrieves global similar nodes from a small set of candidate nodes. These candidate nodes are selected from the training nodes based on a sparse node selection distribution with a Dirichlet prior. This selection strategy not only reduces the required memory size but also ensures efficient utilization of computational resources. Through comprehensive experiments conducted on ten widely used real-world datasets, including seven homogeneous datasets and three heterogeneous datasets, we demonstrate that MGIGNN can generally improve the performance of existing GNNs on node classification tasks under both inductive and transductive settings.
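The two mechanisms the abstract describes — selecting a small candidate set via a sparse (Dirichlet-prior) distribution, and retrieving global similar nodes from an external memory to augment local message-passing embeddings — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, the cosine-similarity retrieval, and the mixing weight `lam` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def select_candidates(num_train, k, alpha=0.1):
    # Illustrative sparse selection: a Dirichlet with small concentration
    # (alpha < 1) puts most probability mass on a few nodes; keep the top-k
    # most probable training nodes as memory candidates.
    probs = rng.dirichlet(np.full(num_train, alpha))
    return np.argsort(probs)[-k:]

def retrieve_global_similar(query, memory, top_k=3):
    # Cosine similarity between a query node embedding and the memorized
    # candidate embeddings; return indices of the most similar nodes.
    q = query / np.linalg.norm(query)
    m = memory / np.linalg.norm(memory, axis=1, keepdims=True)
    sims = m @ q
    return np.argsort(sims)[-top_k:][::-1]

def augment(local_emb, memory, top_k=3, lam=0.5):
    # Mix the local (message-passing) embedding with the mean embedding of
    # the retrieved global similar nodes; lam is a hypothetical trade-off.
    idx = retrieve_global_similar(local_emb, memory, top_k)
    return (1 - lam) * local_emb + lam * memory[idx].mean(axis=0)
```

Under this reading, the external memory holds only the `k` candidate embeddings rather than all node embeddings, which is what keeps retrieval tractable on large graphs.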

Publisher

Association for Computing Machinery (ACM)

