Orca: Scalable Temporal Graph Neural Network Training with Theoretical Guarantees

Authors:

Yiming Li¹, Yanyan Shen², Lei Chen³, Mingxuan Yuan⁴

Affiliations:

1. The Hong Kong University of Science and Technology, Hong Kong, China

2. Shanghai Jiao Tong University, Shanghai, China

3. The Hong Kong University of Science and Technology & The Hong Kong University of Science and Technology (Guangzhou), Hong Kong & Guangzhou, China

4. Huawei Noah's Ark Lab, Hong Kong, China

Abstract

Representation learning over dynamic graphs is critical for many real-world applications such as social network services and recommender systems. Temporal graph neural networks (T-GNNs) are powerful representation learning methods and have achieved remarkable effectiveness on continuous-time dynamic graphs. However, T-GNNs suffer from high time complexity, which increases linearly with the number of timestamps and exponentially with the model depth, making them hard to scale to large dynamic graphs. To address these limitations, we propose Orca, a novel framework that accelerates T-GNN training by non-trivially caching and reusing intermediate embeddings. We design MRU, a cache replacement algorithm that is optimal under a practical cache size limit. MRU not only improves the efficiency of T-GNN training by maximizing the number of cache hits but also reduces approximation errors by avoiding the retention and reuse of extremely stale embeddings. Meanwhile, we develop a thorough theoretical analysis of the approximation error introduced by our reuse scheme and offer rigorous convergence guarantees. Extensive experiments validate that Orca obtains a speedup of two orders of magnitude over state-of-the-art baselines while achieving higher precision on large dynamic graphs.
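The abstract describes two ideas in combination: caching intermediate embeddings for reuse, and evicting entries so that cache hits are maximized while extremely stale embeddings are never reused. The sketch below illustrates one plausible reading of that mechanism; the class name, the staleness threshold, and the interpretation of "MRU" as evicting the most-recently-used entry are assumptions for illustration, not the paper's actual implementation (see the linked technical report for the real algorithm).

```python
from collections import OrderedDict

class EmbeddingCache:
    """Illustrative cache for intermediate node embeddings.

    Hypothetical sketch: embeddings computed during T-GNN training are
    cached and reused on later lookups; entries older than `max_staleness`
    are treated as misses so stale embeddings are never reused; when the
    cache is full, the most-recently-used entry is evicted (one reading
    of the MRU policy named in the abstract).
    """

    def __init__(self, capacity, max_staleness):
        self.capacity = capacity
        self.max_staleness = max_staleness
        # node_id -> (embedding, timestamp); the end of the OrderedDict
        # holds the most recently used entry.
        self.store = OrderedDict()

    def get(self, node_id, now):
        entry = self.store.get(node_id)
        if entry is None:
            return None                      # cache miss
        embedding, ts = entry
        if now - ts > self.max_staleness:
            del self.store[node_id]          # too stale to reuse safely
            return None
        self.store.move_to_end(node_id)      # mark as most recently used
        return embedding

    def put(self, node_id, embedding, now):
        if node_id in self.store:
            self.store.move_to_end(node_id)
        elif len(self.store) >= self.capacity:
            self.store.popitem(last=True)    # evict most-recently-used entry
        self.store[node_id] = (embedding, now)
```

Under a temporal training order, evicting the most-recently-used entry keeps older-but-still-fresh embeddings available for the many upcoming events that touch them, while the staleness bound caps the approximation error that reuse can introduce.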

Funder

Hong Kong ITC ITF

Hong Kong RGC AOE Project

Hong Kong RGC GRF Project

National Key Research and Development Program of China

Shanghai Municipal Science and Technology Major Project

National Natural Science Foundation of China

Hong Kong RGC CRF Project

Guangdong Basic and Applied Basic Research Foundation

SJTU Global Strategic Partnership Fund

Hong Kong RGC Theme-based project

China NSFC

Microsoft Research Asia Collaborative Research Grant

HKUST-Webank joint research lab grant

HKUST Global Strategic Partnership Fund

Publisher

Association for Computing Machinery (ACM)

References (75 articles; first 5 shown):

1. 2023. AskUbuntu. http://snap.stanford.edu/data/sx-askubuntu.html.

2. 2023. SuperUser. http://snap.stanford.edu/data/sx-superuser.html.

3. 2023. The technical report. https://github.com/LuckyLYM/Orca/blob/main/technical_report.pdf.

4. 2023. Wiki-talk. http://snap.stanford.edu/data/wiki-talk-temporal.html.

5. 2023. Wikipedia edit history dump. https://meta.wikimedia.org/wiki/Data_dumps.

Cited by 3 articles.

1. TimeSGN: Scalable and Effective Temporal Graph Neural Network;2024 IEEE 40th International Conference on Data Engineering (ICDE);2024-05-13

2. Incorporating Dynamic Temperature Estimation into Contrastive Learning on Graphs;2024 IEEE 40th International Conference on Data Engineering (ICDE);2024-05-13

3. ADGNN: Towards Scalable GNN Training with Aggregation-Difference Aware Sampling;Proceedings of the ACM on Management of Data;2023-12-08
