Authors:
Zhichun Guo, Chunhui Zhang, Yujie Fan, Yijun Tian, Chuxu Zhang, Nitesh V. Chawla
Abstract
Graph neural networks (GNNs) have shown remarkable performance on diverse graph mining tasks. While sharing the same message passing framework, our study shows that different GNNs learn distinct knowledge from the same graph. This implies a potential performance improvement from distilling the complementary knowledge of multiple models. However, knowledge distillation (KD) transfers knowledge from high-capacity teachers to a lightweight student, which deviates from our scenario: GNNs are often shallow. To transfer knowledge effectively, we need to tackle two challenges: how to transfer knowledge from compact teachers to a student with the same capacity, and how to exploit the student GNN's own learning ability. In this paper, we propose a novel adaptive KD framework, called BGNN, which sequentially transfers knowledge from multiple GNNs into a student GNN. We also introduce an adaptive temperature module and a weight boosting module. These modules guide the student toward the appropriate knowledge for effective learning. Extensive experiments demonstrate the effectiveness of BGNN. In particular, we achieve up to 3.05% improvement for node classification and 6.35% improvement for graph classification over vanilla GNNs.
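To make the abstract's idea concrete, below is a minimal sketch of sequential distillation from multiple teacher GNNs into a single student, using a temperature-scaled soft-label loss and boosting-style per-node reweighting. This is not the authors' released BGNN implementation: the fixed scalar temperature, the misclassification-based weight update, and the assumed data object (with x, edge_index, y attributes) are simplifications introduced here only to illustrate the mechanism described above; BGNN learns the temperature adaptively.

import torch
import torch.nn.functional as F

def distill_step(student, teacher, data, optimizer, node_weights, temperature=2.0):
    """One distillation step from a single, already-trained teacher GNN.

    node_weights: per-node weights emphasizing nodes the student still misclassifies
                  (a stand-in for the paper's weight boosting module).
    temperature:  a fixed scalar here; the paper adapts it during training.
    """
    student.train()
    optimizer.zero_grad()
    with torch.no_grad():
        teacher_logits = teacher(data.x, data.edge_index)
    student_logits = student(data.x, data.edge_index)

    # Temperature-scaled KL divergence between teacher and student distributions.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="none",
    ).sum(dim=-1)

    # Supervised loss, re-weighted by the boosting weights (restrict to labeled
    # nodes via a train mask in a real semi-supervised setting).
    hard_loss = F.cross_entropy(student_logits, data.y, reduction="none")
    loss = (node_weights * (soft_loss * temperature ** 2 + hard_loss)).mean()
    loss.backward()
    optimizer.step()
    return student_logits.detach()

def update_weights(node_weights, student_logits, labels, step=0.5):
    """Increase the weight of misclassified nodes (boosting-style reweighting)."""
    wrong = (student_logits.argmax(dim=-1) != labels).float()
    new_weights = node_weights * torch.exp(step * wrong)
    return new_weights / new_weights.mean()

The student is trained against each teacher in turn (e.g. a GCN teacher, then a GAT teacher), calling update_weights between steps so that later teachers focus the student on nodes it still gets wrong; this mirrors the sequential, boosted transfer the abstract describes, under the stated simplifications.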
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
5 articles.
1. Towards Cross-lingual Social Event Detection with Hybrid Knowledge Distillation;ACM Transactions on Knowledge Discovery from Data;2024-08-27
2. Package Arrival Time Prediction via Knowledge Distillation Graph Neural Network;ACM Transactions on Knowledge Discovery from Data;2024-02-28
3. SGD-MLP: Structure Generation and Distillation using a graph free MLP;Proceedings of the 7th Joint International Conference on Data Science & Management of Data (11th ACM IKDD CODS and 29th COMAD);2024-01-04
4. Collective Computational Intelligence Challenges and Opportunities;Lecture Notes in Computer Science;2024
5. Symbolic Prompt Tuning Completes the App Promotion Graph;Lecture Notes in Computer Science;2024