Affiliation:
1. School of Computer Science and Engineering, Sun Yat‐sen University, Guangzhou, Guangdong, China
2. Foshan Power Supply Bureau, Guangdong Power Grid, China Southern Power Grid, Foshan, Guangdong, China
Abstract
Recently, graph neural networks (GNNs) have attracted much attention in the field of machine learning due to their remarkable success in learning from graph‐structured data. However, deploying GNNs in practice faces a critical bottleneck of high communication and computation complexity, which arises from the frequent exchange of graph data during model training, especially in communication‐limited scenarios. To address this issue, we propose a novel framework of federated graph neural networks, in which multiple mobile users collaboratively train a global GNN model in a federated manner. Incorporating federated learning into GNN training reduces the communication overhead of the system, protects the data privacy of local users, and significantly lowers the computational complexity of the system. We further introduce a greedy user‐selection scheme for the federated graph neural networks, in which the wireless bandwidth is dynamically allocated among users so that more users can participate in the federated training. We also provide a convergence analysis of the federated training to gain insights into the impact of critical parameters on the system design. Finally, we perform simulations on the Coriolis Ocean for ReAnalysis (CORA) dataset and demonstrate the advantages of the proposed method.
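To make the described training procedure concrete, the following is a minimal sketch (not the authors' implementation) of one round of federated training with a greedy, bandwidth‐constrained user selection followed by a FedAvg‐style aggregation. The names `greedy_select`, `fedavg`, `bandwidth_demand`, and the scalar "local update" are hypothetical placeholders; the actual per‐user GNN update and bandwidth model are assumptions.

```python
import numpy as np

def greedy_select(users, bandwidth_demand, total_bandwidth):
    """Greedily admit users in order of increasing bandwidth demand
    until the wireless bandwidth budget is exhausted (a simple greedy
    selection, assumed here for illustration)."""
    order = sorted(users, key=lambda u: bandwidth_demand[u])
    selected, used = [], 0.0
    for u in order:
        if used + bandwidth_demand[u] <= total_bandwidth:
            selected.append(u)
            used += bandwidth_demand[u]
    return selected

def fedavg(local_params, weights):
    """Weighted average of locally updated parameter vectors (FedAvg-style)."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return sum(w * p for w, p in zip(weights, local_params))

# Toy usage: 5 users, random bandwidth demands, and simulated local updates
# standing in for each user's local GNN training on its own graph data.
rng = np.random.default_rng(0)
users = list(range(5))
demand = {u: rng.uniform(1.0, 4.0) for u in users}
global_model = np.zeros(3)

for rnd in range(10):
    chosen = greedy_select(users, demand, total_bandwidth=6.0)
    # Each selected user performs a (here: simulated) local update of the global model.
    local_params = [global_model + rng.normal(scale=0.1, size=3) for _ in chosen]
    sample_counts = [100 for _ in chosen]  # e.g., sizes of the users' local graphs
    global_model = fedavg(local_params, sample_counts)
```

In this sketch the server only exchanges model parameters with the selected users rather than raw graph data, which is the mechanism by which the federated framework reduces communication overhead and preserves local data privacy.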
Funder
Natural Science Foundation of Guangdong Province