Abstract
Federated learning has been shown to be efficient for training a global model without collecting data from multiple entities onto a centralized server. However, model performance, communication traffic, and data privacy and security remain key concerns in federated learning. In this paper, a composition–decomposition based federated learning method, denoted CD-FL, is proposed. In the CD-FL approach, the global model, composed of K sub-models with the same architecture, is decomposed and broadcast to all clients. Each client randomly chooses one sub-model, updates its parameters on its own dataset, and uploads the updated sub-model to the server. All sub-models, both before and after updating, are then clustered into K clusters to form the global model for the next round. Experimental results on the Fashion-MNIST, CIFAR-10, EMNIST, and Tiny-ImageNet datasets demonstrate the effectiveness of the proposed method in terms of model performance and communication traffic.
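The round structure described above can be sketched in a few lines. The following is a minimal toy illustration assembled only from the abstract, not the authors' implementation: sub-models are represented as flattened parameter vectors, the local update is a stand-in for client training, and k-means is assumed as the clustering step. All names (`cd_fl_round`, `local_update`, `PARAM_DIM`, etc.) are hypothetical.

```python
# Toy sketch of one CD-FL round, assuming flattened parameter vectors
# and k-means clustering; not the authors' code.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

PARAM_DIM = 32     # flattened parameter size of one sub-model (assumed)
K = 4              # number of sub-models composing the global model
NUM_CLIENTS = 10

def local_update(params, client_data):
    """Stand-in for a client's local training on its own dataset (assumed)."""
    return params + 0.01 * client_data.mean() * rng.standard_normal(PARAM_DIM)

def cd_fl_round(global_submodels, client_datasets):
    """One composition-decomposition round, per the abstract:
    1. Server broadcasts all K sub-models to the clients.
    2. Each client randomly picks one sub-model, updates it locally, uploads it.
    3. Server pools the sub-models before and after updating and clusters
       them into K clusters; the centroids form the next global model.
    """
    uploaded = []
    for data in client_datasets:
        k = rng.integers(K)                        # random sub-model selection
        uploaded.append(local_update(global_submodels[k], data))

    pool = np.vstack([global_submodels, np.array(uploaded)])
    kmeans = KMeans(n_clusters=K, n_init=10, random_state=0).fit(pool)
    return kmeans.cluster_centers_                 # next round's K sub-models

# Toy usage with random sub-models and client datasets.
submodels = rng.standard_normal((K, PARAM_DIM))
datasets = [rng.standard_normal(100) for _ in range(NUM_CLIENTS)]
for _ in range(3):
    submodels = cd_fl_round(submodels, datasets)
print(submodels.shape)  # (K, PARAM_DIM)
```

Note that only one sub-model is transmitted per client per round, which is the source of the communication-traffic savings claimed in the abstract.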
Funder
National Natural Science Foundation of China
Shanxi Provincial Key Research and Development Project
Natural Science Foundation of Shanxi Province
Publisher
Springer Science and Business Media LLC
Subject
Computational Mathematics, Engineering (miscellaneous), Information Systems, Artificial Intelligence
Cited by
1 article.
1. The Performance Analysis of Federated Learning Methods for IoT with Big Data. 2024 11th International Conference on Computing for Sustainable Global Development (INDIACom), 2024-02-28.