A Personalized Federated Learning Method Based on Clustering and Knowledge Distillation

Authors:

Zhang Jianfei 1, Shi Yongqiang 1

Affiliation:

1. School of Computer Science and Technology, Changchun University of Science and Technology, Changchun 130031, China

Abstract

Federated learning (FL) is a distributed machine learning paradigm that preserves privacy. However, data heterogeneity among clients means that the shared global model obtained after training cannot fit the distribution of each client's dataset, and model performance degrades. To address this problem, we propose a personalized federated learning method based on clustering and knowledge distillation, called pFedCK. In this algorithm, each client holds an interactive model that participates in global training and a personalized model that is trained only locally. The two models perform mutual knowledge distillation through the feature representations of their middle layers and their soft predictions. In addition, so that an interaction model obtains model information only from clients with similar data distributions and avoids interference from other heterogeneous information, the server clusters clients according to the similarity of the parameter variations uploaded by the different interaction models in each training round. Through this clustering, interaction models with similar data distributions cooperate to better fit the local dataset distributions, and the personalized models indirectly obtain more valuable information, improving their performance. Finally, we conduct simulation experiments on three benchmark datasets under different data-heterogeneity scenarios. Compared to single-model algorithms, the accuracy of pFedCK improves by an average of 23.4% and 23.8% over FedAvg and FedProx, respectively; compared to typical personalization algorithms, its accuracy improves by an average of 0.8% and 1.3%, and a maximum of 1.0% and 2.9%, over FedDistill and FML.
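The two mechanisms the abstract describes, mutual knowledge distillation between the paired models and server-side clustering of clients by the similarity of their uploaded parameter variations, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the temperature `T`, feature weight `beta`, cosine-similarity `threshold`, and the greedy clustering rule are all assumed placeholders.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax producing soft predictions."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    """Mean KL(p || q) over a batch of probability vectors."""
    p, q = np.clip(p, eps, 1.0), np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q), axis=-1).mean())

def mutual_distillation_loss(logits_student, logits_teacher,
                             feat_student, feat_teacher, T=2.0, beta=0.5):
    """Distillation term transferring one model's knowledge to the other:
    KL on temperature-softened predictions plus MSE on middle-layer features.
    Applied in both directions, this gives the mutual distillation step."""
    soft_s = softmax(logits_student, T)
    soft_t = softmax(logits_teacher, T)
    kd = kl_divergence(soft_t, soft_s) * T * T            # soft-prediction term
    feat = float(np.mean((np.asarray(feat_student)
                          - np.asarray(feat_teacher)) ** 2))  # feature term
    return kd + beta * feat

def cluster_by_update_similarity(deltas, threshold=0.9):
    """Greedy server-side clustering: clients whose flattened parameter-update
    vectors have cosine similarity >= threshold to a cluster's first member
    join that cluster; otherwise they start a new one."""
    deltas = [np.asarray(d, dtype=float).ravel() for d in deltas]
    clusters = []
    for i, d in enumerate(deltas):
        for c in clusters:
            rep = deltas[c[0]]
            cos = d @ rep / (np.linalg.norm(d) * np.linalg.norm(rep) + 1e-12)
            if cos >= threshold:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters
```

Only the interaction models within a cluster would then be aggregated together, so each personalized model receives knowledge filtered through clients with similar data distributions.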

Funder

“Research on Machine Learning Methods Based on Multi-party Participation”

Science & Technology Development Program of Jilin Province, China

Publisher

MDPI AG

References (22 articles)

1. McMahan, B., Moore, E., Ramage, D., Hampson, S., and Arcas, B.A. (2017, January 20–22). Communication-efficient learning of deep networks from decentralized data. Proceedings of the Artificial Intelligence and Statistics Conference, Fort Lauderdale, FL, USA.

2. Li, T., Sahu, A.K., Zaheer, M., Sanjabi, M., Talwalkar, A., and Smith, V. (2020, January 2–4). Federated optimization in heterogeneous networks. Proceedings of the Machine Learning and Systems, Austin, TX, USA.

3. Gao, L., Fu, H., Li, L., Chen, Y., Xu, M., and Xu, C. (2022, January 18–24). FedDC: Federated Learning with Non-IID Data via Local Drift Decoupling and Correction. Proceedings of the 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), New Orleans, LA, USA.

4. Collins, L., Hassani, H., Mokhtari, A., and Shakkottai, S. (2021). Exploiting Shared Representations for Personalized Federated Learning. arXiv.

5. Deng, Y., Kamani, M., and Mahdavi, M. (2020). Adaptive Personalized Federated Learning. arXiv.

Cited by 3 articles.

