Addressing Heterogeneity in Federated Learning with Client Selection via Submodular Optimization

Authors:

Zhang Jinghui¹, Wang Jiawei¹, Li Yaning¹, Xin Fa¹, Dong Fang¹, Luo Junzhou¹, Wu Zhihua²

Affiliation:

1. Southeast University, China

2. Baidu Inc., China

Abstract

Federated learning (FL) has been proposed as a privacy-preserving distributed learning paradigm. It differs from traditional distributed learning in two main aspects: systems heterogeneity, meaning that the clients participating in training differ significantly in system performance, including CPU frequency, dataset size, and transmission power; and statistical heterogeneity, meaning that the data distributions among clients are non-independent and identically distributed (Non-IID). As a result, selecting clients at random significantly reduces the training efficiency of FL. In this paper, we propose a client selection mechanism that accounts for both systems and statistical heterogeneity, aiming to improve time-to-accuracy performance by trading off the impact of system performance differences and data distribution differences among clients on training efficiency. First, client selection is formulated as a combinatorial optimization problem that jointly optimizes systems and statistical performance. We then generalize it to a submodular maximization problem with a knapsack constraint and propose the Iterative Greedy with Partial Enumeration (IGPE) algorithm to greedily select suitable clients. The approximation ratio of IGPE is then analyzed theoretically. Extensive experiments verify that the time-to-accuracy performance of IGPE outperforms the compared algorithms in a variety of heterogeneous environments.
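To make the selection step concrete, the sketch below illustrates the general scheme the abstract describes: cost-benefit greedy selection with partial enumeration for maximizing a monotone submodular utility under a knapsack (budget) constraint. This is a minimal illustration, not the authors' exact IGPE algorithm; the coverage utility, per-client costs, and budget are illustrative assumptions.

```python
# Minimal sketch (assumed example, not the paper's exact IGPE):
# greedy submodular maximization under a knapsack constraint,
# with partial enumeration over small seed sets.

from itertools import combinations

def greedy_knapsack_with_enumeration(clients, cost, utility, budget, seed_size=2):
    """Select a subset of clients maximizing a monotone submodular
    `utility` subject to the total `cost` not exceeding `budget`.

    Partial enumeration: try every feasible seed set of at most
    `seed_size` clients, greedily extend each by marginal gain per
    unit cost, and keep the best feasible solution found.
    """
    best_set, best_val = set(), 0.0

    for r in range(seed_size + 1):
        for seed in combinations(clients, r):
            selected = set(seed)
            spent = sum(cost[c] for c in selected)
            if spent > budget:
                continue
            remaining = set(clients) - selected
            # Greedy extension: repeatedly add the client with the
            # largest marginal utility per unit cost that still fits.
            while remaining:
                base = utility(selected)
                best_c, best_ratio = None, 0.0
                for c in remaining:
                    if spent + cost[c] > budget:
                        continue
                    gain = utility(selected | {c}) - base
                    if gain / cost[c] > best_ratio:
                        best_c, best_ratio = c, gain / cost[c]
                if best_c is None:
                    break
                selected.add(best_c)
                spent += cost[best_c]
                remaining.remove(best_c)
            val = utility(selected)
            if val > best_val:
                best_set, best_val = selected, val
    return best_set, best_val


if __name__ == "__main__":
    # Toy example: each client "covers" some data classes (a submodular
    # coverage utility); cost stands in for per-round time.
    classes_of = {"c1": {0, 1}, "c2": {1, 2}, "c3": {3}, "c4": {0, 2, 3}, "c5": {4}}
    cost = {"c1": 2.0, "c2": 1.0, "c3": 1.5, "c4": 3.0, "c5": 1.0}

    def coverage(S):
        return float(len(set().union(*(classes_of[c] for c in S)))) if S else 0.0

    chosen, value = greedy_knapsack_with_enumeration(
        list(classes_of), cost, coverage, budget=4.0)
    print(chosen, value)
```

Classical results show that greedy-with-partial-enumeration schemes of this kind achieve constant-factor approximation guarantees for monotone submodular maximization under a knapsack constraint; per the abstract, the paper analyzes IGPE's approximation ratio for its own formulation.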

Publisher

Association for Computing Machinery (ACM)

Subject

Computer Networks and Communications


Cited by 1 article.
