Affiliation:
1. Graduate School of Information Science and Technology, Department of Creative Informatics, The University of Tokyo, Tokyo 113-8654, Japan
Abstract
This paper explores the potential of communication-efficient federated learning (FL) in modern distributed systems. FL is an emerging distributed machine learning technique that trains a single model across multiple geographically dispersed clients without centralizing their raw data. This paper surveys approaches to communication-efficient FL, including model-update strategies, compression techniques, resource management for the edge and cloud, and client selection. We also review the optimization techniques associated with communication-efficient FL, such as compression schemes and structured updates. Finally, we highlight open research challenges and discuss potential future directions for communication-efficient FL.
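As a concrete illustration of the compression schemes the abstract refers to, the sketch below shows top-k gradient sparsification, one widely used technique for reducing uplink traffic in FL. This is a generic minimal example written for this summary, not code from the surveyed paper; the function names and the NumPy-based setup are illustrative assumptions.

```python
import numpy as np


def top_k_sparsify(grad: np.ndarray, k: int):
    """Keep only the k largest-magnitude entries of a gradient vector.

    Returns (indices, values): the sparse message a client would send
    to the server instead of the full dense gradient.
    """
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]


def densify(idx: np.ndarray, vals: np.ndarray, dim: int) -> np.ndarray:
    """Server side: rebuild a dense update from the sparse message."""
    out = np.zeros(dim)
    out[idx] = vals
    return out


# A client compresses a 1000-dimensional gradient down to its 10
# largest entries, shrinking the uplink payload by roughly 99% at the
# cost of a lossy (sparsified) update.
rng = np.random.default_rng(0)
g = rng.normal(size=1000)
idx, vals = top_k_sparsify(g, k=10)
recovered = densify(idx, vals, g.size)
print(np.count_nonzero(recovered))  # 10 nonzero entries survive
```

In practice such schemes are usually paired with error feedback (accumulating the discarded residual locally), which the survey's discussion of structured updates also touches on.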
Funder
National Institute of Information and Communications Technology (NICT), JAPAN
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry
References: 216 articles.
1. A survey on federated learning; Zhang; Knowl.-Based Syst., 2021.
2. Federated learning: A survey on enabling technologies, protocols, and applications; Aledhari; IEEE Access, 2020.
3. A survey on federated learning: The journey from centralized to distributed on-site learning and beyond; AbdulRahman; IEEE Internet Things J., 2020.
4. Federated Learning: Privacy and Incentive; Wang, T., Rausch, J., Zhang, C., Jia, R., and Song, D.; Springer, 2020.
5. Security and Privacy in Cyberspace; Kaiwartya, O., Kaushik, K., Gupta, S.K., Mishra, A., and Kumar, M.; Springer Nature, 2022.
Cited by
6 articles.