Efficient Federated Learning for Feature Aggregation with Heterogenous Edge Devices
Published: 2023-12-01
Issue: 1
Volume: 2665
Page: 012007
ISSN: 1742-6588
Container-title: Journal of Physics: Conference Series
Short-container-title: J. Phys.: Conf. Ser.
Author: Liu Fei, Xiong Zheng, Yu Wei, Wu Jia, Kong Zheng, Ji Yunhang, Xu Suwei, Ji Mingtao
Abstract
Federated learning is a powerful distributed machine learning paradigm for feature aggregation and learning across multiple heterogeneous edge devices, owing to its ability to preserve data privacy. However, training on heterogeneous devices is inefficient and incurs considerable communication overhead. Progressive learning is a promising approach for improving efficiency. Since progressive learning partitions the training process into multiple stages, it is necessary to determine the number of rounds for each stage and to balance the trade-off between saving energy and improving model accuracy. Through pilot experiments, we find that the profile reflecting the relationship between round allocation and model quality remains similar across different hyper-parameter configurations, and we also observe that model quality is lossless if the complete model receives sufficient training. Based on these observations, we formulate an optimization problem that minimizes the energy consumption of all devices under a model-quality constraint. We then design a polynomial-time algorithm for this problem. Experimental results demonstrate the superiority of the proposed algorithm under various settings.
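To make the aggregation step concrete, below is a minimal sketch of one federated-averaging round in the spirit of FedAvg (McMahan et al., 2017), where each device's parameters are averaged with weights proportional to its local data size. All names here are illustrative assumptions, not the paper's actual implementation, which additionally schedules rounds per progressive-learning stage.

```python
# Hypothetical sketch of one FedAvg-style aggregation round.
# client_weights: one flat parameter list per edge device.
# client_sizes: number of local training samples per device.

def fedavg_round(client_weights, client_sizes):
    """Average per-client parameter vectors, weighted by local data size."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    global_weights = [0.0] * n_params
    for w, n in zip(client_weights, client_sizes):
        for i, wi in enumerate(w):
            # Each client's contribution is scaled by its share of the data.
            global_weights[i] += (n / total) * wi
    return global_weights

# Example: two devices with unequal data sizes (1 vs. 3 samples).
clients = [[1.0, 2.0], [3.0, 4.0]]
sizes = [1, 3]
print(fedavg_round(clients, sizes))  # [2.5, 3.5]
```

In the paper's setting, the open question is how many such rounds to allocate to each progressive-learning stage so that total device energy is minimized while the model-quality constraint still holds.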
Subject: Computer Science Applications, History, Education