Abstract
Multi-center heterogeneous data are a prominent topic in federated learning. The data held by clients and centers often do not follow a normal distribution, which poses significant challenges for learning. Based on the assumption that the client data follow a multivariate skew-normal distribution, we improve the DP-Fed-mv-PPCA model. We use a Bayesian framework to construct prior distributions for the local parameters, and apply expectation-maximization and quasi-Newton algorithms to obtain robust parameter estimates. A clipping algorithm and a differential privacy mechanism are then used to handle the model parameters that lack a closed-form solution and to provide a privacy guarantee. Finally, we verify the effectiveness of our model on synthetic data and real-world Internet of Vehicles data.
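To make the clipping and noise-addition step concrete, the sketch below shows a generic Gaussian-mechanism treatment of a local parameter update. It is an illustration only, not the paper's exact procedure: the function name `clip_and_privatize` and the parameters `clip_norm` and `noise_multiplier` are assumed for the example.

```python
import numpy as np

def clip_and_privatize(local_update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip a local parameter update to a fixed L2 norm and add Gaussian noise.

    Generic Gaussian-mechanism sketch (not the paper's exact algorithm):
    `clip_norm` bounds the sensitivity of the shared update, and the noise
    scale `noise_multiplier * clip_norm` calibrates the privacy guarantee.
    """
    rng = np.random.default_rng() if rng is None else rng
    update = np.asarray(local_update, dtype=float)
    norm = np.linalg.norm(update)
    clipped = update / max(1.0, norm / clip_norm)      # L2 clipping to bound sensitivity
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=clipped.shape)
    return clipped + noise                             # privatized update sent to the server

# Example: privatize one client's local update before aggregation
noisy_update = clip_and_privatize(np.array([0.8, -2.3, 1.1]), clip_norm=1.0, noise_multiplier=0.5)
```

In this kind of scheme, only the clipped and perturbed update leaves the client, so the server aggregates privatized quantities rather than raw local statistics.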
Funder
Education Science Planning Foundation of Jilin
Natural Science Foundation of Jilin Province
Publisher
Springer Science and Business Media LLC
References (33 articles)
1. McMahan, H. B., Moore, E., Ramage, D., Hampson, S. & y Arcas, B. A. Communication-efficient learning of deep networks from decentralized data. In AISTATS (2017).
2. Agarwal, N., Kairouz, P. & Liu, Z. The skellam mechanism for differentially private federated learning. In NeurIPS (2021).
3. Asoodeh, S., Chen, W.-N., du Pin Calmon, F. & Özgür, A. Differentially private federated learning: An information-theoretic perspective. In 2021 IEEE International Symposium on Information Theory (ISIT), 344–349 (2021).
4. Geyer, R. C., Klein, T. & Nabi, M. Differentially private federated learning: A client level perspective. arXiv:1712.07557 (2017).
5. Balelli, I., Silva, S. & Lorenzi, M. A differentially private probabilistic framework for modeling the variability across federated datasets of heterogeneous multi-view observations. arXiv:2204.07352 (2022).
Cited by
2 articles.