Abstract
Unprotected gradient exchange in federated learning (FL) systems may lead to gradient leakage attacks. CKKS is a promising approximate homomorphic encryption scheme for protecting gradients, owing to its unique capability of performing operations directly on ciphertexts. However, configuring CKKS security parameters involves a trade-off between correctness, efficiency, and security, and an evaluation gap exists regarding how these parameters impact computational performance. Additionally, the maximum vector length that CKKS can encrypt at once, as recommended by the Homomorphic Encryption Standardization, is 16384, which hampers its widespread adoption in FL when encrypting layers with numerous neurons. To protect gradients’ privacy in FL systems while maintaining practical performance, we comprehensively analyze the influence of security parameters such as the polynomial modulus degree and the coefficient modulus on homomorphic operations. Building on our evaluation findings, we provide a method for selecting the optimal multiplication depth while meeting operational requirements. We then introduce an adaptive segmented encryption method tailored for CKKS, circumventing its encryption length constraint and enhancing its ability to encrypt neural network models. Finally, we present FedSHE, a privacy-preserving and efficient Federated learning scheme with adaptive Segmented CKKS Homomorphic Encryption. FedSHE is implemented on top of the federated averaging (FedAvg) algorithm and is available at https://github.com/yooopan/FedSHE. Our evaluation results affirm the correctness and effectiveness of the proposed method, demonstrating that FedSHE outperforms existing homomorphic encryption-based federated learning schemes in terms of model accuracy, computational efficiency, communication cost, and security level.
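To make the segmented-encryption idea concrete, the sketch below shows chunked CKKS encryption of a flattened parameter vector that exceeds the per-ciphertext slot limit (slot count = polynomial modulus degree / 2). It assumes a TenSEAL-style CKKS backend; the parameter values, the chunking policy, and the helper names segment_encrypt and segment_decrypt are illustrative assumptions, not the exact configuration or code used by FedSHE.

```python
# Illustrative sketch only: assumes TenSEAL as the CKKS backend; the parameter
# choices and helper names are assumptions, not FedSHE's exact configuration.
import numpy as np
import tenseal as ts


def make_ckks_context(poly_modulus_degree=16384,
                      coeff_mod_bit_sizes=(60, 40, 40, 60),
                      scale_bits=40):
    """Create a CKKS context; usable slot count is poly_modulus_degree / 2."""
    ctx = ts.context(ts.SCHEME_TYPE.CKKS,
                     poly_modulus_degree=poly_modulus_degree,
                     coeff_mod_bit_sizes=list(coeff_mod_bit_sizes))
    ctx.global_scale = 2 ** scale_bits
    ctx.generate_galois_keys()
    return ctx


def segment_encrypt(ctx, flat_params, slot_count):
    """Split a long parameter vector into chunks of <= slot_count and encrypt each."""
    chunks = [flat_params[i:i + slot_count]
              for i in range(0, len(flat_params), slot_count)]
    return [ts.ckks_vector(ctx, chunk.tolist()) for chunk in chunks]


def segment_decrypt(ciphertexts, original_length):
    """Decrypt every chunk and reassemble the original vector."""
    flat = np.concatenate([np.array(c.decrypt()) for c in ciphertexts])
    return flat[:original_length]


if __name__ == "__main__":
    ctx = make_ckks_context()
    slots = 16384 // 2  # 8192 slots per ciphertext for this context
    weights = np.random.randn(50_000)  # longer than a single ciphertext can hold
    enc_chunks = segment_encrypt(ctx, weights, slots)
    # Ciphertext-side aggregation (as in FedAvg summation) proceeds chunk by chunk:
    summed = [c + c for c in enc_chunks]  # stand-in for summing two clients' updates
    recovered = segment_decrypt(summed, len(weights)) / 2.0
    assert np.allclose(recovered, weights, atol=1e-3)
```

Because every chunk is a self-contained CKKS ciphertext, the aggregator can add corresponding chunks from different clients without decryption, which is what allows the segmented approach to scale past the 16384-slot recommendation for large layers.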
Publisher
Springer Science and Business Media LLC