Authors: Jins de Jong, Bart Kamphorst, Shannon Kroes
Abstract
We present a differentially private extension of the block coordinate descent algorithm by means of objective perturbation. The algorithm iteratively performs linear regression in a federated setting on vertically partitioned data. In addition to a privacy guarantee, we derive a utility guarantee; a tolerance parameter indicates how much the differentially private regression may deviate from the analysis without differential privacy. The algorithm’s performance is compared with that of the standard block coordinate descent algorithm on both artificial test data and real-world data. We find that the algorithm is fast and able to generate practical predictions with single-digit privacy budgets, albeit with some accuracy loss.
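To make the underlying iteration concrete, the following is a minimal sketch of block coordinate descent for least-squares regression on vertically partitioned features, where each party repeatedly refits only its own coefficient block against the residual left by the other parties. This is an illustrative assumption-laden toy, not the authors' algorithm: the function name block_coordinate_descent, the noise_scale parameter, and the Gaussian perturbation are placeholders and do not reproduce the paper's calibrated objective perturbation or its federated communication protocol.

import numpy as np

def block_coordinate_descent(X_blocks, y, n_iter=50, noise_scale=0.0, rng=None):
    """Toy block coordinate descent for least squares on vertically
    partitioned features.

    X_blocks    : list of (n, d_k) arrays, one feature block per party
    y           : (n,) response vector
    noise_scale : stand-in perturbation level; 0 gives the non-private baseline
                  (NOT the paper's calibrated objective perturbation)
    """
    rng = np.random.default_rng() if rng is None else rng
    betas = [np.zeros(Xk.shape[1]) for Xk in X_blocks]
    preds = [Xk @ bk for Xk, bk in zip(X_blocks, betas)]  # per-party partial fits

    for _ in range(n_iter):
        for k, Xk in enumerate(X_blocks):
            # Residual after removing all other parties' contributions.
            residual = y - (sum(preds) - preds[k])
            # Local least-squares solve on this party's feature block only.
            beta_k, *_ = np.linalg.lstsq(Xk, residual, rcond=None)
            # Illustrative noise injection (placeholder for a DP mechanism).
            beta_k = beta_k + rng.normal(scale=noise_scale, size=beta_k.shape)
            betas[k] = beta_k
            preds[k] = Xk @ beta_k
    return betas

A small usage example with two hypothetical parties holding disjoint feature sets:

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=200)
blocks = [X[:, :2], X[:, 2:]]  # party 1 holds 2 features, party 2 holds 3
betas = block_coordinate_descent(blocks, y, n_iter=100)
print(np.concatenate(betas))

In the non-private setting (noise_scale=0) this iteration converges to the ordinary least-squares fit; the paper's contribution lies in how the perturbation is calibrated so that the deviation from that baseline stays within a chosen tolerance while meeting a differential privacy budget.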