Affiliation:
1. Department of Mathematics and Computer Science, University of Southern Denmark, Odense 5230, Denmark
Abstract
Motivation
Federated learning enables privacy-preserving machine learning in the medical domain because the sensitive patient data remain with their owners and only parameters are exchanged between the data holders. The federated scenario introduces specific challenges related to the decentralized nature of the data, such as batch effects and differences in study populations between the sites. Here, we investigate the challenges of moving classical analysis methods to the federated domain, specifically principal component analysis (PCA), a versatile and widely used tool that often serves as an initial step in machine learning and visualization workflows. We provide implementations of different federated PCA algorithms and evaluate their accuracy on high-dimensional biological data under realistic sample distributions across multiple data sites, as well as their ability to preserve downstream analyses.
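To make the setting concrete: the samples (patients) are split across sites, while all sites share the same feature space. The sketch below is a simplified illustration, not the paper's implementation, of how exact PCA is already possible in this setting by exchanging only aggregate statistics (sample counts, per-feature sums, and a scatter matrix) rather than raw patient records; the NumPy-based single-process simulation and all variable names are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
# Toy stand-ins for three local datasets with the same 5 features.
sites = [rng.normal(size=(n, 5)) for n in (40, 25, 35)]

# Round 1: each site reports its sample count and per-feature sums,
# so the aggregator can form the global mean without seeing any rows.
n_total = sum(x.shape[0] for x in sites)
global_mean = sum(x.sum(axis=0) for x in sites) / n_total

# Round 2: each site sends its local scatter matrix around the global mean.
scatter = sum((x - global_mean).T @ (x - global_mean) for x in sites)
cov = scatter / (n_total - 1)
eigenvalues, components = np.linalg.eigh(cov)  # PCA of the global covariance

# Sanity check: identical to the covariance of the (hypothetically) pooled data.
pooled = np.vstack(sites)
assert np.allclose(cov, np.cov(pooled, rowvar=False))

For high-dimensional biological data, however, transmitting a full feature-by-feature covariance matrix quickly becomes impractical, which is one motivation for the iterative and approximate federated PCA algorithms evaluated in this work.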
Results
Federated subspace iteration converges to the centralized solution even for unfavorable data distributions, while approximate methods introduce error. Larger sample sizes at the individual study sites improve the accuracy of the approximate methods. Approximate methods may be sufficient for coarse data visualization, but they are vulnerable to outliers and batch effects. Before the analysis, the PCA algorithm, as well as the number of eigenvectors to compute, should be chosen carefully to avoid unnecessary communication overhead.
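As a companion to the result on federated subspace iteration, the following sketch shows the basic communication pattern: each site transmits only a d-by-k product of its local data with the current candidate subspace, an aggregator sums these contributions and re-orthonormalizes, and the iteration approaches the top-k principal components of the pooled data. It is a deliberately simplified illustration (fixed iteration count, pre-centered data, no secure aggregation); the function name federated_subspace_iteration and the NumPy simulation are assumptions, not the paper's code.

import numpy as np

def federated_subspace_iteration(sites, k, iters=200, seed=1):
    """Toy simulation: 'sites' is a list of local (pre-centered) data matrices."""
    d = sites[0].shape[1]
    rng = np.random.default_rng(seed)
    G, _ = np.linalg.qr(rng.normal(size=(d, k)))   # shared random starting subspace
    for _ in range(iters):
        # Each site contributes X_i^T (X_i G), a d x k matrix; raw rows stay local.
        H = sum(x.T @ (x @ G) for x in sites)      # equals (sum_i X_i^T X_i) @ G
        G, _ = np.linalg.qr(H)                     # aggregator re-orthonormalizes
    return G                                       # approximate top-k eigenvectors

# Toy comparison against centralized PCA on the pooled, centered data.
rng = np.random.default_rng(2)
pooled = rng.normal(size=(300, 20)) @ rng.normal(size=(20, 20))
pooled -= pooled.mean(axis=0)
sites = np.array_split(pooled, 3)
G = federated_subspace_iteration(sites, k=2)
reference = np.linalg.svd(pooled, full_matrices=False)[2][:2].T
print(np.abs(np.diag(G.T @ reference)))  # values near 1 indicate agreement

Each round costs one d-by-k upload per site, so the number of requested eigenvectors k directly drives the communication overhead mentioned above.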
Availability and implementation
Simulation code and notebooks for federated PCA can be found at https://gitlab.com/roettgerlab/federatedPCA; the code for the federated app is available at https://github.com/AnneHartebrodt/fc-federated-pca
Supplementary information
Supplementary data are available at Bioinformatics Advances online.
Publisher
Oxford University Press (OUP)
Subject
Cell Biology, Developmental Biology, Embryology, Anatomy
Cited by: 5 articles.