Affiliation:
1. Department of Computer Science, Weizmann Institute of Science, Herzl St 234, Rehovot 7630031, Israel
Abstract
We consider the problem of sparse normal means estimation in a distributed setting with communication constraints. We assume there are $M$ machines, each holding $d$-dimensional observations of a $K$-sparse vector $\boldsymbol \mu $ corrupted by additive Gaussian noise. The $M$ machines are connected in a star topology to a fusion center, whose goal is to estimate the vector $\boldsymbol \mu $ with a low communication budget. Previous works have shown that to achieve the centralized minimax rate for the $\ell _2$ risk, the total communication must be high—at least linear in the dimension $d$. This phenomenon occurs, however, at very weak signals. We show that at signal-to-noise ratios (SNRs) that are sufficiently high—but not enough for recovery by any individual machine—the support of $\boldsymbol \mu $ can be correctly recovered with significantly less communication. Specifically, we present two algorithms for distributed estimation of a sparse mean vector corrupted by either Gaussian or sub-Gaussian noise. We then prove that above certain SNR thresholds, with high probability, these algorithms recover the correct support with total communication that is sublinear in the dimension $d$. Furthermore, the communication decreases exponentially as a function of signal strength. If in addition $KM\ll \tfrac{d}{\log d}$, then with an additional round of sublinear communication, our algorithms achieve the centralized rate for the $\ell _2$ risk. Finally, we present simulations that illustrate the performance of our algorithms in different parameter regimes.
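The high-SNR regime described in the abstract can be illustrated with a minimal simulation. The sketch below is not the paper's actual algorithm; it assumes a simple voting scheme in which each machine sends only the indices of its top-$K$ coordinates in absolute value (order $K \log d$ bits per machine, sublinear in $d$), and the fusion center keeps the $K$ most-voted coordinates. The parameter values (`d`, `K`, `M`, `snr`) are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, K, M = 1000, 5, 20   # dimension, sparsity, number of machines
snr = 4.0               # per-coordinate signal strength (assumed high-SNR regime)

# A K-sparse mean vector mu with support chosen at random.
mu = np.zeros(d)
support = rng.choice(d, size=K, replace=False)
mu[support] = snr

# Each machine observes mu corrupted by i.i.d. standard Gaussian noise
# and transmits only the indices of its K largest coordinates in
# absolute value -- O(K log d) bits, sublinear in d.
votes = np.zeros(d, dtype=int)
for _ in range(M):
    x = mu + rng.standard_normal(d)
    top = np.argpartition(np.abs(x), -K)[-K:]
    votes[top] += 1

# Fusion center: the K coordinates receiving the most votes form the
# support estimate.
est_support = np.argpartition(votes, -K)[-K:]
print(set(est_support.tolist()) == set(support.tolist()))
```

At this signal strength, individual machines may rank some noise coordinates above signal coordinates, yet the aggregated votes concentrate on the true support, which is the qualitative phenomenon the abstract describes: support recovery from many weak views with total communication far below $Md$.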
Funder
Israel Science Foundation
Minerva Foundation
Publisher
Oxford University Press (OUP)
Subject
Applied Mathematics, Computational Theory and Mathematics, Numerical Analysis, Statistics and Probability, Analysis