Funders
Basic and Applied Basic Research Foundation of Guangdong Province
National Natural Science Foundation of China
References (51 articles)
1. Srivastava, Distributed asynchronous constrained stochastic optimization, IEEE J. Sel. Top. Sign. Proces., 2011.
2. Chang, Distributed learning in the nonconvex world: From batch data to streaming and beyond, IEEE Signal Process. Mag., 2020.
3. Pu, Asymptotic network independence in distributed stochastic optimization for machine learning: Examining distributed and centralized stochastic gradient descent, IEEE Signal Process. Mag., 2020.
4. Lian, Can decentralized algorithms outperform centralized algorithms? A case study for decentralized parallel stochastic gradient descent, in: Advances in Neural Information Processing Systems, 2017.
5. M. Assran, N. Loizou, N. Ballas, M.G. Rabbat, Stochastic Gradient Push for Distributed Deep Learning, in: International Conference on Machine Learning, 2019, pp. 344–353.