1. Statistically preconditioned accelerated gradient method for distributed optimization;Hendrikx,2020
2. Communication-efficient distributed optimization using an approximate Newton-type method;Shamir,2014
3. On convergence of distributed approximate Newton methods: globalization, sharper bounds and beyond;Yuan;J. Mach. Learn. Res.,2020
4. Giant: globally improved approximate Newton method for distributed optimization;Wang,2018
5. An optimal algorithm for decentralized finite-sum optimization;Hendrikx;SIAM J. Control Optim.,2021