1. Dan Alistarh, Demjan Grubic, Jerry Li, Ryota Tomioka, and Milan Vojnovic. 2017. QSGD: Communication-efficient SGD via gradient quantization and encoding. In Advances in Neural Information Processing Systems (NIPS).
2. Dmitrii Avdiukhin and Shiva Kasiviswanathan. 2021. Federated learning under arbitrary communication patterns. In International Conference on Machine Learning. PMLR, 425--435.
3. Keith Bonawitz, Hubert Eichner, Wolfgang Grieskamp, Dzmitry Huba, Alex Ingerman, Vladimir Ivanov, Chloe Kiddon, Jakub Konecny, Stefano Mazzocchi, H. Brendan McMahan, et al. 2019. Towards federated learning at scale: System design. In Machine Learning and Systems (MLSys).
4. Alireza Fallah, Aryan Mokhtari, and Asuman Ozdaglar. 2020. Personalized federated learning with theoretical guarantees: A model-agnostic meta-learning approach. In Advances in Neural Information Processing Systems (NIPS).
5. Yuanxiong Guo, Ying Sun, Rui Hu, and Yanmin Gong. 2022. Hybrid local SGD for federated learning with heterogeneous communications. In International Conference on Learning Representations (ICLR).