1. Beutel, D.J., Topal, T., Mathur, A., Qiu, X., Parcollet, T., Lane, N.D.: Flower: A friendly federated learning research framework. arXiv preprint arXiv:2007.14390 (2020)
2. Bi, R., Liu, Q., Ren, J., Tan, G.: Utility aware offloading for mobile-edge computing. Tsinghua Science and Technology 26(2), 239–250 (2020)
3. Bonawitz, K., Eichner, H., Grieskamp, W., Huba, D., Ingerman, A., Ivanov, V., Kiddon, C., Konečný, J., Mazzocchi, S., McMahan, B., Van Overveldt, T., Petrou, D., Ramage, D., Roselander, J.: Towards federated learning at scale: System design. In: A. Talwalkar, V. Smith, M. Zaharia (eds.) Proceedings of Machine Learning and Systems, vol. 1, pp. 374–388 (2019)
4. Chai, Z., Fayyaz, H., Fayyaz, Z., Anwar, A., Zhou, Y., Baracaldo, N., Ludwig, H., Cheng, Y.: Towards taming the resource and data heterogeneity in federated learning. In: 2019 USENIX Conference on Operational Machine Learning (OpML 19), pp. 19–21 (2019)
5. Chen, T., Giannakis, G.B., Sun, T., Yin, W.: LAG: Lazily aggregated gradient for communication-efficient distributed learning. arXiv preprint arXiv:1805.09965 (2018)