1. S. Wiedemann, K.-R. Müller, W. Samek, Compact and computationally efficient representation of deep neural networks, IEEE Trans. Neural Netw. Learn. Syst. 31 (3) (2020) 772–785.
2. F. Sattler, S. Wiedemann, K.-R. Müller, W. Samek, Robust and communication-efficient federated learning from non-i.i.d. data, IEEE Trans. Neural Netw. Learn. Syst. 31 (9) (2020) 3400–3413.
3. H. B. McMahan, E. Moore, D. Ramage, S. Hampson, B. A. y Arcas, Communication-efficient learning of deep networks from decentralized data, in: Proc. Artif. Intell. Statist. (AISTATS), PMLR, 2017, pp. 1273–1282.
4. J. Dean, G. S. Corrado, R. Monga, K. Chen, A. Y. Ng, Large scale distributed deep networks, in: Proc. Adv. Neural Inf. Process. Syst. (NIPS), 2012, pp. 1223–1231.