Authors: Jacques M. Bahi, Raphaël Couturier, Joseph Azar, Kevin Kana Nguimfack
Publisher: Springer Nature Singapore
References (22 articles)
1. Ben-Nun, T., Hoefler, T.: Demystifying parallel and distributed deep learning: an in-depth concurrency analysis. ACM Comput. Surv. (CSUR) 52(4), 1–43 (2019)
2. Verbraeken, J., Wolting, M., Katzy, J., Kloppenburg, J., Verbelen, T., Rellermeyer, J.S.: A survey on distributed machine learning. ACM Comput. Surv. (CSUR) 53(2), 1–33 (2020)
3. Li, S., et al.: PyTorch distributed: experiences on accelerating data parallel training. arXiv preprint arXiv:2006.15704 (2020)
4. Podareanu, D., Codreanu, V., Aigner, S., van Leeuwen, C., Weinberg, V.: Best practice guide - deep learning. Partnership for Advanced Computing in Europe (PRACE), Technical Report, vol. 2 (2019)
5. Yuan, B., Wolfe, C.R., Dun, C., Tang, Y., Kyrillidis, A., Jermaine, C.: Distributed learning of fully connected neural networks using independent subnet training. Proc. VLDB Endow. 15(8), 1581–1590 (2022)