Authors:
Starlin Jini S., Chenthalir Indra N.
Subjects:
Electrical and Electronic Engineering, General Computer Science, Electronic, Optical and Magnetic Materials
Cited by: 1 article.