Abstract
Importance sampling, a variant of online sampling, is often used in neural network training to improve the learning process and, in particular, the convergence speed of the model. Here we study the performance of a family of batch selection algorithms, i.e., online sampling algorithms that process small parts of the dataset at each iteration. Convergence is accelerated by biasing sampling towards the learning of hard samples. We first consider the baseline algorithm and investigate its performance in terms of convergence speed and generalization efficiency. The latter, however, degrades on poorly balanced datasets. To alleviate this shortcoming, we propose two variants of the algorithm that achieve better generalization without undermining the convergence speed-up offered by the original algorithm. Various data transformation techniques were tested in conjunction with the proposed scheme to develop a complete training method and to ensure robustness across different training environments. An experimental framework was constructed using three naturally imbalanced datasets and one artificially imbalanced one. The results confirm the convergence advantage of the extended algorithm over the vanilla one and, above all, show better generalization performance in imbalanced data environments.
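The core idea described in the abstract, biasing batch selection towards hard samples, can be illustrated by drawing batch indices with probability proportional to each sample's most recent loss. The sketch below is a minimal, generic illustration of this kind of loss-biased sampling, not the paper's actual algorithm; the function name `sample_batch` and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def sample_batch(losses, batch_size, rng):
    """Draw a batch of indices with probability proportional to each
    sample's last observed loss, so harder samples are picked more often."""
    probs = losses / losses.sum()
    return rng.choice(len(losses), size=batch_size, replace=False, p=probs)

rng = np.random.default_rng(0)
# Two "hard" samples (indices 3 and 4) with much larger losses.
losses = np.array([0.1, 0.1, 0.1, 5.0, 5.0])
batch = sample_batch(losses, 2, rng)
```

In a training loop, `losses` would be updated after each forward pass, so the sampling distribution tracks which samples the model currently finds hard. On imbalanced datasets, minority-class samples tend to incur higher losses and are therefore selected more often, which is the mechanism the proposed variants refine.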
Subject
Computational Mathematics, Computational Theory and Mathematics, Numerical Analysis, Theoretical Computer Science
Cited by 2 articles.