Funders
National Research Foundation of Korea
Ministry of Science, ICT and Future Planning