We proposed UCRT, a novel two-stage sample selection framework that combines a joint loss, an unsupervised contrastive loss (UCL), and an improved uniform consistency selection strategy, and that is applicable to almost all noise scenarios. Through extensive experiments, we validated the effectiveness of the joint loss when paired with UCL, demonstrating its efficacy across diverse noise scenarios without additional tuning. Moreover, uniform selection addresses the class imbalance that arises when gathering small, clean subsets in asymmetric noise scenarios, as sketched below. Finally, incorporating UCL enhances model robustness under asymmetric or instance-dependent noise (IDN). Combining these three components, UCRT not only achieves superior performance but also sustains robustness across training iterations, preventing overfitting. We validated the effectiveness and superiority of our method through extensive experiments on CIFAR-10/100 with CDN and IDN noise, as well as through comparisons on several real-world noisy datasets.
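As a rough illustration of the uniform selection idea, the sketch below implements class-balanced small-loss selection: rather than keeping the globally lowest-loss samples, it keeps an equal number of low-loss samples per labelled class. The function name, signature, and the small-loss criterion are our own illustrative assumptions, not UCRT's actual rule; the full two-stage strategy additionally uses consistency information that is not modeled here.

```python
# Hypothetical sketch of class-balanced ("uniform") small-loss selection.
# Under asymmetric noise, a global small-loss cut over-selects easy classes;
# allocating an equal per-class budget keeps the presumed-clean subset balanced.

import numpy as np

def uniform_small_loss_selection(losses, labels, num_classes, budget):
    """Select `budget` presumed-clean samples, spread uniformly over classes.

    losses:  per-sample training losses, shape (N,)
    labels:  (possibly noisy) labels, shape (N,)
    returns: indices of the selected samples
    """
    per_class = budget // num_classes
    selected = []
    for c in range(num_classes):
        idx = np.flatnonzero(labels == c)      # samples labelled as class c
        order = idx[np.argsort(losses[idx])]   # ascending loss within the class
        selected.extend(order[:per_class])     # keep the lowest-loss ones
    return np.array(selected)

# Toy usage: class 1 is "harder" (higher losses), yet both classes
# contribute equally to the selected subset.
rng = np.random.default_rng(0)
losses = np.concatenate([rng.uniform(0.0, 1.0, 50), rng.uniform(0.5, 2.0, 50)])
labels = np.concatenate([np.zeros(50, dtype=int), np.ones(50, dtype=int)])
idx = uniform_small_loss_selection(losses, labels, num_classes=2, budget=20)
print(np.bincount(labels[idx]))  # -> [10 10]
```

The design point is that the per-class budget, rather than the loss threshold itself, is what prevents the clean subset from collapsing onto easy classes when noise is asymmetric.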
CRediT authorship contribution statement

Qian Zhang: Funding acquisition.