Affiliation:
1. Minnan Normal University, Zhangzhou, China
2. Minnan Normal University, Zhangzhou, China
Abstract
Existing methods based on transfer learning leverage auxiliary information to improve generalization and performance on the tail classes. However, they cannot fully exploit the relationships between the auxiliary information and the tail classes, and may introduce irrelevant knowledge into the tail classes. To solve this problem, we propose a hierarchical CNN with knowledge complementation, which regards hierarchical relationships as auxiliary information and transfers only relevant knowledge to tail classes. First, we integrate semantic and clustering relationships as hierarchical knowledge into the CNN to guide feature learning. Then, we design a complementary strategy to jointly exploit the two types of knowledge, where semantic knowledge acts as a prior dependence and clustering knowledge reduces the negative information caused by excessive semantic dependence (i.e., semantic gaps). In this way, the CNN exploits the two complementary hierarchical relationships and transfers useful knowledge to tail data, improving long-tailed classification accuracy. Experimental results on public benchmarks show that the proposed model outperforms existing methods. In particular, our model improves accuracy by 3.46% over the second-best method on the long-tailed tieredImageNet dataset.
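The complementary strategy described above can be illustrated with a minimal sketch: a semantic-hierarchy penalty serves as the prior, while a clustering-hierarchy penalty tempers it where semantic dependence is excessive. All names here (`complementary_loss`, `semantic_dist`, `cluster_dist`, `alpha`) are illustrative assumptions, not the paper's actual implementation.

```python
def complementary_loss(semantic_dist, cluster_dist, alpha=0.5):
    """Blend semantic and clustering hierarchy penalties per class.

    semantic_dist: per-class distances in the semantic hierarchy (prior)
    cluster_dist:  per-class distances in the clustering hierarchy
    alpha:         trade-off weight; lowering alpha reduces over-dependence
                   on the semantic hierarchy (i.e., mitigates semantic gaps)
    """
    return [alpha * s + (1.0 - alpha) * c
            for s, c in zip(semantic_dist, cluster_dist)]

# Example: two classes, equal weighting of the two knowledge sources.
print(complementary_loss([1.0, 2.0], [0.5, 1.5], alpha=0.5))  # → [0.75, 1.75]
```

In practice such a blended term would be added to the classification loss during CNN training; the sketch only shows how the two hierarchical signals could be combined so that neither dominates.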
Funder
National Natural Science Foundation of China
Natural Science Foundation of Fujian Province
Publisher
Association for Computing Machinery (ACM)
Cited by 2 articles.