Authors:
Li Chen, Yu Xiaoguang, Song Shuangyong, Wang Jia, Zou Bo, He Xiaodong
Abstract
This paper presents SimCTC, a simple contrastive learning (CL) framework that substantially advances the state of the art in text clustering. In SimCTC, a pre-trained BERT model first maps the input sequence into the representation space; the resulting representation then feeds three loss heads: a Clustering head, an Instance-CL head, and a Cluster-CL head. Experimental results on multiple benchmark datasets demonstrate that SimCTC outperforms 6 competitive text clustering methods, with improvements of 1%-6% in Accuracy (ACC) and 1%-4% in Normalized Mutual Information (NMI). Moreover, our results show that clustering performance can be further improved by setting an appropriate number of clusters in the cluster-level objective.
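The abstract does not give implementation details, but the architecture it describes (a BERT encoder feeding a clustering head, an instance-level contrastive head, and a cluster-level contrastive head) can be sketched in PyTorch. The sketch below is hypothetical, not the authors' code: the mean-pooled sentence representation, the head shapes, the SimCLR-style NT-Xent instance loss, and the column-wise cluster-level loss are all assumptions borrowed from standard contrastive-clustering setups.

```python
# Minimal, hypothetical sketch of a SimCTC-style model (not the authors' code).
# Assumed: mean-pooled BERT embeddings, a SimCLR-style NT-Xent instance-level
# loss, and a contrastive-clustering-style cluster-level loss.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer


class SimCTCSketch(nn.Module):
    def __init__(self, model_name="bert-base-uncased", n_clusters=8, proj_dim=128):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Clustering head: soft cluster assignments for the clustering objective.
        self.cluster_head = nn.Linear(hidden, n_clusters)
        # Instance-CL head: projection used by the instance-level contrastive loss.
        self.instance_head = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, proj_dim))
        # Cluster-CL head: per-view cluster probabilities for the cluster-level
        # loss; per the abstract, its cluster count is a tunable choice.
        self.cluster_cl_head = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_clusters), nn.Softmax(dim=1))

    def embed(self, **batch):
        # Mean-pool token states into a sentence representation (one common choice).
        out = self.encoder(**batch).last_hidden_state
        mask = batch["attention_mask"].unsqueeze(-1)
        return (out * mask).sum(1) / mask.sum(1)


def nt_xent(z1, z2, tau=0.5):
    # NT-Xent: each row's positive is the same item in the other view;
    # all other rows in the 2n-sized batch act as negatives.
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / tau
    sim.fill_diagonal_(float("-inf"))  # exclude self-similarity
    n = z1.size(0)
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)


tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = SimCTCSketch()
texts = ["a short text", "another short text"]
# Two "views" per text; identical tokenization stands in for real augmentation.
v1 = tok(texts, padding=True, return_tensors="pt")
v2 = tok(texts, padding=True, return_tensors="pt")
h1, h2 = model.embed(**v1), model.embed(**v2)
inst_loss = nt_xent(model.instance_head(h1), model.instance_head(h2))
# Cluster-level loss: reuse NT-Xent on transposed probability matrices, so each
# cluster's column of assignments is contrasted across the two views.
p1, p2 = model.cluster_cl_head(h1), model.cluster_cl_head(h2)
clust_loss = nt_xent(p1.t(), p2.t())
```

A real training loop would additionally weight and combine the three losses, generate the two views with genuine text augmentations, and read final assignments from the clustering head at inference time; none of those details appear in the abstract.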
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
3 articles.
1. Matrix Contrastive Learning for Short Text Clustering. Communications in Computer and Information Science, 2023-11-13.
2. Towards Intelligent Training Systems for Customer Service. 2023 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2023-10-01.
3. CEIL: A General Classification-Enhanced Iterative Learning Framework for Text Clustering. Proceedings of the ACM Web Conference 2023, 2023-04-30.