Graph Clustering with High-Order Contrastive Learning
Authors:
Li Wang 1, Zhu En 1, Wang Siwei 1, Guo Xifeng 2
Affiliation:
1. School of Computer Science, National University of Defense Technology, Changsha 410000, China
2. School of Cyberspace Science, Dongguan University of Technology, Dongguan 523808, China
Abstract
Graph clustering is a fundamental and challenging task in unsupervised learning, and contrastive learning has driven much of its recent progress. However, two problems remain to be addressed: (1) the augmentations in most graph contrastive clustering methods are handcrafted, which can cause semantic drift; and (2) contrastive learning is usually applied only at the feature level, ignoring the structure level, which can lead to sub-optimal performance. In this work, we propose a method termed Graph Clustering with High-Order Contrastive Learning (GCHCL) to solve these problems. First, we construct two views by Laplacian smoothing of the raw features under different normalizations, and design a structure alignment loss that forces the two views to be mapped into the same space. Second, we build a contrastive similarity matrix from two structure-based similarity matrices and force it to align with an identity matrix. In this way, our contrastive learning encompasses a larger neighborhood, enabling the model to learn clustering-friendly embeddings without an extra clustering module. In addition, our model scales to large datasets. Extensive experiments on five datasets validate its effectiveness: compared to the second-best baselines on four small and medium datasets, our model achieved an average improvement of 3% in accuracy, and on the largest dataset it achieved an accuracy of 81.92%, whereas the compared baselines ran out of memory.
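The abstract does not give the exact formulation, but the two-view construction and identity alignment it describes can be sketched as follows. This is a minimal illustration, assuming the two normalizations are the common symmetric and random-walk variants and that alignment is a Frobenius distance to the identity; the function and variable names are hypothetical, not from the paper.

```python
import numpy as np

def smooth_features(adj, feats, order=2, norm="sym"):
    """Laplacian smoothing: propagate features over the graph.

    adj   : dense (n, n) adjacency matrix without self-loops
    feats : (n, d) raw node features
    order : number of propagation steps (the high-order neighborhood)
    norm  : "sym" for D^-1/2 (A+I) D^-1/2, "rw" for D^-1 (A+I)
    """
    a_hat = adj + np.eye(adj.shape[0])           # add self-loops
    deg = a_hat.sum(axis=1)
    if norm == "sym":
        d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
        p = d_inv_sqrt @ a_hat @ d_inv_sqrt      # symmetric normalization
    else:
        p = np.diag(1.0 / deg) @ a_hat           # random-walk normalization
    out = feats
    for _ in range(order):                       # t-step smoothing: P^t X
        out = p @ out
    return out

# Toy graph: a 4-node path, with random 8-dimensional features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = np.random.default_rng(0).normal(size=(4, 8))

view1 = smooth_features(adj, feats, norm="sym")  # first view
view2 = smooth_features(adj, feats, norm="rw")   # second view

# Cross-view similarity between L2-normalized embeddings; aligning it
# with the identity pulls each node toward its counterpart in the other
# view and away from all other nodes.
z1 = view1 / np.linalg.norm(view1, axis=1, keepdims=True)
z2 = view2 / np.linalg.norm(view2, axis=1, keepdims=True)
sim = z1 @ z2.T
align_loss = np.linalg.norm(sim - np.eye(4))
```

In the paper this alignment would be optimized through a trainable encoder; the sketch only shows the fixed propagation and the loss shape.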
Funder
National Key R&D Program of China; National Natural Science Foundation of China
Subject
General Physics and Astronomy
Cited by 1 article.