Affiliation:
1. Nanjing University of Science and Technology, Nanjing, China
Abstract
Graph contrastive learning has achieved remarkable results in self-supervised representation learning on graph-structured data. By employing perturbation functions (i.e., perturbations of the nodes or edges of a graph), most graph contrastive learning methods construct contrastive samples on the original graph. However, perturbation-based data augmentation randomly changes the inherent information (e.g., attributes or structures) of the graph. Therefore, after node embedding on the perturbed graph, neither the validity of the contrastive samples nor the resulting performance of graph contrastive learning can be guaranteed. To this end, in this article we propose a novel generation-based multi-view contrastive learning framework (GMVC) for self-supervised graph representation learning, which generates contrastive samples with a generator rather than a perturbation function. Specifically, after embedding the nodes of the original graph, we first employ random walks in the neighborhood of each anchor node to sample multiple relevant node sequences. We then use a Transformer to generate the representations of the relevant contrastive samples of each anchor node from the features and structures of the sampled node sequences. Finally, by maximizing the consistency between the anchor view and the generated views, we force the model to effectively encode graph information into the node embeddings. We perform extensive experiments on node classification and link prediction tasks over eight benchmark datasets, which verify the effectiveness of our generation-based multi-view graph contrastive learning method.
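For readers who want a concrete picture of the pipeline described in the abstract, a minimal sketch follows. It is not the authors' implementation: it assumes a PyTorch environment, uses random anchor embeddings in place of a trained GNN encoder, and the names random_walk, ViewGenerator, and nt_xent are illustrative only. It mirrors the three stages described above: neighborhood random walks around each anchor, a Transformer that generates a view from each sampled sequence, and a contrastive loss that maximizes anchor/generated-view consistency.

```python
import random

import torch
import torch.nn as nn
import torch.nn.functional as F


def random_walk(adj, start, length):
    """Sample a fixed-length random walk in the neighborhood of `start`."""
    walk = [start]
    node = start
    for _ in range(length - 1):
        neighbors = adj[node]
        if not neighbors:            # dead end: stop early
            break
        node = random.choice(neighbors)
        walk.append(node)
    return walk


class ViewGenerator(nn.Module):
    """Transformer encoder that turns a sampled node sequence into one generated view."""

    def __init__(self, dim, heads=4, layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, seq_emb):       # seq_emb: (num_anchors, walk_len, dim)
        out = self.encoder(seq_emb)   # contextualize the walk with self-attention
        return out.mean(dim=1)        # pool into one generated view per anchor


def nt_xent(anchor, view, tau=0.5):
    """InfoNCE-style loss: each anchor's positive is its own generated view."""
    a = F.normalize(anchor, dim=1)
    v = F.normalize(view, dim=1)
    logits = a @ v.t() / tau                  # (N, N) cosine-similarity matrix
    labels = torch.arange(a.size(0))          # positives sit on the diagonal
    return F.cross_entropy(logits, labels)


# Toy usage on a 4-node graph; random embeddings stand in for a GNN-encoded anchor view.
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
node_emb = torch.randn(4, 32)
walks = [random_walk(adj, i, length=5) for i in range(4)]
padded = [w + [w[-1]] * (5 - len(w)) for w in walks]        # pad short walks by repetition
seqs = torch.stack([node_emb[torch.tensor(w)] for w in padded])
loss = nt_xent(node_emb, ViewGenerator(dim=32)(seqs))
```

The mean-pooling and InfoNCE choices here are placeholders for whatever readout and consistency objective the paper actually uses; the sketch only fixes the data flow from walks to generated views to loss.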
Publisher
Association for Computing Machinery (ACM)