Authors:
Wan Sheng, Pan Shirui, Yang Jian, Gong Chen
Abstract
Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data points to the remaining massive unlabeled data via a graph. As one of the most popular graph-based SSL approaches, the recently proposed Graph Convolutional Networks (GCNs) have made remarkable progress by combining the expressiveness of neural networks with graph structure. Nevertheless, existing graph-based methods do not directly address the core problem of SSL, i.e., the shortage of supervision, and thus their performance remains limited. To address this issue, this paper presents a novel GCN-based SSL algorithm that enriches the supervision signals by exploiting both data similarities and graph structure. First, by designing a semi-supervised contrastive loss, improved node representations can be generated by maximizing the agreement between different views of the same data point or between data points from the same class. As a result, the abundant unlabeled data and the scarce yet valuable labeled data jointly provide rich supervision for learning discriminative node representations, which improves the subsequent classification results. Second, the underlying determinative relationship between the graph topology and the input data features is extracted as a supplementary supervision signal for SSL by means of a graph generative loss related to the input features. Extensive experiments on a variety of real-world datasets verify the effectiveness of our algorithm compared with other state-of-the-art methods.
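As a rough illustration of the two loss terms described in the abstract, the sketch below shows, in PyTorch, a generic semi-supervised contrastive objective over two views of node embeddings and a feature-driven graph reconstruction term. The function names, the temperature `tau`, the bilinear decoder weight `w`, and all other implementation details are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F


def semi_supervised_contrastive_loss(z1, z2, labels, labeled_mask, tau=0.5):
    """Semi-supervised contrastive loss over two views of node embeddings.

    Positives for an anchor node are (a) the same node in the other view and
    (b) labeled nodes sharing the anchor's class (when the anchor is labeled);
    all remaining nodes serve as negatives.

    z1, z2       : (N, d) node embeddings from two graph views
    labels       : (N,) class indices (entries for unlabeled nodes are ignored)
    labeled_mask : (N,) boolean mask marking labeled nodes
    tau          : temperature (illustrative default)
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)          # (2N, d)
    sim = torch.exp(z @ z.t() / tau)                            # (2N, 2N)
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool, device=z.device), 0)

    # Cross-view positives: node i in view 1 <-> node i in view 2.
    idx = torch.arange(n, device=z.device)
    pos = torch.zeros(2 * n, 2 * n, dtype=torch.bool, device=z.device)
    pos[idx, idx + n] = True
    pos[idx + n, idx] = True

    # Class-level positives among labeled nodes (in both views).
    lbl = torch.cat([labels, labels])
    known = torch.cat([labeled_mask, labeled_mask])
    same_class = (lbl.unsqueeze(0) == lbl.unsqueeze(1)) \
        & known.unsqueeze(0) & known.unsqueeze(1)
    same_class.fill_diagonal_(False)
    pos |= same_class

    # InfoNCE-style objective: -log(sum of positive similarities / sum of all).
    return (-torch.log((sim * pos).sum(1) / sim.sum(1))).mean()


def graph_generative_loss(x, adj, w):
    """Hypothetical feature-driven graph reconstruction term: edge scores are
    decoded from the input features through a bilinear form and compared with
    the observed adjacency via binary cross-entropy.

    x   : (N, d) input feature matrix
    adj : (N, N) binary adjacency matrix (float tensor)
    w   : (d, d) learnable decoder weight
    """
    logits = x @ w @ x.t()
    return F.binary_cross_entropy_with_logits(logits, adj)
```

In a full training loop, these two terms would presumably be added, with tunable weights, to the standard cross-entropy loss computed on the labeled nodes.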
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
61 articles.