Abstract
To alleviate the impact of label scarcity in classification problems, self-supervised learning improves the performance of graph neural networks (GNNs) by exploiting the information carried by unlabeled nodes. However, no single existing self-supervised pretext task performs best across all datasets, and combining self-supervised and supervised tasks introduces additional hyperparameters that must be chosen. To select the best-performing self-supervised pretext task for each dataset and to optimize the hyperparameters without requiring expert experience, we propose a novel automated graph self-supervised learning framework and enhance it with a one-shot active learning method. Experimental results on three real-world citation datasets show that training GNNs with automatically optimized pretext tasks can match or even surpass the classification accuracy obtained with manually designed pretext tasks. Building on this, using actively selected labeled nodes further improves the classification performance of GNNs compared with randomly selected labeled nodes. Both the active selection and the automatic optimization contribute to semi-supervised node classification.
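To make the combination of supervised and self-supervised objectives concrete, below is a minimal PyTorch sketch of a joint training step. It assumes a plain linear encoder (graph convolutions omitted for brevity), feature reconstruction as a stand-in pretext task, and a scalar weight lam for the trade-off hyperparameter; these names and choices are illustrative assumptions, not components taken from the paper, whose framework would search over the pretext task and lam automatically.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy data: node features, labels, and a small labeled subset.
n_nodes, n_feats, n_classes = 100, 16, 3
x = torch.randn(n_nodes, n_feats)
y = torch.randint(0, n_classes, (n_nodes,))
labeled = torch.zeros(n_nodes, dtype=torch.bool)
labeled[torch.randperm(n_nodes)[:10]] = True  # few labeled nodes

encoder = nn.Sequential(nn.Linear(n_feats, 32), nn.ReLU())
clf_head = nn.Linear(32, n_classes)  # supervised classification head
ssl_head = nn.Linear(32, n_feats)    # pretext head (feature reconstruction, assumed)
params = (list(encoder.parameters()) + list(clf_head.parameters())
          + list(ssl_head.parameters()))
opt = torch.optim.Adam(params, lr=1e-2)

lam = 0.5  # trade-off hyperparameter the automated framework would tune
for _ in range(100):
    opt.zero_grad()
    h = encoder(x)
    # Supervised loss uses only the labeled nodes.
    sup_loss = F.cross_entropy(clf_head(h)[labeled], y[labeled])
    # Pretext loss draws signal from all nodes, labeled or not.
    ssl_loss = F.mse_loss(ssl_head(h), x)
    (sup_loss + lam * ssl_loss).backward()
    opt.step()

In this reading, selecting the pretext task (here hard-coded as reconstruction) and the weight lam per dataset is exactly the search problem the proposed framework automates.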
Funder
National Natural Science Foundation of China
Subject
General Physics and Astronomy