Authors:
Kong Jian-Gang, Li Qing-Xu, Li Jian, Liu Yu, Zhu Jia-Ji
Abstract
Antiferromagnetic materials are exciting quantum materials with rich physics and great potential for applications. However, an accurate and efficient theoretical method is in high demand for determining their critical transition temperatures, the Néel temperatures. Graph neural networks (GNNs), powerful as they are at predicting material properties, lose their advantage in predicting magnetic properties because datasets of magnetic materials are small, while conventional machine learning models depend heavily on the quality of hand-crafted material descriptors. We propose a new strategy that extracts high-level material representations through self-supervised training of GNNs on large-scale unlabeled datasets. Dimensionality reduction analysis shows that the learned knowledge about elements and magnetism transfers to the generated atomic vector representations. Compared with popular manually constructed descriptors and crystal graph convolutional neural networks, the self-supervised material representations yield a more accurate and efficient model for Néel temperatures, and the trained model successfully predicts antiferromagnetic materials with high Néel temperatures. Our self-supervised GNN may serve as a universal pre-training framework for various material properties.
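The two-stage strategy in the abstract can be illustrated with a toy sketch. The code below is an assumption-laden stand-in, not the paper's method: it replaces the GNN with a flat element-embedding table, uses a masked-element prediction task as a proxy for the self-supervised objective, and fits ridge regression on pooled embeddings against synthetic Néel-temperature labels. All names (`E`, `W`, `unlabeled`, `labeled`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: 5 element types, embedding dimension 4.
n_elements, dim = 5, 4
E = rng.normal(scale=0.1, size=(n_elements, dim))  # element embedding table

# Stage 1: self-supervised pretraining on unlabeled "crystals" (here, random
# atom lists). Proxy task: predict a masked atom's element type from the mean
# embedding of the remaining atoms (a stand-in for the paper's objective).
W = rng.normal(scale=0.1, size=(dim, n_elements))  # linear prediction head
unlabeled = [rng.integers(0, n_elements, size=6) for _ in range(200)]

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

lr = 0.1
for _ in range(50):
    for atoms in unlabeled:
        i = rng.integers(len(atoms))
        target = atoms[i]
        ctx = np.delete(atoms, i)
        h = E[ctx].mean(axis=0)             # context representation
        p = softmax(h @ W)                  # predicted element distribution
        grad_logits = p.copy()
        grad_logits[target] -= 1.0          # cross-entropy gradient
        W -= lr * np.outer(h, grad_logits)
        E[ctx] -= lr * (W @ grad_logits) / len(ctx)

# Stage 2: supervised fine-tuning on a small labeled set. Pool atom
# embeddings into a material vector, then fit ridge regression for a
# synthetic "Néel temperature" label.
labeled = [rng.integers(0, n_elements, size=6) for _ in range(30)]
X = np.stack([E[a].mean(axis=0) for a in labeled])
true_w = rng.normal(size=dim)
y = X @ true_w + 0.01 * rng.normal(size=len(X))   # synthetic labels

ridge = np.linalg.solve(X.T @ X + 1e-3 * np.eye(dim), X.T @ y)
pred = X @ ridge
```

The point of the sketch is the division of labor: the embedding table is shaped only by unlabeled data, and the small labeled set trains nothing but a lightweight regressor on top of it, mirroring how the paper sidesteps the scarcity of magnetic-material labels.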
Subject
General Physics and Astronomy
Cited by
3 articles.