Abstract
Knowledge representation learning represents the entities and relations of a knowledge graph as dense, low-dimensional vectors in a continuous space, capturing the features and properties of the graph. Such a technique facilitates computation and reasoning over knowledge graphs, which benefits many downstream tasks. To alleviate the insufficient entity representation learning caused by sparse knowledge graphs, some researchers have proposed knowledge graph embedding models based on instances and concepts, which exploit the latent semantic connections between the concepts and instances contained in a knowledge graph to enhance the embedding. However, these models either embed instances and concepts in the same space or ignore the transitivity of isA relations, leading to inaccurate embeddings of concepts and instances. To address these shortcomings, we propose CIST, a knowledge graph embedding model that differentiates concepts and instances through spatial transformation. The model alleviates the crowding of similar instances or concepts in the semantic space by modeling them in different embedding spaces, and it adds a learnable parameter that adjusts the neighboring range of each concept embedding to distinguish the hierarchical information of different concepts, thereby modeling the transitivity of isA relations. These features of instances and concepts serve as auxiliary information, so modeling them thoroughly alleviates the insufficient entity representation learning issue. For the experiments, we chose two tasks, i.e., link prediction and triple classification, and two real-life datasets: YAGO26K-906 and DB111K-174. Compared with state-of-the-art models, CIST achieves the best performance in most cases. Specifically, CIST outperforms the SOTA model JOIE by 51.1% on Hits@1 in link prediction and by 15.2% on F1 score in triple classification.
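The abstract's idea of a learnable "neighboring range" per concept can be illustrated with a sphere-based formulation, where instances are points and each concept is a sphere with a trainable radius; nested spheres then make isA transitivity hold geometrically. The sketch below is a hypothetical illustration in the spirit of such models (e.g., TransC); the exact scoring functions of CIST may differ, and all names here are illustrative.

```python
import numpy as np

dim = 50  # embedding dimension (illustrative choice)

def instance_of_score(inst, center, radius):
    """Loss-style score for an (instance, isA, concept) triple.
    Non-positive when the instance point lies inside the concept sphere."""
    return np.linalg.norm(inst - center) - radius

def subclass_of_score(c1, r1, c2, r2):
    """Loss-style score for a (concept1, subClassOf, concept2) triple.
    Non-positive when sphere 1 lies entirely inside sphere 2, i.e.
    ||c1 - c2|| + r1 <= r2. Because nested spheres stay nested,
    transitivity of isA follows automatically."""
    return np.linalg.norm(c1 - c2) + r1 - r2

if __name__ == "__main__":
    # Toy embeddings: concept 'scientist' inside concept 'person'.
    scientist = np.zeros(dim)
    person = np.zeros(dim)
    person[0] = 0.5
    einstein = np.zeros(dim)
    einstein[0] = 0.3
    print(instance_of_score(einstein, scientist, 1.0))   # negative: inside
    print(subclass_of_score(scientist, 1.0, person, 2.0))  # negative: nested
```

During training, the radii would be optimized alongside the centers with a margin-based ranking loss, so broader concepts learn larger neighboring ranges than their subconcepts.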
Subject
General Mathematics, Engineering (miscellaneous), Computer Science (miscellaneous)
Cited by 4 articles.