Authors:
Shah Haseeb, Villmow Johannes, Ulges Adrian, Schwanecke Ulrich, Shafait Faisal
Abstract
We present a novel extension to embedding-based knowledge graph completion models which enables them to perform open-world link prediction, i.e. to predict facts for entities unseen in training based on their textual description. Our model combines a regular link prediction model learned from a knowledge graph with word embeddings learned from a textual corpus. After training both independently, we learn a transformation to map the embeddings of an entity's name and description to the graph-based embedding space. In experiments on several datasets, including FB20k, DBPedia50k and our new dataset FB15k-237-OWE, we demonstrate competitive results. In particular, our approach exploits the full knowledge graph structure even when textual descriptions are scarce, does not require joint training on graph and text, and can be applied to any embedding-based link prediction model, such as TransE, ComplEx and DistMult.
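The abstract's core idea is a learned transformation that projects an unseen entity's text-based embedding into the pretrained graph embedding space, so the existing link prediction model can score it. Below is a minimal sketch of that idea in PyTorch; all names, dimensions, the affine-map choice, and the MSE objective are illustrative assumptions, not the authors' exact architecture.

```python
# Sketch: map text embeddings (entity name/description) into a pretrained
# graph embedding space via a learned transformation, as described above.
import torch
import torch.nn as nn

TEXT_DIM, GRAPH_DIM = 300, 200  # assumed embedding sizes


class TextToGraphMap(nn.Module):
    """Learned transformation from text space to graph space (here: affine)."""
    def __init__(self, text_dim: int, graph_dim: int):
        super().__init__()
        self.linear = nn.Linear(text_dim, graph_dim)

    def forward(self, text_emb: torch.Tensor) -> torch.Tensor:
        return self.linear(text_emb)


# Toy stand-ins: averaged word vectors of each entity's name/description (frozen)
# and the corresponding pretrained graph embeddings (also frozen).
n_entities = 1000
text_embs = torch.randn(n_entities, TEXT_DIM)    # stand-in for word embeddings
graph_embs = torch.randn(n_entities, GRAPH_DIM)  # stand-in for e.g. ComplEx embeddings

model = TextToGraphMap(TEXT_DIM, GRAPH_DIM)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train only the mapping on entities seen in the graph; both embedding sets stay fixed.
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(text_embs), graph_embs)
    loss.backward()
    optimizer.step()

# At test time, an unseen entity's text embedding is projected into graph space
# and scored by the underlying link prediction model (TransE, DistMult, ComplEx, ...).
unseen_text_emb = torch.randn(1, TEXT_DIM)
projected = model(unseen_text_emb)  # shape: (1, GRAPH_DIM)
```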
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
37 articles.
1. Information Fusion Representation Learning in Zero-shot Scenarios;Proceedings of the 2024 Guangdong-Hong Kong-Macao Greater Bay Area International Conference on Education Digitalization and Computer Science;2024-07-26
2. A Diffusion Model for Inductive Knowledge Graph Completion;2024 International Joint Conference on Neural Networks (IJCNN);2024-06-30
3. A Meta-Learning-Based Joint Two-View Framework for Inductive Knowledge Graph Completion;2024 International Joint Conference on Neural Networks (IJCNN);2024-06-30
4. A survey of inductive knowledge graph completion;Neural Computing and Applications;2023-12-13
5. Improving Knowledge Base Updates with CAIA: A Method Utilizing Capsule Network and Attentive Intratriplet Association Features;Journal of Sensors;2023-10-05