Authors:
Bhushan Kotnis, Carolin Lawrence, Mathias Niepert
Abstract
Representation learning for knowledge graphs (KGs) has focused on the problem of answering simple link prediction queries. In this work we address the more ambitious challenge of predicting the answers to conjunctive queries with multiple missing entities. We propose Bidirectional Query Embedding (BiQE), a method that embeds conjunctive queries using models based on bidirectional attention mechanisms. Contrary to prior work, bidirectional self-attention can capture interactions among all the elements of a query graph. We introduce two new challenging datasets for studying conjunctive query inference and conduct experiments on several benchmark datasets that demonstrate BiQE significantly outperforms state-of-the-art baselines.
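To make the abstract's core idea concrete, below is a minimal sketch of a BiQE-style query encoder: a linearized conjunctive query (entities, relations, and mask tokens for missing entities) is fed through a bidirectional transformer encoder so that every query element attends to every other, and missing entities are scored at the mask positions. This is not the authors' code; the vocabulary size, dimensions, and the use of a plain PyTorch TransformerEncoder are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of encoding a conjunctive query with bidirectional
# self-attention, in the spirit of BiQE (not the paper's implementation).

class ConjunctiveQueryEncoder(nn.Module):
    def __init__(self, num_tokens, dim=128, heads=4, layers=2, max_len=32):
        super().__init__()
        self.tok = nn.Embedding(num_tokens, dim)   # entities, relations, [MASK]
        self.pos = nn.Embedding(max_len, dim)      # positional embeddings
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, layers)
        self.out = nn.Linear(dim, num_tokens)      # score candidate entities

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) linearized query graph. Bidirectional
        # self-attention lets all query elements interact with each other.
        pos = torch.arange(token_ids.size(1), device=token_ids.device)
        h = self.tok(token_ids) + self.pos(pos)
        h = self.encoder(h)
        return self.out(h)  # per-position logits; read off [MASK] positions

# Toy usage: token id 0 stands in for a [MASK] (missing entity) position.
model = ConjunctiveQueryEncoder(num_tokens=1000)
query = torch.tensor([[5, 17, 0, 9, 23, 0]])  # hypothetical token ids
logits = model(query)                          # shape: (1, 6, 1000)
```

Answers to the query are then the highest-scoring entities at the mask positions; because attention is bidirectional, evidence from any part of the query graph can inform each missing entity.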
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
10 articles.
1. Conditional Logical Message Passing Transformer for Complex Query Answering. Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2024-08-24.
2. Privacy-Preserved Neural Graph Databases. Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2024-08-24.
3. Generative Models for Complex Logical Reasoning over Knowledge Graphs. Proceedings of the 17th ACM International Conference on Web Search and Data Mining, 2024-03-04.
4. Towards Bi-Level Out-of-Distribution Logical Reasoning on Knowledge Graphs. 2023 IEEE International Conference on Big Data (BigData), 2023-12-15.
5. Knowledge Graph Reasoning over Entities and Numerical Values. Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2023-08-04.