Authors:
Tian Hao, Zhang Xiaoxiong, Wang Yuhan, Zeng Daojian
Abstract
Knowledge graph completion is an important technique for supplementing knowledge graphs and improving data quality. However, existing knowledge graph completion methods ignore the relation features of triples, and the entity description texts they introduce are long and redundant. To address these problems, this study proposes MIT-KGC, a knowledge graph completion model that combines multi-task learning with an improved TextRank. First, key contexts are extracted from redundant entity descriptions using the improved TextRank algorithm. Then, A Lite BERT (ALBERT) is used as the text encoder to reduce the number of model parameters. Finally, multi-task learning is used to fine-tune the model, effectively integrating entity and relation features. Experiments on the WN18RR, FB15k-237, and DBpedia50k datasets showed that, compared with traditional methods, the proposed model improved the mean rank (MR), the top-10 hit ratio (Hit@10), and the top-3 hit ratio (Hit@3) by 38, 1.3%, and 1.9%, respectively, on WN18RR; improved the MR and Hit@10 by 23 and 0.7%, respectively, on FB15k-237; and improved the Hit@3 and the top-1 hit ratio (Hit@1) by 3.1% and 1.5%, respectively, on DBpedia50k, verifying the validity of the model.
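To make the first stage of the pipeline concrete, below is a minimal sketch of vanilla TextRank sentence extraction over an entity description, roughly following Mihalcea and Tarau's original formulation. It is not the paper's improved variant; the sentence splitter, the overlap-based similarity function, and the `top_k` parameter are illustrative assumptions.

```python
import math
import re

def sentence_similarity(a, b):
    """Word-overlap similarity normalized by sentence length (TextRank's measure)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    overlap = len(wa & wb)
    if overlap == 0:
        return 0.0
    return overlap / (math.log(len(wa) + 1) + math.log(len(wb) + 1))

def textrank_key_sentences(description, top_k=2, damping=0.85, iterations=50):
    """Rank the sentences of an entity description; keep the top_k as key context."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", description) if s.strip()]
    n = len(sentences)
    if n <= top_k:
        return sentences
    # Build a weighted, undirected similarity graph over sentences.
    w = [[sentence_similarity(si, sj) if i != j else 0.0
          for j, sj in enumerate(sentences)]
         for i, si in enumerate(sentences)]
    out_weight = [sum(row) or 1.0 for row in w]  # guard against isolated sentences
    scores = [1.0] * n
    # Power iteration of weighted PageRank (the core of TextRank).
    for _ in range(iterations):
        scores = [(1 - damping) + damping * sum(w[j][i] / out_weight[j] * scores[j]
                                                for j in range(n) if w[j][i] > 0)
                  for i in range(n)]
    top = sorted(range(n), key=scores.__getitem__, reverse=True)[:top_k]
    return [sentences[i] for i in sorted(top)]  # preserve original sentence order

if __name__ == "__main__":
    desc = ("Albert Einstein was a German-born theoretical physicist. "
            "He developed the theory of relativity. "
            "He also enjoyed playing the violin. "
            "His work reshaped modern physics.")
    print(textrank_key_sentences(desc, top_k=2))
```

The extracted sentences would then replace the full description as the text fed to the ALBERT encoder, shortening the input sequence while retaining the description's most central content.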
Subject
General Physics and Astronomy
Cited by 5 articles.