Relational Memory-Augmented Language Models

Authors:

Qi Liu (1), Dani Yogatama (2), Phil Blunsom (3, 4)

Affiliations:

1. University of Oxford, United Kingdom qi.liu@cs.ox.ac.uk

2. DeepMind, United Kingdom dyogatama@deepmind.com

3. University of Oxford, United Kingdom

4. DeepMind, United Kingdom phil.blunsom@cs.ox.ac.uk

Abstract

We present a memory-augmented approach to condition an autoregressive language model on a knowledge graph. We represent the graph as a collection of relation triples and retrieve relevant relations for a given context to improve text generation. Experiments on WikiText-103, WMT19, and enwik8 English datasets demonstrate that our approach produces a better language model in terms of perplexity and bits per character. We also show that relational memory improves coherence, is complementary to token-based memory, and enables causal interventions. Our model provides a simple yet effective way to combine an autoregressive language model and a knowledge graph for more coherent and logical generation.
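The approach described above can be illustrated with a minimal sketch: retrieve knowledge-graph triples relevant to the current context and condition the language model on them. Note this is an illustrative assumption, not the authors' implementation — the toy triple store, the string-matching retrieval, and the "serialize and prepend" conditioning are all simplifications (the paper's retrieval and memory mechanisms are more sophisticated).

```python
# Illustrative sketch only: condition a language model on retrieved
# knowledge-graph relation triples. All names and data here are
# hypothetical; real systems use learned retrieval and integration.

KG = [
    ("Oxford", "located_in", "United Kingdom"),
    ("DeepMind", "headquartered_in", "London"),
    ("London", "capital_of", "United Kingdom"),
]

def retrieve_triples(context, kg, top_k=2):
    """Return up to top_k triples whose head entity appears in the context."""
    hits = [t for t in kg if t[0].lower() in context.lower()]
    return hits[:top_k]

def condition_input(context, kg):
    """Serialize retrieved triples and prepend them to the LM context."""
    triples = retrieve_triples(context, kg)
    memory = " ".join(f"<{h}, {r}, {t}>" for h, r, t in triples)
    return f"{memory} {context}".strip()

print(condition_input("Oxford is a university city", KG))
# -> <Oxford, located_in, United Kingdom> Oxford is a university city
```

The conditioned string would then be fed to any autoregressive language model, so that generation can attend to the retrieved relational facts alongside the original context.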

Publisher

MIT Press - Journals

Subject

Artificial Intelligence, Computer Science Applications, Linguistics and Language, Human-Computer Interaction, Communication

