Abstract
This paper proposes a sequence-to-sequence model for data-to-text generation, called DM-NLG, which generates natural language text from structured non-linguistic input. Specifically, by adding a dynamic memory module to an attention-based sequence-to-sequence model, DM-NLG can store the information that led to the generation of previous output words and use it when generating the next word. In this way, the decoder is aware of all of its previous decisions, which prevents the generation of duplicate words or incomplete semantic concepts. To improve the quality of the sentences produced by the DM-NLG decoder, a postprocessing step is applied using pretrained language models. To demonstrate the effectiveness of DM-NLG, we performed experiments on five different datasets and observed that the proposed model reduces the slot error rate by 50% and improves the BLEU score by 10% compared with state-of-the-art models.
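The abstract describes the idea at a high level only; the paper's actual implementation is not reproduced here. The sketch below is a hypothetical, minimal PyTorch rendering of a single decoding step that combines standard attention over encoder states with a gated dynamic memory of previous decisions. All names (DynamicMemoryDecoderStep, mem_gate, and so on) and the specific gating scheme are assumptions for illustration, not the authors' code.

    # Illustrative sketch only: hypothetical PyTorch decoder step with a
    # dynamic memory accumulating information from previous decisions.
    import torch
    import torch.nn as nn

    class DynamicMemoryDecoderStep(nn.Module):
        def __init__(self, emb_dim, hid_dim):
            super().__init__()
            self.gru = nn.GRUCell(emb_dim + hid_dim, hid_dim)
            self.attn = nn.Linear(hid_dim * 2, 1)           # scores encoder states
            self.mem_gate = nn.Linear(hid_dim * 2, hid_dim) # decides what to store
            self.out = nn.Linear(hid_dim * 3, emb_dim)

        def forward(self, prev_emb, hidden, enc_states, memory):
            # Attention over encoder states (concat scoring).
            scores = self.attn(torch.cat(
                [enc_states, hidden.unsqueeze(1).expand_as(enc_states)], dim=-1))
            weights = torch.softmax(scores, dim=1)           # (B, T, 1)
            context = (weights * enc_states).sum(dim=1)      # (B, H)

            # Recurrent update conditioned on the previous word and context.
            hidden = self.gru(torch.cat([prev_emb, context], dim=-1), hidden)

            # Dynamic memory: gate and accumulate the information behind this
            # decision so later steps are aware of all previous decisions.
            gate = torch.sigmoid(self.mem_gate(torch.cat([hidden, context], dim=-1)))
            memory = memory + gate * hidden

            # Output features combine hidden state, context, and memory.
            features = self.out(torch.cat([hidden, context, memory], dim=-1))
            return features, hidden, memory

In such a setup the memory would typically be initialized to zeros at the start of decoding and carried across time steps alongside the hidden state; the output features would then be projected to vocabulary logits by the rest of the decoder.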
Publisher
Cambridge University Press (CUP)
Subject
Artificial Intelligence, Linguistics and Language, Language and Linguistics, Software
Cited by
1 article.