SP-NLG: A Semantic-Parsing-Guided Natural Language Generation Framework
Published: 2023-04-08
Issue: 8
Volume: 12
Page: 1772
ISSN: 2079-9292
Container-title: Electronics
Language: en
Short-container-title: Electronics
Author:
Li Tongliang 1, Zhang Shun 1, Li Zhoujun 1
Affiliation:
1. State Key Lab of Software Development Environment, Beihang University, Beijing 100191, China
Abstract
We propose SP-NLG: a semantic-parsing-guided natural language generation framework for logical content generation with high fidelity. Prior studies adopt large pretrained language models and coarse-to-fine decoding techniques to generate text with logic; while they achieve considerable results on automatic evaluation metrics, human evaluation shows they still fall short in logical fidelity. Inspired by semantic parsing, which translates natural language utterances into executable logical forms, we propose to guide the generation process with a semantic parser. Unlike semantic parsing and QA tasks, whose logical correctness can be evaluated from execution results, the logical information of generated text is implicit. To leverage a semantic parser for generation, we propose a slot-tied back-search algorithm. We follow the coarse-to-fine generation scheme, but instead of filling the slots with model predictions, which are harder to control, the slot values are searched offline by the algorithm. The slot-tied back-search algorithm ties the parameters of a logic form to the slots of a template in one-to-one correspondence. We back-search arguments under which the logic form executes correctly and fill them (as slot values) into the textual template to generate the final target, which ensures logical correctness. Experimental results of a model built on SP-NLG demonstrate that our framework achieves high fidelity in logical text generation.
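To make the slot-tied back-search idea concrete, below is a minimal Python sketch of the scheme as described in the abstract. The toy table, the logic_form predicate, the TEMPLATE string, and the back_search function are all hypothetical illustrations under simplifying assumptions, not the paper's implementation.

from itertools import product

# Toy table of game statistics (hypothetical example data).
TABLE = [
    {"team": "Lakers", "score": 110},
    {"team": "Celtics", "score": 98},
]

# A hypothetical logic form whose free parameters (entity, column) are
# tied one-to-one to the slots of TEMPLATE below.
def logic_form(entity, column):
    """Executes to True iff `entity` attains the maximum of `column`."""
    best = max(TABLE, key=lambda row: row[column])
    return best["team"] == entity

# Textual template whose slot names mirror the logic-form parameters.
TEMPLATE = "{entity} had the highest {column} in the game."

def back_search(domains):
    """Offline back-search: enumerate candidate argument assignments and
    return the first one under which the logic form executes to True."""
    names = list(domains)
    for values in product(*(domains[name] for name in names)):
        args = dict(zip(names, values))
        if logic_form(**args):
            return args
    return None

domains = {
    "entity": [row["team"] for row in TABLE],
    "column": ["score"],
}
args = back_search(domains)
if args is not None:
    # Slot filling is consistent with the already-executed logic form.
    print(TEMPLATE.format(**args))  # -> Lakers had the highest score in the game.

Because every slot value is tied to a logic-form parameter whose assignment has already been verified by execution, the surface text cannot contradict the underlying logic, which is the fidelity guarantee the framework targets.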
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering
Cited by: 3 articles.