Author:
Kobayashi Naoki, Hirao Tsutomu, Kamigaito Hidetaka, Okumura Manabu, Nagata Masaaki
Abstract
Some downstream NLP tasks exploit discourse dependency trees converted from RST trees. To obtain better discourse dependency trees, we need to improve the accuracy of RST trees in the upper parts of the structures. We therefore propose a novel neural top-down RST parsing method that exploits three levels of granularity in a document, paragraphs, sentences, and Elementary Discourse Units (EDUs), to parse a document accurately and efficiently. At each granularity level, parsing proceeds top-down by recursively splitting a larger text span into two smaller ones while predicting nuclearity and relation labels for the divided spans. Results on the RST-DT corpus show that our method achieved state-of-the-art scores of 87.0 for unlabeled spans and 74.6 for nuclearity-labeled spans, along with a relation-labeled span score of 60.0 that is comparable with the state of the art. Furthermore, discourse dependency trees converted from our RST trees also achieved state-of-the-art results: a 64.9 unlabeled attachment score and a 48.5 labeled attachment score.
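As a rough illustration of the recursive splitting described in the abstract, the Python sketch below parses a sequence of units top-down. It is a minimal sketch under stated assumptions: score_split and predict_labels are hypothetical placeholders standing in for the paper's neural span-split scorer and label classifier, and the Node structure is invented for illustration; in the actual method the same recursion would be run separately at the paragraph, sentence, and EDU levels.

    # Minimal sketch of top-down span splitting. score_split and
    # predict_labels are hypothetical placeholders, not the authors' code.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple


    @dataclass
    class Node:
        left: int                          # index of the first unit in the span
        right: int                         # index one past the last unit
        nuclearity: Optional[str] = None   # e.g. "NS", "SN", "NN"
        relation: Optional[str] = None     # e.g. "Elaboration"
        children: Tuple["Node", ...] = ()


    def score_split(units: List[str], left: int, right: int, k: int) -> float:
        """Toy stand-in for the neural split scorer: prefer balanced splits."""
        return -abs((k - left) - (right - k))


    def predict_labels(units: List[str], left: int, k: int, right: int) -> Tuple[str, str]:
        """Toy stand-in for the nuclearity/relation classifier."""
        return "NS", "Elaboration"


    def parse_top_down(units: List[str], left: int = 0, right: Optional[int] = None) -> Node:
        """Recursively split the span [left, right) in two, labeling each split."""
        if right is None:
            right = len(units)
        if right - left == 1:              # a single unit (e.g. one EDU) is a leaf
            return Node(left, right)
        # Choose the split point with the highest score, then recurse on both halves.
        k = max(range(left + 1, right), key=lambda j: score_split(units, left, right, j))
        nuc, rel = predict_labels(units, left, k, right)
        return Node(left, right, nuc, rel,
                    (parse_top_down(units, left, k), parse_top_down(units, k, right)))

For example, parse_top_down(["e1", "e2", "e3", "e4"]) yields a binary tree over the four units, with a nuclearity and relation label attached at each internal node, mirroring the span-splitting procedure the abstract describes.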
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
13 articles.
1. Topic-Aware Two-Layer Context-Enhanced Model for Chinese Discourse Parsing; Communications in Computer and Information Science; 2023-11-27
2. A Two-Stage Long Text Summarization Method Based on Discourse Structure; International Journal of Software Innovation; 2023-09-29
3. Top-down Text-Level Discourse Rhetorical Structure Parsing with Bidirectional Representation Learning; Journal of Computer Science and Technology; 2023-09
4. An Extractive Text Summarization Model Based on Rhetorical Structure Theory; 2023 26th ACIS International Winter Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD-Winter); 2023-07-05
5. Discourse Parsing on Multi-Granularity Interaction; 2023 International Joint Conference on Neural Networks (IJCNN); 2023-06-18