Affiliation:
1. Shanghai Jiao Tong University, Department of Computer Science and Engineering. charlee@sjtu.edu.cn
2. Shanghai Jiao Tong University, Department of Computer Science and Engineering. zhaohai@cs.sjtu.edu.cn
3. Shanghai Jiao Tong University, Department of Computer Science and Engineering. heshexia@sjtu.edu.cn
4. Shanghai Jiao Tong University, Department of Computer Science and Engineering. caijiaxun@sjtu.edu.cn
Abstract
Semantic role labeling (SRL) is dedicated to recognizing the semantic predicate-argument structure of a sentence. Previous studies based on traditional models have shown that syntactic information can make remarkable contributions to SRL performance; however, the necessity of syntactic information has been challenged by several recent neural SRL studies that demonstrate impressive performance without syntactic backbones, suggesting that syntax becomes much less important for neural semantic role labeling, especially when paired with deep neural networks and large-scale pre-trained language models. Despite this notion, the neural SRL field still lacks a systematic and full investigation of the relevance of syntactic information to SRL, covering both dependency and span SRL in both monolingual and multilingual settings. This paper intends to quantify the importance of syntactic information for neural SRL in the deep learning framework. We introduce three typical SRL frameworks as baselines, namely sequence-based, tree-based, and graph-based models, and pair them with two categories of methods for exploiting syntactic information: syntax pruning-based and syntax feature-based. Experiments are conducted on the CoNLL-2005, -2009, and -2012 benchmarks for all languages available, and results show that neural SRL models can still benefit from syntactic information under certain conditions. Furthermore, we show the quantitative significance of syntax to neural SRL models together with a thorough empirical survey using existing models.
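As a concrete illustration of the syntax pruning-based category mentioned in the abstract, the sketch below shows a simple k-order argument pruning rule over a dependency tree: candidate argument positions for a predicate are restricted to words within k dependency hops of the predicate or of one of its ancestors. This is a minimal, hedged sketch; the function name `korder_prune`, the head-array encoding, and the toy sentence are assumptions for illustration, not the paper's exact procedure.

```python
def korder_prune(heads, predicate, k=1):
    """Return candidate argument positions for `predicate`.

    heads[i] is the head index of token i (the root has head -1).
    A token is kept if it lies within k dependency hops below the
    predicate or below one of the predicate's ancestors.
    """
    n = len(heads)
    # Build a child adjacency list from the head array.
    children = [[] for _ in range(n)]
    for i, h in enumerate(heads):
        if h >= 0:
            children[h].append(i)

    def descendants_within(node, depth):
        # Breadth-first expansion limited to `depth` hops below `node`.
        found, frontier = set(), [node]
        for _ in range(depth):
            nxt = []
            for u in frontier:
                for c in children[u]:
                    if c not in found:
                        found.add(c)
                        nxt.append(c)
            frontier = nxt
        return found

    candidates = set()
    node = predicate
    while node != -1:                      # walk from the predicate up to the root
        candidates |= descendants_within(node, k)
        node = heads[node]
    candidates.discard(predicate)
    return sorted(candidates)


# Toy example: "The cat chased a mouse", with "chased" (index 2) as root/predicate.
heads = [1, 2, -1, 4, 2]                   # The->cat, cat->chased, a->mouse, mouse->chased
print(korder_prune(heads, predicate=2, k=1))   # -> [1, 4], i.e. "cat" and "mouse"
```

With k=1 this reduces to keeping only the immediate dependents of the predicate and of its ancestors; larger k admits deeper subtrees, trading recall of true arguments against the number of spurious candidates the labeler must reject.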
Subject
Artificial Intelligence, Computer Science Applications, Linguistics and Language, Language and Linguistics