Abstract
Question classification is an essential task in question answering (QA) systems. An effective and efficient question classification model not only restricts the search space for answers but also guides the QA system in selecting the optimal knowledge base and search strategy. In recent years, the self-attention mechanism has been widely used in question classification for its strength in capturing global dependencies. However, it models all signals with weighted averaging, which is prone to overlooking the relations among neighboring signals. Furthermore, recent research has revealed that part-of-speech (POS) information can be used to determine and reinforce the semantics in sentence representation. In this paper, we propose a POS-aware adjacent relation attention network (POS-ARAN) for question classification, which enhances context representations with POS information and neighboring signals. To capture local context, we propose an adjacent relation attention mechanism that incorporates a Gaussian bias via a dynamic window to revise the vanilla self-attention mechanism. Thus, it can capture both long-term dependencies and local representations of semantic relations among words in a sentence. In addition, a POS-aware embedding layer is proposed, which helps locate the appropriate headwords using syntactic information. Extensive experiments are conducted on the Experimental Data for Question Classification (EDQC) dataset and Yahoo! Answers Comprehensive Questions and Answers 1.0. The results demonstrate that our model significantly outperforms existing methods, achieving 95.59% coarse-grained accuracy and 92.91% fine-grained accuracy, respectively.
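To make the core idea concrete, the following is a minimal NumPy sketch of self-attention revised with an additive Gaussian locality bias, in the spirit of the adjacent relation attention described above. It assumes a fixed window width sigma rather than the paper's dynamically predicted window, and all names (gaussian_biased_attention, sigma) are illustrative, not the authors' implementation.

```python
import math
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def gaussian_biased_attention(Q, K, V, sigma=2.0):
    """Scaled dot-product attention with a Gaussian locality bias.

    Q, K, V: (seq_len, d) arrays. The additive bias -(i - j)^2 / (2 * sigma^2)
    penalizes distant position pairs, so attention is biased toward neighbors.
    Here sigma (the window width) is fixed; the paper derives it dynamically.
    """
    seq_len, d = Q.shape
    scores = Q @ K.T / math.sqrt(d)               # vanilla self-attention logits
    pos = np.arange(seq_len)
    dist2 = (pos[:, None] - pos[None, :]) ** 2    # squared distance between positions
    scores = scores - dist2 / (2.0 * sigma ** 2)  # Gaussian bias: nearer tokens penalized less
    return softmax(scores, axis=-1) @ V

# toy usage: 5 tokens with 8-dimensional representations
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))
out = gaussian_biased_attention(X, X, X, sigma=1.5)
print(out.shape)  # (5, 8)
```

Because the bias is added to the logits before the softmax, the mechanism still attends globally (long-term dependencies survive when their dot-product scores are strong) while neighboring tokens receive a systematic boost, which is the trade-off the abstract describes.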
Funder
National Key Research and Development Program of China
Major Research Plan of the National Social Science Foundation of China
Publisher
Springer Science and Business Media LLC
Subject
Computational Mathematics, Engineering (miscellaneous), Information Systems, Artificial Intelligence
Cited by
6 articles.