Affiliation:
1. Civil Aviation University of China, Tianjin, China
Abstract
The attention mechanism is an increasingly important technique in natural language processing (NLP). In attention-based named entity recognition (NER) models, most attention mechanisms compute attention coefficients that express the importance of sentence-level semantic information but cannot adjust the position distribution of contextual feature vectors in the semantic space. To address this issue, a radial basis function attention (RBF-attention) layer is proposed to adaptively regulate the position distribution of sequence contextual feature vectors, minimizing the relative distance among named entities of the same category and maximizing the relative distance among named entities of different categories in the semantic space. Experimental results on the CoNLL2003 English and MSRA Chinese NER datasets indicate that the proposed model outperforms the baseline approaches without relying on any external feature engineering.
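The abstract does not give the layer's exact formulation, so the following is only a minimal sketch of one plausible reading: attention coefficients derived from an RBF kernel over distances between token feature vectors and learnable category centers, which pulls same-category tokens together in the semantic space. The function name, the per-category center parameterization, and the `gamma` width are all hypothetical, not taken from the paper.

```python
import numpy as np

def rbf_attention(H, centers, gamma=1.0):
    """Illustrative RBF-attention over contextual feature vectors.

    H:       (seq_len, d) contextual features (e.g. BiLSTM outputs).
    centers: (k, d) learnable RBF centers, one per entity category here.
    gamma:   assumed RBF width hyperparameter.
    """
    # Squared Euclidean distance from each token vector to each center:
    # dists[i, j] = ||H[i] - centers[j]||^2
    dists = ((H[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    # RBF kernel response: tokens near a center get a high coefficient.
    scores = np.exp(-gamma * dists)                       # (seq_len, k)
    weights = scores / scores.sum(axis=1, keepdims=True)  # per-token softmax-like norm
    # Re-position each token toward its most responsive centers,
    # shrinking within-category distances in the semantic space.
    return weights @ centers                              # (seq_len, d)

# Toy usage: 4 tokens, 8-dim features, 3 hypothetical entity categories.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))
centers = rng.normal(size=(3, 8))
print(rbf_attention(H, centers).shape)  # (4, 8)
```

In a trainable model the centers would be learned parameters updated alongside the rest of the network; the NumPy version above only illustrates the distance-based reweighting idea.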
Funder
Scientific Research Project of Tianjin Municipal Education Commission
National Natural Science Foundation of China
Publisher
Association for Computing Machinery (ACM)