Improving Low-Resource Chinese Named Entity Recognition Using Bidirectional Encoder Representation from Transformers and Lexicon Adapter
Published: 2023-09-27
Issue: 19
Volume: 13
Page: 10759
ISSN: 2076-3417
Container-title: Applied Sciences
Language: en
Short-container-title: Applied Sciences
Author:
Dang Xiaochao 1, Wang Li 2, Dong Xiaohui 2, Li Fenfang 2, Deng Han 2
Affiliation:
1. College of Computer Science & Engineering, Northwest Normal University, Lanzhou 730070, China
2. Gansu Province Internet of Things Engineering Research Centre, Northwest Normal University, Lanzhou 730070, China
Abstract
The integration of lexicon information with pre-trained models such as BERT, which combines their complementary strengths, has been widely adopted in Chinese sequence labeling tasks. However, because these models demand large amounts of training data, efforts have been made to improve their performance in low-resource scenarios. Certain specialized domains, such as agriculture, the industrial sector, and the metallurgical industry, currently suffer from data scarcity, so effective models for entity recognition under limited data availability are lacking. Motivated by this, we constructed a suitable small, balanced dataset and propose a domain-specific NER model. First, we build a domain dictionary from mine hoist equipment and fault texts and generate a dictionary tree to obtain word vector information. Second, we use a Lexicon Adapter to obtain the vectors of the dictionary words matched against the input characters, compute the weights between the character and word vectors, and integrate position encoding to strengthen the positional information of the word vectors. Finally, we incorporate the word vector information into the feature extraction layer to enhance the boundary information of domain entities and mitigate the semantic loss caused by using character feature representations alone. Experimental results on a manually annotated dataset of mine hoist fault texts show that this method outperforms BiLSTM, BiLSTM-CRF, BERT, BERT-BiLSTM-CRF, and LEBERT, effectively improving the accuracy of named entity recognition (NER) for mine hoist faults.
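The character-to-word matching step described in the abstract (building a dictionary tree from a domain lexicon, then finding the lexicon words that cover each character of a sentence) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `Trie` and `match_words` names and the example lexicon entries are hypothetical.

```python
class Trie:
    """A minimal dictionary tree over characters."""

    def __init__(self):
        self.children = {}
        self.is_word = False

    def insert(self, word):
        node = self
        for ch in word:
            node = node.children.setdefault(ch, Trie())
        node.is_word = True


def match_words(sentence, trie, max_len=4):
    """For each character position, collect the lexicon words whose
    span covers that position (character-to-word matching)."""
    matches = [[] for _ in sentence]
    for start in range(len(sentence)):
        node = trie
        for end in range(start, min(start + max_len, len(sentence))):
            node = node.children.get(sentence[end])
            if node is None:
                break  # no lexicon word continues with this character
            if node.is_word:
                word = sentence[start:end + 1]
                for pos in range(start, end + 1):
                    matches[pos].append(word)
    return matches


# Hypothetical domain lexicon entries ("hoist", "brake").
lexicon = ["提升机", "制动器"]
trie = Trie()
for w in lexicon:
    trie.insert(w)

matches = match_words("提升机制动器故障", trie)
# matches[0] → ["提升机"]; the last two characters match no lexicon word
```

In the full model, each matched word's vector would then be weighted against the character vector inside the Lexicon Adapter; the trie only supplies the candidate words and their spans.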
Funder
National Natural Science Foundation of China; Industrial Support Foundations of Gansu
Subject
Fluid Flow and Transfer Processes,Computer Science Applications,Process Chemistry and Technology,General Engineering,Instrumentation,General Materials Science
Cited by: 1 article.