Affiliation:
1. College of Electronic and Information Engineering, Inner Mongolia University, Hohhot 010021, China
Abstract
For Chinese text data in the solid rocket engine domain, traditional named entity recognition methods cannot learn both character-level features and contextual sequence information from the input text, and the advantages of dual-channel networks have received little research attention. To address this problem, this paper proposes a BERT-based dual-channel named entity recognition model for solid rocket engines. The model uses the BERT pre-trained language model to encode individual characters, obtaining a vector representation for each character. The dual-channel network consists of a CNN and a BiLSTM: the convolutional layer extracts local character features, while the BiLSTM layer captures sequential, context-dependent information from the text. The experimental results show that the proposed model performs well on the named entity recognition task using the solid rocket engine dataset. The accuracy, recall and F1-score were 85.40%, 87.70% and 86.53%, respectively, all higher than the results of the comparison models.
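As a rough illustration of the architecture the abstract describes (BERT character encoding feeding parallel CNN and BiLSTM channels), the PyTorch sketch below shows one plausible arrangement. It is a minimal sketch under stated assumptions: the hidden sizes, kernel size, tag count, concatenation-based fusion, and the plain linear tag classifier are illustrative choices, not values or design details taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class DualChannelNER(nn.Module):
    """Sketch of a BERT-based dual-channel (CNN + BiLSTM) character tagger.

    Hyperparameters and the fusion/classification head are assumptions
    made for illustration; they are not taken from the paper.
    """

    def __init__(self, bert_name="bert-base-chinese", num_tags=9,
                 cnn_channels=128, lstm_hidden=128, kernel_size=3):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size  # 768 for bert-base models

        # Channel 1: 1-D convolution over the character sequence,
        # capturing local n-gram style features.
        self.conv = nn.Conv1d(hidden, cnn_channels, kernel_size,
                              padding=kernel_size // 2)

        # Channel 2: BiLSTM capturing bidirectional, long-range
        # sequence context.
        self.bilstm = nn.LSTM(hidden, lstm_hidden, batch_first=True,
                              bidirectional=True)

        # Fuse the two channels and project to per-character tag scores.
        self.classifier = nn.Linear(cnn_channels + 2 * lstm_hidden, num_tags)

    def forward(self, input_ids, attention_mask):
        # Contextual character vectors from BERT: (batch, seq_len, hidden)
        x = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state

        # Conv1d expects (batch, channels, length), so transpose in and out.
        cnn_out = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        lstm_out, _ = self.bilstm(x)

        fused = torch.cat([cnn_out, lstm_out], dim=-1)
        return self.classifier(fused)  # (batch, seq_len, num_tags)
```

In practice, the per-character scores would be fed to a tagging loss (e.g. cross-entropy over BIO labels); whether the original model adds a CRF or other decoding layer on top is not stated in the abstract, so it is omitted here.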
Funder
Key Technology Research Plan Project of Inner Mongolia Autonomous Region
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering
Cited by
2 articles.