Multi-Meta Information Embedding Enhanced BERT for Chinese Mechanics Entity Recognition
Published: 2023-10-15
Issue: 20
Volume: 13
Page: 11325
ISSN: 2076-3417
Container-title: Applied Sciences
Language: en
Author:
Zhang Jiarong 1, Yuan Jinsha 1, Zhang Jing 2, Luo Zhihong 3, Li Aitong 4
Affiliation:
1. Department of Electronic and Communication Engineering, North China Electric Power University, Baoding 071003, China
2. Department of New Energy Power Technology Research, COMAC Beijing Aircraft Technology Research Institute, Beijing 102211, China
3. Department of Electric Power, Inner Mongolia University of Technology, Hohhot 010051, China
4. College of Economics, Bohai University, Jinzhou 121013, China
Abstract
The automatic extraction of key entities from mechanics problems is an important step toward solving such problems automatically. However, compared with open-domain Chinese text, mechanics problems contain many specialized terms and composite entities, which leads to low recognition performance. Although recent research shows that external information and pre-trained language models can each improve Chinese Named Entity Recognition (CNER), few efforts have combined the two to build high-performance algorithms for extracting mechanics entities. This article therefore proposes a Multi-Meta Information Embedding Enhanced Bidirectional Encoder Representation from Transformers (MMIEE-BERT) for recognizing entities in mechanics problems. The proposed method integrates lexical and radical information directly into the BERT layers through an information adapter layer (IAL). Firstly, reflecting the characteristics of Chinese, a Multi-Meta Information Embedding (MMIE) comprising character, lexical, and radical embeddings is proposed to enhance Chinese sentence representation. Secondly, an IAL is proposed to fuse these three embeddings into the lower layers of BERT. Thirdly, a Bidirectional Long Short-Term Memory (BiLSTM) network and a Conditional Random Field (CRF) model semantically encode the output of MMIEE-BERT and assign a label to each character. Finally, extensive experiments were carried out on a dataset built by our team and on widely used public datasets. The results demonstrate that the proposed method outperforms existing models in recognizing entities in mechanics problems, improving precision, recall, and F1 score. The method is expected to provide an automatic means of extracting key information from mechanics problems.
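The embedding-fusion idea described in the abstract can be illustrated with a minimal sketch. Note that all dimensions, weight names, and the gated-sum fusion below are illustrative assumptions, not the paper's exact IAL formulation: lexical and radical embeddings are projected into the character-embedding space and mixed into the character representation via a sigmoid gate before being fed to BERT's lower layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumptions for illustration only).
d_char, d_word, d_rad, seq_len = 768, 200, 50, 16

char_emb = rng.standard_normal((seq_len, d_char))  # BERT character embeddings
word_emb = rng.standard_normal((seq_len, d_word))  # matched-lexicon embeddings
rad_emb = rng.standard_normal((seq_len, d_rad))    # radical embeddings

# Project lexical and radical embeddings into the character space.
W_word = rng.standard_normal((d_word, d_char)) * 0.01
W_rad = rng.standard_normal((d_rad, d_char)) * 0.01
word_p = word_emb @ W_word
rad_p = rad_emb @ W_rad


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


# Gated fusion: a sigmoid gate controls how much external (lexical +
# radical) information each character position absorbs.
W_g = rng.standard_normal((3 * d_char, d_char)) * 0.01
gate = sigmoid(np.concatenate([char_emb, word_p, rad_p], axis=-1) @ W_g)
fused = char_emb + gate * (word_p + rad_p)  # passed on to BERT's lower layers

print(fused.shape)  # (16, 768)
```

In the full model, the fused representation would continue through the remaining BERT layers and then through the BiLSTM-CRF head, which assigns an entity label to each character.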
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science