Affiliation:
1. College of Air Traffic Management, Civil Aviation Flight University of China, Guanghan 618307, China
Abstract
In recent years, the emergence of large-scale pre-trained language models has made transfer learning practical in natural language processing, displacing the traditional model architectures based on recurrent neural networks (RNNs). In this study, we construct a multi-intention recognition model for air traffic control (ATC), Ernie-Gram_Bidirectional Gated Recurrent Unit (BiGRU)_Attention (EBA). First, the Ernie-Gram pre-trained model serves as the bottom layer of the overall architecture and encodes the text; the BiGRU module that follows performs further feature extraction on the encoded representations. Second, because keyword information is crucial in Chinese radiotelephony communications, an attention layer is added after the BiGRU module to extract keyword information. Finally, two fully connected (FC) layers perform feature-vector fusion and output the intention classification vector, respectively. We experimentally compare the effect of two tokenizers, the BERT tokenizer and the Jieba tokenizer, on the final performance of the BERT model. The results show that although the Jieba tokenizer takes word-level information into account, it performs worse than the BERT tokenizer. The final model reaches 98.2% accuracy on the intention recognition dataset of ATC instructions, which is 2.7% higher than the BERT baseline model and 0.7–3.1% higher than other improved models based on BERT.
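A minimal PyTorch sketch of the EBA pipeline described above (pre-trained encoder, then BiGRU, then an attention layer, then two FC layers) is shown below. This is an illustration under assumptions, not the authors' implementation: the checkpoint name, hidden sizes, number of intention classes, and the additive-style attention scoring are all assumptions made for the example.

```python
# Illustrative sketch of the EBA architecture (encoder -> BiGRU -> attention -> two FC layers).
# Checkpoint name and hyperparameters are assumptions, not the authors' settings.
import torch
import torch.nn as nn
from transformers import AutoModel

class EBA(nn.Module):
    def __init__(self, encoder_name="nghuyong/ernie-gram-zh",  # assumed Ernie-Gram checkpoint
                 gru_hidden=256, fc_hidden=256, num_intents=10):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)      # text encoding layer
        enc_dim = self.encoder.config.hidden_size
        self.bigru = nn.GRU(enc_dim, gru_hidden, batch_first=True,
                            bidirectional=True)                     # further feature extraction
        self.attn_score = nn.Linear(2 * gru_hidden, 1)              # token-level attention scores
        self.fc1 = nn.Linear(2 * gru_hidden, fc_hidden)             # feature-vector fusion
        self.fc2 = nn.Linear(fc_hidden, num_intents)                # intention classification vector

    def forward(self, input_ids, attention_mask):
        enc = self.encoder(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        h, _ = self.bigru(enc)                                      # (batch, seq_len, 2*gru_hidden)
        scores = self.attn_score(h).squeeze(-1)                     # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, -1e9)      # ignore padding positions
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        context = (weights * h).sum(dim=1)                          # keyword-weighted sentence vector
        return self.fc2(torch.relu(self.fc1(context)))              # intention logits
```

In practice, inputs would be produced by the tokenizer matching the chosen encoder (e.g., the subword tokenizer shipped with the checkpoint), which mirrors the abstract's finding that the BERT-style subword tokenizer outperformed Jieba word segmentation in this task.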
Funder
National Key R&D Program of China
National Natural Science Foundation of China
Safety Capacity Building Project of Civil Aviation Administration of China