Abstract
Enabling a computer to accurately analyze the emotional information and narrative background of characters in Qin opera is an open research problem. To promote the artistic inheritance of Qin opera and the dissemination of its cultural and emotional color, an emotion analysis model for Qin opera based on an attention residual network (ResNet) is presented. The neural network is improved and optimized in terms of the model, the learning rate, the number of network layers, and the network architecture itself, and multi-head attention is then added to the ResNet to strengthen the model's recognition ability. The convolutional neural network (CNN) is optimized in depth, and the fitting ability and stability of the model are enhanced through the ResNet architecture. Combined with the attention mechanism, the expression of each piece of weight information is strengthened. Introducing the multi-head attention mechanism into the model yields a multi-head attention ResNet, MHAtt_ResNet. This network structure effectively identifies spectrogram features, improves the weighting of those features, and strengthens the relationships between distant information in long time series. Experiments show that the proposed model achieves high emotion classification accuracy on Qin opera, and that classification performance improves as the size of the dataset grows.
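The abstract does not give implementation details for MHAtt_ResNet, but the core idea it describes, a residual connection combined with multi-head self-attention over spectrogram frames, can be sketched minimally. The following NumPy code is an illustrative sketch only: the function names, head count, and identity query/key/value projections are assumptions for brevity, not the paper's actual design.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads):
    """Simplified multi-head self-attention.

    x: (seq_len, d_model) array of spectrogram frame features.
    Each head attends over a slice of the feature dimension;
    identity projections are used instead of learned weights
    (an assumption made to keep the sketch self-contained).
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    heads = []
    for h in range(num_heads):
        q = k = v = x[:, h * d_head:(h + 1) * d_head]
        # Scaled dot-product attention: weights sum to 1 per frame.
        weights = softmax(q @ k.T / np.sqrt(d_head), axis=-1)
        heads.append(weights @ v)
    return np.concatenate(heads, axis=-1)

def residual_attention_block(x, num_heads=4):
    """Residual connection around the attention output, as in ResNet:
    the block's output is the input plus the attended features, which
    preserves the input signal and stabilizes training."""
    return x + multi_head_attention(x, num_heads)
```

Because the attention weights relate every frame to every other frame regardless of distance, this is one way such a block can link distant positions in a long time series, while the residual sum keeps the original spectrogram features intact.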