Author:
Biswas Dipto, Gil Joon-Min
Abstract
Deep learning techniques are essential in natural language processing because they model nonlinear relationships within complex data. In this study, Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) deep learning techniques are applied to the classification of research papers. We combine bidirectional LSTM (BiLSTM) and bidirectional GRU (BiGRU) with Convolutional Neural Networks (CNN) to boost classification performance for a research-paper recommendation system. Our method also uses word embeddings to classify and recommend research papers. We therefore evaluate six models: LSTM, GRU, CNN with LSTM, CNN with GRU, CNN with BiLSTM, and CNN with BiGRU. All models use pre-trained Word2Vec embeddings, trained with both the continuous bag-of-words (CBOW) and skip-gram (Sg) methods, and their performance is compared on the FGCS dataset. The results show that the combined CNN models achieve better accuracy and F1-score than the basic LSTM and GRU models. A more in-depth analysis shows that the CNN with BiLSTM and CNN with BiGRU models outperform the CNN with LSTM and CNN with GRU models. Furthermore, the CBOW Word2Vec embeddings consistently yield better performance for the combined CNN models than the Sg embeddings.
Publisher
Journal of Internet Technology
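To illustrate the architecture family the abstract describes (pre-trained word embeddings, a convolutional layer extracting local n-gram features, and a bidirectional recurrent layer summarizing the sequence), here is a minimal NumPy forward-pass sketch. All sizes, weights, and the simple tanh recurrence (a stand-in for a GRU/LSTM cell) are illustrative assumptions, not the paper's actual model or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): vocab, embed dim, conv filters,
# hidden units, conv kernel width.
V, E, F, H, K = 100, 8, 6, 4, 3

embedding = rng.normal(size=(V, E))   # stand-in for pre-trained Word2Vec vectors
W_conv = rng.normal(size=(K, E, F))   # 1D convolution kernel over the token axis
W_x = rng.normal(size=(F, H))         # input weights of a simple recurrent cell
W_h = rng.normal(size=(H, H))         # recurrent weights (tanh RNN stands in for GRU)

def conv1d(X):
    """Valid 1D convolution over the sequence axis, followed by ReLU."""
    T = X.shape[0] - K + 1
    out = np.stack([np.tensordot(X[t:t + K], W_conv, axes=([0, 1], [0, 1]))
                    for t in range(T)])
    return np.maximum(out, 0.0)

def rnn_pass(X, reverse=False):
    """One recurrent direction; returns the final hidden state."""
    h = np.zeros(H)
    steps = reversed(range(X.shape[0])) if reverse else range(X.shape[0])
    for t in steps:
        h = np.tanh(X[t] @ W_x + h @ W_h)
    return h

tokens = rng.integers(0, V, size=10)  # a toy 10-token input sequence
X = embedding[tokens]                 # (10, E) embedded sequence
C = conv1d(X)                         # (8, F) local n-gram features
# Bidirectional summary: concatenate forward and backward final states.
feat = np.concatenate([rnn_pass(C), rnn_pass(C, reverse=True)])
print(feat.shape)                     # (8,) = 2 * H
```

A classifier head (e.g. a softmax over paper categories) would consume `feat`; swapping `rnn_pass` for a gated cell gives the CNN-with-BiGRU / CNN-with-BiLSTM variants the abstract compares.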