Publisher
Springer Science and Business Media LLC
Cited by
4 articles.