Authors:
Zhao Xiuhao, Li Zhao, Zhang Xianming, Wang Jibin, Chen Tong, Ju Zhengyu, Wang Canjun, Zhang Chao, Zhan Yiming
Publisher:
Springer Nature Switzerland
References: 17 articles.
1. Beltagy, I., Lo, K., Cohan, A.: SciBERT: a pretrained language model for scientific text. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pp. 3615–3620 (2019)
2. Caruana, R.: Multitask learning. Mach. Learn. 28(1), 41–75 (1997)
3. Chen, Y.: Convolutional neural network for sentence classification. Master’s thesis, University of Waterloo (2015)
4. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
5. Fox, E.A., Akscyn, R.M., Furuta, R.K., Leggett, J.J.: Digital libraries. Commun. ACM 38(4), 22–28 (1995)
Cited by:
1 article.
1. BCC: Bidirectional Consistency Constraint Method for Hierarchical Text Classification; ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP); 2024-04-14