Authors: Saed Rezayi, Haixing Dai, Zhengliang Liu, Zihao Wu, Akarsh Hebbar, Andrew H. Burns, Lin Zhao, Dajiang Zhu, Quanzheng Li, Wei Liu, Sheng Li, Tianming Liu, Xiang Li
Publisher: Springer Nature Switzerland
Cited by: 11 articles