Authors:
Mohamad Al Sayed, Adrian M. P. Braşoveanu, Lyndon J. B. Nixon, Arno Scharl
Publisher:
Springer Nature Switzerland
References: 26 articles.
1. Abdelrazek, A., et al.: Topic modeling algorithms and applications: a survey. Inf. Syst. 112, 102131 (2023). https://doi.org/10.1016/j.is.2022.102131
2. Brown, T.B., et al.: Language models are few-shot learners. In: Larochelle, H., et al. (eds.) Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, NeurIPS 2020, virtual, 6–12 December 2020. https://arxiv.org/abs/2005.14165
3. Clavié, B., et al.: Lecture Notes in Computer Science (2023)
4. Devlin, J., et al.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Burstein, J., Doran, C., Solorio, T. (eds.) Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2019, Minneapolis, MN, USA, 2–7 June 2019, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics (2019). https://doi.org/10.18653/v1/n19-1423
5. Ferrell, B.J., et al.: Attention-based models for classifying small data sets using community-engaged research protocols: classification system development and validation pilot study. JMIR Formative Res. 6(9), e32460 (2022). https://doi.org/10.2196/32460
Cited by: 1 article.