Authors: Anastasios Lamproudis, Aron Henriksson
Publisher: Springer Nature Switzerland
References: 26 articles
1. Beltagy, I., Lo, K., Cohan, A.: SciBERT: a pretrained language model for scientific text. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pp. 3606–3611 (2019)
2. Dalianis, H., Henriksson, A., Kvist, M., Velupillai, S., Weegar, R.: HEALTH BANK - a workbench for data science applications in healthcare. In: CEUR Workshop Proceedings Industry Track Workshop, pp. 1–18 (2015). http://ceur-ws.org/Vol-1381/paper1.pdf
3. Dalianis, H., Velupillai, S.: De-identifying Swedish clinical text - refinement of a gold standard and experiments with conditional random fields. J. Biomed. Semant. 1(1), 6 (2010). https://doi.org/10.1186/2041-1480-1-6
4. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423, https://aclanthology.org/N19-1423
5. El Boukkouri, H., Ferret, O., Lavergne, T., Zweigenbaum, P.: Re-train or train from scratch? Comparing pre-training strategies of BERT in the medical domain. In: LREC 2022, pp. 2626–2633 (2022)
Cited by: 1 article