1. Alsentzer, E., et al.: Publicly available clinical BERT embeddings. arXiv preprint arXiv:1904.03323 (2019)
2. Beltagy, I., Cohan, A., Lo, K.: SciBERT: pretrained contextualized embeddings for scientific text. arXiv preprint arXiv:1903.10676 (2019)
3. BioASQ: Participants Area, May 2019. http://participants-area.bioasq.org/results/7b/phaseB/
4. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
5. Dimitriadis, D., Tsoumakas, G.: Word embeddings and external resources for answer processing in biomedical factoid question answering. J. Biomed. Inform. 92, 103118 (2019)