Funder
Natural Sciences and Engineering Research Council of Canada
York University