Authors:
Carlos Abel Córdova Sáenz, Karin Becker
Funders:
Conselho Nacional de Desenvolvimento Científico e Tecnológico
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul
Publisher:
Springer Science and Business Media LLC
Subjects:
Artificial Intelligence, Hardware and Architecture, Human-Computer Interaction, Information Systems, Software
References (37 articles):
1. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I (2017) Attention is all you need. In: Proceedings of the 31st international conference on neural information processing systems (NIPS’17), pp 6000–6010
2. Devlin J, Chang M, Lee K, Toutanova K (2019) BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 conference of the North American chapter of the association for computational linguistics: human language technologies, (NAACL-HLT), pp 4171–4186
3. Tenney I, Das D, Pavlick E (2019) BERT rediscovers the classical NLP pipeline. In: Proceedings of the 57th annual meeting of the association for computational linguistics, pp 4593–4601. https://doi.org/10.18653/v1/P19-1452
4. Rogers A, Kovaleva O, Rumshisky A (2020) A primer in BERTology: what we know about how BERT works. Trans Assoc Comput Linguist 8:842–866. https://doi.org/10.1162/tacl_a_00349
5. Ventura F, Greco S, Apiletti D, Cerquitelli T (2022) Trusting deep learning natural-language models via local and global explanations. Knowl Inf Syst 64(7):1863–1907. https://doi.org/10.1007/s10115-022-01690-9
Cited by:
1 article.