Publisher: Springer International Publishing
Cited by: 4 articles.