Authors:
Azroumahli Chaimae, Elyounoussi Yacine, Badir Hassan
Publisher:
Springer Nature Switzerland
Cited by:
1 article