Authors: Liu Ye, Maier Wolfgang, Minker Wolfgang, Ultes Stefan
Publisher: Springer Nature Singapore
Cited by: 6 articles