1. Devlin, J., Chang, M., Lee, K., and Toutanova, K. (2019, June 2–7). BERT: Pre-training of deep bidirectional transformers for language understanding. Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Minneapolis, MN, USA.
2. Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., Levy, O., Lewis, M., Zettlemoyer, L., and Stoyanov, V. (2019). RoBERTa: A robustly optimized BERT pretraining approach. arXiv.
3. Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., and Sutskever, I. (2019). Language models are unsupervised multitask learners. OpenAI Blog.
4. Gervás, P. (2000, April 17–20). WASP: Evaluation of different strategies for the automatic generation of Spanish verse. Proceedings of the AISB'00 Symposium on Creative & Cultural Aspects of AI, Birmingham, UK.
5. Oliveira, H.G. (2012, August 27). PoeTryMe: A versatile platform for poetry generation. Proceedings of the ECAI 2012 Workshop on Computational Creativity, Concept Invention, and General Intelligence (C3GI), Montpellier, France.