1. Bhunia, A.K., et al.: Pixelor: a competitive sketching AI agent. So you think you can sketch? ACM Trans. Graph. (TOG) 39(6), 1–15 (2020)
2. Brown, T., et al.: Language models are few-shot learners. Adv. Neural Inf. Process. Syst. 33, 1877–1901 (2020)
3. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
4. Fabi, S., Otte, S., Scholz, F., Wührer, J., Karlbauer, M., Butz, M.V.: Extending the omniglot challenge: imitating handwriting styles on a new sequential dataset. IEEE Trans. Cogn. Dev. Syst. 15, 896–903 (2022)
5. Feinman, R., Lake, B.M.: Learning task-general representations with generative neuro-symbolic modeling. arXiv preprint arXiv:2006.14448 (2020)