1. Ahn, M., et al.: Do as I can, not as I say: grounding language in robotic affordances. arXiv:2204.01691 (2022)
2. Black, S., et al.: GPT-NeoX-20B: an open-source autoregressive language model. In: Proceedings of the ACL Workshop on Challenges & Perspectives in Creating Large Language Models (2022)
3. Brown, T., et al.: Language models are few-shot learners. Adv. Neural Inf. Process. Syst. 33, 1877–1901 (2020)
4. Cattell, R.: Handbook for the Sixteen Personality Factor Questionnaire (16 PF): in clinical, educational, industrial, and research psychology, for use with all forms of the test (1970)
5. Chowdhery, A., et al.: PaLM: scaling language modeling with pathways. arXiv:2204.02311 (2022)