1. Aly, R., Acharya, S., Ossa, A., Köhn, A., Biemann, C., Panchenko, A.: Every child should have parents: a taxonomy refinement algorithm based on hyperbolic term embeddings. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, pp. 4811–4817. Association for Computational Linguistics (2019). https://doi.org/10.18653/v1/P19-1474. https://aclanthology.org/P19-1474
2. Brown, T.B., et al.: Language models are few-shot learners. In: Larochelle, H., Ranzato, M., Hadsell, R., Balcan, M., Lin, H. (eds.) Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, NeurIPS 2020, 6–12 December 2020, virtual (2020)
3. Chang, D., Lin, E., Brandt, C., Taylor, R.: Incorporating domain knowledge into language models using graph convolutional networks for clinical semantic textual similarity (preprint). JMIR Med. Inform. (2020). https://doi.org/10.2196/23101
4. Chung, H.W., et al.: Scaling instruction-finetuned language models. arXiv preprint arXiv:2210.11416 (2022)
5. Conover, M., et al.: Free Dolly: introducing the world's first truly open instruction-tuned LLM (2023). https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm