Publisher
Springer Nature Singapore