1. Anthropic. 2023. Introducing Claude.
2. Jiamu Bai, Daoyuan Chen, Bingchen Qian, Liuyi Yao, and Yaliang Li. 2024. Federated Fine-tuning of Large Language Models under Heterogeneous Language Tasks and Client Resources. arXiv preprint arXiv:2402.11505 (2024).
3. Keith Bonawitz, Hubert Eichner, Wolfgang Grieskamp, et al. 2019. Towards Federated Learning at Scale: System Design. In Proc. of Machine Learning and Systems (MLSys'19), Vol. 1. 374--388.
4. Tom Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared D. Kaplan, Prafulla Dhariwal, Arvind Neelakantan, Pranav Shyam, Girish Sastry, Amanda Askell, et al. 2020. Language Models are Few-Shot Learners. In Advances in Neural Information Processing Systems (NeurIPS'20), Vol. 33. 1877--1901.