1. P. Lewis et al., “Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks,” May 2020. [Online]. Available: http://arxiv.org/abs/2005.11401
2. Z. Levonian et al., “Retrieval-Augmented Generation to Improve Math Question-Answering: Trade-offs Between Groundedness and Human Preference,” Oct. 2023. [Online]. Available: http://arxiv.org/abs/2310.03184
3. W. E. Thompson et al., “Large Language Models with Retrieval-Augmented Generation for Zero-Shot Disease Phenotyping,” Dec. 2023. [Online]. Available: http://arxiv.org/abs/2312.06457
4. E. J. Hu et al., “LoRA: Low-Rank Adaptation of Large Language Models,” Jun. 2021. [Online]. Available: http://arxiv.org/abs/2106.09685
5. A. Q. Jiang et al., “Mistral 7B,” Oct. 2023. [Online]. Available: http://arxiv.org/abs/2310.06825