1. BERT: Pre-training of deep bidirectional transformers for language understanding;Devlin,2019
2. LLaMA: Open and efficient foundation language models;Touvron,2023
3. Robust speech recognition via large-scale weak supervision;Radford,2022
4. Neural network quantization in federated learning at the edge
5. LoRA: Low-rank adaptation of large language models;Hu,2021