1. Language Models Are Few-Shot Learners;Brown Tom;Advances in NeurIPS,2020
2. Hongjie Cai, Nan Song, Zengzhi Wang, Qiming Xie, Qiankun Zhao, Ke Li, Siwei Wu, Shijie Liu, Jianfei Yu, and Rui Xia. 2023. MEMD-ABSA: A Multi-Element Multi-Domain Dataset for Aspect-Based Sentiment Analysis. arXiv preprint arXiv:2306.16956 , Vol. 1 (2023), 1--14.
3. Zekai Chen, Mariann Micsinai Balan, and Kevin Brown. 2023. Language Models Are Few-Shot Learners for Prognostic Prediction. arXiv preprint arXiv:2302.12692 , Vol. 1 (2023), 1--9.
4. Junyoung Chung, Caglar Gulcehre, KyungHyun Cho, and Yoshua Bengio. 2014. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling. Advances in NeurIPS Workshop on Deep Learning , Vol. 1 (2014), 1--9.
5. Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2019. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In Proceedings of NAACL. Association for Computational Linguistics, Minneapolis, Minnesota, 4171--4186.