1. Breiman, L. (2001) Random forests. Machine Learning, 45(1), 5–32.
2. Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J., Dhariwal, P., Neelakantan, A., Shyam, P., Sastry, G., Askell, A., Agarwal, S., Herbert-Voss, A., Krueger, G., Henighan, T., Child, R., Ramesh, A., Ziegler, D., Wu, J., Winter, C., Hesse, C., Chen, M., Sigler, E., Litwin, M., Gray, S., Chess, B., Clark, J., Berner, C., McCandlish, S., Radford, A., Sutskever, I., & Amodei, D. (2020) Language models are few-shot learners. In Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, NeurIPS 2020, December 6-12, 2020, virtual.
3. Cer, D., Diab, M., Agirre, E., Lopez-Gazpio, I., & Specia, L. (2017) SemEval-2017 task 1: Semantic textual similarity - multilingual and cross-lingual focused evaluation. arXiv preprint arXiv:1708.00055.
4. Chen, D., & Manning, C. D. (2014) A fast and accurate dependency parser using neural networks. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, EMNLP 2014, October 25-29, 2014, Doha, Qatar (pp. 740–750).
5. Chen, L., Lv, B., Wang, C., Zhu, S., Tan, B., & Yu, K. (2020a) Schema-guided multi-domain dialogue state tracking with graph attention neural networks. In The Thirty-Fourth AAAI Conference on Artificial Intelligence, AAAI 2020, February 7-12, 2020, New York, NY, USA (pp. 7521–7528).