Authors:
Wang Yuhua, Hu Junying, Su Yongli, Zhang Bo, Sun Kai, Zhang Hai
Funders:
China Postdoctoral Science Foundation
National Natural Science Foundation of China
References (27 articles)
1. Chen, 2023. Improving BERT with local context comprehension for multi-turn response selection in retrieval-based dialogue systems. Comput. Speech Lang.
2. Devlin, J., Chang, M.W., Lee, K., et al., 2019. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In: Proceedings of NAACL-HLT. pp. 4171–4186.
3. Hendrickx, 2010. SemEval-2010 Task 8: Multi-way classification of semantic relations between pairs of nominals. ACL.
4. Howard, J., Ruder, S., 2018. Universal Language Model Fine-tuning for Text Classification. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). pp. 328–339.
5. Huang, 2015. Bidirectional LSTM-CRF models for sequence tagging.