Authors:
Zhou Hao, Hou Jin, Wu Tao, Li Xuemei, Chen Yan, Mao Guobin, Yin Huiyang
References (16 articles):
1. Distributed Representations of Words and Phrases and their Compositionality; Mikolov, 2013
2. Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
3. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding; Devlin, 2018
4. RoBERTa: A Robustly Optimized BERT Pretraining Approach; Liu, 2019
5. XLNet: Generalized Autoregressive Pretraining for Language Understanding; Yang, 2019