1. Zihang Dai, Zhilin Yang, Yiming Yang, Jaime G. Carbonell, Quoc V. Le, and Ruslan Salakhutdinov. 2019. Transformer-XL: Attentive Language Models beyond a Fixed-Length Context. arXiv preprint arXiv:1901.02860 (2019).
2. Chuanhai Dong, Jiajun Zhang, Chengqing Zong, Masanori Hattori, and Hui Di. 2016. Character-Based LSTM-CRF with Radical-Level Features for Chinese Named Entity Recognition. In Natural Language Understanding and Intelligent Applications, Chin-Yew Lin, Nianwen Xue, Dongyan Zhao, Xuanjing Huang, and Yansong Feng (Eds.). Springer International Publishing, Cham, 239–250.
3. A Transformer-Based Longer Entity Attention Model for Chinese Named Entity Recognition in Aerospace.
4. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence.