1. Yubo Chen, Liheng Xu, Kang Liu, Daojian Zeng, and Jun Zhao. 2015. Event Extraction via Dynamic Multi-Pooling Convolutional Neural Networks. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Association for Computational Linguistics, Beijing, China, 167–176. https://doi.org/10.3115/v1/P15-1017
2. Yubo Chen, Hang Yang, Kang Liu, Jun Zhao, and Yantao Jia. 2018. Collective Event Detection via a Hierarchical and Bias Tagging Networks with Gated Multi-level Attention Mechanisms. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, Brussels, Belgium.
3. Shiyao Cui, Bowen Yu, Tingwen Liu, Zhenyu Zhang, Xuebin Wang, and Jinqiao Shi. 2020. Edge-Enhanced Graph Convolution Networks for Event Detection with Syntactic Relation. arXiv:2002.10757 [cs] (Sept. 2020). http://arxiv.org/abs/2002.10757
4. Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Shijin Wang, and Guoping Hu. 2020. Revisiting Pre-Trained Models for Chinese Natural Language Processing. In Findings of the Association for Computational Linguistics: EMNLP 2020. Association for Computational Linguistics, Online.
5. Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2019. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv:1810.04805 [cs] (May 2019). http://arxiv.org/abs/1810.04805