Authors:
Zhu Tong, Zhang Guoliang, Li Zechang, Yu Zijian, Ren Junfei, Wu Mengsong, Wang Zhefeng, Huai Baoxing, Chao Pingfu, Chen Wenliang
Publisher:
Springer Nature Switzerland
References (21 articles):
1. Chen, Y., Liu, S., Zhang, X., Liu, K., Zhao, J.: Automatically labeled data generation for large scale event extraction. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 409–419. Association for Computational Linguistics, Vancouver (2017). https://doi.org/10.18653/v1/P17-1038, https://aclanthology.org/P17-1038
2. Cui, Y., Che, W., Liu, T., Qin, B., Yang, Z.: Pre-training with whole word masking for Chinese BERT. IEEE/ACM Trans. Audio Speech Lang. Process. 29, 3504–3514 (2021). https://doi.org/10.1109/TASLP.2021.3124365
3. Dai, X., Karimi, S., Hachey, B., Paris, C.: An effective transition-based model for discontinuous NER. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 5860–5870. Association for Computational Linguistics (2020). https://doi.org/10.18653/v1/2020.acl-main.520, https://aclanthology.org/2020.acl-main.520
4. Dozat, T., Manning, C.D.: Deep biaffine attention for neural dependency parsing. In: 5th International Conference on Learning Representations, ICLR 2017, Toulon, France, April 24–26, 2017, Conference Track Proceedings. OpenReview.net (2017), https://openreview.net/forum?id=Hk95PK9le
5. Dyer, C., Ballesteros, M., Ling, W., Matthews, A., Smith, N.A.: Transition-based dependency parsing with stack long short-term memory. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pp. 334–343. Association for Computational Linguistics, Beijing (2015). https://doi.org/10.3115/v1/P15-1033, https://aclanthology.org/P15-1033