1. Wasi Uddin Ahmad, Nanyun Peng, and Kai-Wei Chang. 2021. GATE: Graph Attention Transformer Encoder for Cross-lingual Relation and Event Extraction. In The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21).
2. Yubo Chen, Liheng Xu, Kang Liu, Daojian Zeng, and Jun Zhao. 2015. Event Extraction via Dynamic Multi-Pooling Convolutional Neural Networks. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Association for Computational Linguistics, Beijing, China, 167–176. https://doi.org/10.3115/v1/P15-1017
3. Zheng Chen and Heng Ji. 2009. Can One Language Bootstrap the Other: A Case Study on Event Extraction. In Proceedings of the NAACL HLT 2009 Workshop on Semi-supervised Learning for Natural Language Processing. Association for Computational Linguistics, Boulder, Colorado, 66–74.
4. Marco Cuturi. 2013. Sinkhorn Distances: Lightspeed Computation of Optimal Transport. Advances in Neural Information Processing Systems 26 (2013), 2292–2300.
5. Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2019. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Association for Computational Linguistics, Minneapolis, Minnesota, 4171–4186. https://doi.org/10.18653/v1/N19-1423