1. Bonifacio, L.H.: Lecture Notes in Computer Science (Lecture Notes in Artificial Intelligence) (2020)
2. Chelba, C., Mahajan, M.: Information extraction using the structured language model. CoRR cs.CL/0108023 (2001)
3. Devlin, J., Chang, M., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. CoRR abs/1810.04805 (2018)
4. Du, X., Rush, A.M., Cardie, C.: Document-level event-based extraction using generative template-filling transformers. CoRR abs/2008.09249 (2020)
5. Du, X., Rush, A.M., Cardie, C.: Template filling with generative transformers. In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2021, Online, 6–11 June 2021 (2021)