1. Obayashi. Annotated Question and Answer Dataset for Security Export Control. 2021.
2. Devlin J, Chang MW, Lee K, Toutanova K. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805. 2018 Oct 11.
3. Liu Y, Ott M, Goyal N, Du J, Joshi M, Chen D, Levy O, Lewis M, Zettlemoyer L, Stoyanov V. RoBERTa: A robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692. 2019 Jul 26.
4. Brown TB, Mann B, Ryder N, et al. Language models are few-shot learners. Advances in Neural Information Processing Systems. 2020;33:1877-901.
5. Sovrano F, Palmirani M, Vitali F. Legal knowledge extraction for knowledge graph based question-answering. In Legal Knowledge and Information Systems 2020 (pp. 143-153). IOS Press.