1. Lewis, M., Liu, Y., Goyal, N., Ghazvininejad, M., Mohamed, A., Levy, O., et al.: BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL), pp. 7871–7880 (2020)
2. Carmo, D., Piau, M., Campiotti, I., Nogueira, R., Lotufo, R.: PTT5: Pretraining and validating the T5 model on Brazilian Portuguese data, pp. 1–12 (2020). Available from: http://arxiv.org/abs/2008.09144
3. Akyon, F.C., Cavusoglu, D., Cengiz, C., Altinuc, S.O., Temizel, A.: Automated question generation and question answering from Turkish texts using text-to-text transformers, pp. 1–14 (2021). Available from: http://arxiv.org/abs/2111.06476
4. Madane, S., Bhura, S.: Traffic surveillance: theoretical survey of video motion detection. Int. J. Sci. Technol. Eng. 2(8), pp. 207–211 (2016)
5. Lopez, L.E., Cruz, D.K., Cruz, J.C.B., Cheng, C.: Simplifying paragraph-level question generation via transformer language models. In: Lect. Notes Comput. Sci. (LNAI), vol. 13032, pp. 323–334 (2021)