Publisher: Springer Nature Switzerland
References (20 articles)
1. Bai, J., Lu, F., Zhang, K., et al.: ONNX: Open Neural Network Exchange (2019). https://github.com/onnx/onnx
2. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding (2018). arXiv:1810.04805
3. Gniewkowski, M., Walkowiak, T.: Assessment of document similarity visualisation methods. In: Vetulani, Z., Paroubek, P., Kubis, M. (eds.) Human Language Technology. Challenges for Computer Science and Linguistics, pp. 348–363. Springer International Publishing, Cham (2022)
4. Klamra, C., Wojdyga, G., Żurowski, S., Rosalska, P., Kozłowska, M., Ogrodniczuk, M.: Devulgarization of Polish texts using pre-trained language models. In: Groen, D., de Mulatier, C., Paszynski, M., Krzhizhanovskaya, V.V., Dongarra, J.J., Sloot, P.M.A. (eds.) Computational Science - ICCS 2022, pp. 49–55. Springer International Publishing, Cham (2022)
5. Lewis, M., Liu, Y., Goyal, N., Ghazvininejad, M., Mohamed, A., Levy, O., Stoyanov, V., Zettlemoyer, L.: BART: denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 7871–7880. Association for Computational Linguistics (Jul. 2020). https://aclanthology.org/2020.acl-main.703