1. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, vol. 1, pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019)
2. Gereme, F., Zhu, W., Ayall, T., Alemu, D.: Combating fake news in “low-resource” languages: Amharic fake news detection accompanied by resource crafting. Information 12(1), 20 (2021). https://doi.org/10.3390/info12010020
3. Shu, K., Sliva, A., Wang, S., Tang, J., Liu, H.: Fake news detection on social media: a data mining perspective. ACM SIGKDD Explor. Newsl. 19(1), 22–36 (2017). https://doi.org/10.1145/3137597.3137600
4. Rai, N., Kumar, D., Kaushik, N., Raj, C., Ali, A.: Fake news classification using transformer-based enhanced LSTM and BERT. Int. J. Cogn. Comput. Eng. 3, 98–105 (2022)
5. Juarto, B., Yulianto: Indonesian news classification using IndoBert. Int. J. Intell. Syst. Appl. Eng. 11(2), 454–460 (2023)