Authors:
Alharbi Khalil, Haq Mohd Anul
Abstract
This study investigates the effectiveness of the DistilBERT model in classifying disaster-related tweets. Through a comprehensive analysis of the dataset and iterative refinement of the model, including hyperparameter adjustments, significant predictive accuracy was achieved. The resulting benchmark model highlights the advantages of DistilBERT: its reduced size and faster processing yield greater computational efficiency while retaining over 95% of BERT's capabilities. The model reached an average training accuracy of 92.42% and a validation accuracy of 82.11%, demonstrating the practical value of DistilBERT for emergency management and disaster response. These findings underscore the potential of advanced transformer models for analyzing social media data, contributing to improved public safety and emergency preparedness.
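For readers who want to see the general approach in code, the sketch below shows one plausible way to fine-tune DistilBERT for binary disaster-tweet classification with the Hugging Face Transformers and Datasets libraries. The file name train.csv, the text/label column names, and all hyperparameter values are illustrative assumptions, not the authors' exact configuration or tuned settings.

# Minimal sketch: fine-tuning DistilBERT for binary disaster-tweet
# classification. Data file, column names, and hyperparameters are
# illustrative assumptions, not the paper's exact setup.
import numpy as np
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "distilbert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Hypothetical CSV with a "text" column (tweet) and a "label" column
# (1 = disaster, 0 = not disaster); 20% is held out for validation.
dataset = load_dataset("csv", data_files={"train": "train.csv"})["train"]
dataset = dataset.train_test_split(test_size=0.2, seed=42)

def tokenize(batch):
    # Tweets are short, so 128 tokens comfortably covers them after truncation.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

def accuracy(eval_pred):
    # Simple accuracy over the held-out split.
    logits, labels = eval_pred
    return {"accuracy": float((np.argmax(logits, axis=-1) == labels).mean())}

args = TrainingArguments(
    output_dir="distilbert-disaster",
    num_train_epochs=3,               # illustrative; tune against validation accuracy
    per_device_train_batch_size=16,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    compute_metrics=accuracy,
)
trainer.train()
print(trainer.evaluate())  # reports validation accuracy after fine-tuning

The distilbert-base-uncased checkpoint is the standard pretrained DistilBERT; any of its hyperparameters here would be tuned iteratively, which is the refinement process the abstract describes in general terms.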
Publisher
Engineering, Technology & Applied Science Research
References (23 articles)
1. R. Prasad, A. U. Udeme, S. Misra, and H. Bisallah, "Identification and classification of transportation disaster tweets using improved bidirectional encoder representations from transformers," International Journal of Information Management Data Insights, vol. 3, no. 1, Apr. 2023, Art. no. 100154.
2. A. E. Yüksel, Y. A. Türkmen, A. Özgür, and B. Altınel, "Turkish Tweet Classification with Transformer Encoder," in Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), Varna, Bulgaria, Sep. 2019, pp. 1380–1387.
3. V. Porvatov and N. Semenova, "5q032e@SMM4H’22: Transformer-based classification of premise in tweets related to COVID-19," arXiv, Oct. 15, 2023.
4. V. Balakrishnan et al., "A Comprehensive Analysis of Transformer-Deep Neural Network Models in Twitter Disaster Detection," Mathematics, vol. 10, no. 24, Jan. 2022, Art. no. 4664.
5. C. Wang, P. Nulty, and D. Lillis, "Transformer-based Multi-task Learning for Disaster Tweet Categorisation," arXiv, Oct. 15, 2021.