Affiliation:
1. Centro de Investigación en Computación (CIC), Instituto Politécnico Nacional (IPN), Mexico City 07738, Mexico
Abstract
In this paper, we analyze the performance of different transformer models for regret and hope speech detection on two novel datasets. For the regret detection task, we compared the averaged macro F1-scores of the transformer models against the previous state-of-the-art results and found that the transformer models outperformed the previous approaches. Specifically, the RoBERTa-based model achieved the highest averaged macro F1-score of 0.83, beating the previous state-of-the-art score of 0.76. For the hope speech detection task, the BERT-base-uncased model achieved the highest averaged macro F1-score of 0.72 among the transformer models, although the performance of each model varied slightly depending on the task and dataset. Our findings highlight the effectiveness of transformer models for regret and hope speech detection, and the importance of considering the effects of context, specific transformer architectures, and pre-training on their performance.
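The evaluation metric above, the averaged macro F1-score, averages the per-class F1-scores so that minority classes weigh as much as majority classes. A minimal sketch of this computation in plain Python (the function name and example labels are illustrative, not from the paper):

```python
def macro_f1(y_true, y_pred):
    """Compute the macro-averaged F1-score over all classes present
    in the true or predicted labels."""
    labels = sorted(set(y_true) | set(y_pred))
    f1_scores = []
    for c in labels:
        # Per-class counts: true positives, false positives, false negatives.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
        f1_scores.append(f1)
    # Macro average: unweighted mean over classes.
    return sum(f1_scores) / len(f1_scores)

# Illustrative binary example (e.g. regret vs. no-regret labels):
score = macro_f1([0, 0, 1, 1], [0, 1, 1, 1])
```

In practice, `sklearn.metrics.f1_score(y_true, y_pred, average="macro")` gives the same result; the hand-rolled version is shown only to make the averaging explicit.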
Funder
Mexican Government
Secretaría de Investigación y Posgrado of the Instituto Politécnico Nacional, Mexico
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
References (27 articles; first 5 shown)
1. Gilovich, T.; Medvec, V.H. The Experience of Regret: What, When, and Why. Psychol. Rev., 1995.
2. Hattiangadi, N. Failing to act: Regrets of Terman's geniuses. Int. J. Aging Hum. Dev., 1995.
3. Diecidue, E. Regret theory: A new foundation. J. Econ. Theory, 2017.
4. Balouchzahi, F.; Butt, S.; Sidorov, G.; Gelbukh, A. ReDDIT: Regret Detection and Domain Identification from Text. arXiv, 2022.
5. Pennington, J.; Socher, R.; Manning, C.D. GloVe: Global Vectors for Word Representation. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), Doha, Qatar, 25–29 October 2014.
Cited by
4 articles.