1. Dogu Araci. 2019. FinBERT: Financial Sentiment Analysis with Pre-trained Language Models. arXiv:1908.10063 [cs.CL] https://arxiv.org/abs/1908.10063
2. Ankush Chopra and Sohom Ghosh. 2021. Term Expansion and FinBERT fine-tuning for Hypernym and Synonym Ranking of Financial Terms. In Proceedings of the Third Workshop on Financial Technology and Natural Language Processing (FinNLP@IJCAI 2021). Online, 46–51. https://aclanthology.org/2021.finnlp-1.8
3. Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2019. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Association for Computational Linguistics, Minneapolis, Minnesota, 4171–4186. https://doi.org/10.18653/v1/N19-1423
4. Detecting context-based in-claim numerals in Financial Earnings Conference Calls
5. FiNCAT-2: An enhanced Financial Numeral Claim Analysis Tool