1. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, vol. 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
2. Ambalavan, A.K., Moulahi, B., Azé, J., Bringay, S.: Unveiling online suicide behavior: What can we learn about mental health from suicide survivors of Reddit? In: MedInfo, pp. 50–54 (2019)
3. Cortes, C., Vapnik, V.: Support-vector networks. Mach. Learn. 20, 273–297 (1995)
4. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, L., Polosukhin, I.: Attention is all you need. In: Proceedings of the 31st International Conference on Neural Information Processing Systems, pp. 6000–6010. NIPS’17, Curran Associates Inc., Red Hook, NY, USA (2017)
5. Grant, R.N., Kucher, D., León, A.M., Gemmell, J.F., Raicu, D.S., Fodeh, S.J.: Automatic extraction of informal topics from online suicidal ideation. BMC Bioinform. 19, 57–66 (2018)