Authors:
Wu Xiaobao, Nguyen Thong, Luu Anh Tuan
Abstract
Topic models have been prevalent for decades to discover latent topics and infer topic proportions of documents in an unsupervised fashion. They have been widely used in various applications like text analysis and context recommendation. Recently, the rise of neural networks has facilitated the emergence of a new research field—neural topic models (NTMs). Different from conventional topic models, NTMs directly optimize parameters without requiring model-specific derivations. This endows NTMs with better scalability and flexibility, resulting in significant research attention and plentiful new methods and applications. In this paper, we present a comprehensive survey on neural topic models concerning methods, applications, and challenges. Specifically, we systematically organize current NTM methods according to their network structures and introduce the NTMs for various scenarios like short texts and cross-lingual documents. We also discuss a wide range of popular applications built on NTMs. Finally, we highlight the challenges confronted by NTMs to inspire future research.
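To make the abstract's claim concrete—that NTMs "directly optimize parameters without requiring model-specific derivations"—the following is a minimal sketch, not taken from the survey, of a VAE-based neural topic model in the spirit of ProdLDA-style architectures the survey covers. The class name, layer sizes, learning rate, and toy bag-of-words data are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the survey's reference implementation) of a
# VAE-based neural topic model: an encoder maps bag-of-words counts to a latent
# posterior, a decoder reconstructs word probabilities from topic proportions,
# and everything is trained end-to-end by gradient descent on the ELBO.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VAENeuralTopicModel(nn.Module):
    def __init__(self, vocab_size: int, num_topics: int, hidden: int = 100):
        super().__init__()
        # Encoder: bag-of-words -> parameters of a Gaussian posterior over topics
        self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.Softplus())
        self.mu = nn.Linear(hidden, num_topics)
        self.logvar = nn.Linear(hidden, num_topics)
        # Decoder weight acts as the topic-word matrix
        self.beta = nn.Linear(num_topics, vocab_size, bias=False)

    def forward(self, bow):
        h = self.encoder(bow)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: sample latent variable, softmax to topic proportions
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        theta = F.softmax(z, dim=-1)                     # document-topic proportions
        recon = F.log_softmax(self.beta(theta), dim=-1)  # reconstructed word log-probs
        return recon, mu, logvar


def elbo_loss(bow, recon, mu, logvar):
    # Negative ELBO: reconstruction term + KL divergence to a standard normal prior
    nll = -(bow * recon).sum(-1).mean()
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
    return nll + kl


if __name__ == "__main__":
    model = VAENeuralTopicModel(vocab_size=2000, num_topics=20)
    optimizer = torch.optim.Adam(model.parameters(), lr=2e-3)
    bow = torch.randint(0, 3, (8, 2000)).float()  # toy bag-of-words batch
    for _ in range(10):
        optimizer.zero_grad()
        recon, mu, logvar = model(bow)
        loss = elbo_loss(bow, recon, mu, logvar)
        loss.backward()   # backpropagation replaces model-specific inference derivations
        optimizer.step()
```

In this sketch, swapping the prior, the encoder network, or the decoder only changes the modules and the loss term, not a hand-derived inference procedure—which is the scalability and flexibility advantage the abstract attributes to NTMs.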
Publisher
Springer Science and Business Media LLC
References
153 articles.
Cited by
6 articles.