Authors:
Bruno Guilherme Gomes, Fabrício Murai, Olga Goussevskaia, Ana Paula Couto da Silva
Publisher:
Springer International Publishing
References: 25 articles.
1. Conneau, A., et al.: Unsupervised cross-lingual representation learning at scale. In: ACL (2020)
2. Dahl, G.E., Adams, R.P., Larochelle, H.: Training restricted Boltzmann machines on word observations. In: ICML (2012)
3. Davies, D.L., Bouldin, D.W.: A cluster separation measure. IEEE Trans. Pattern Anal. Mach. Intell. PAMI–1(2), 224–227 (1979). https://doi.org/10.1109/TPAMI.1979.4766909
4. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: NAACL-HLT (2019)
5. Ding, R., Nallapati, R., Xiang, B.: Coherence-aware neural topic modeling. In: EMNLP (2018)