1. Bahuleyan, H., Mou, L., Vechtomova, O., Poupart, P., 2018. Variational attention for sequence-to-sequence models. In: Proceedings of the 27th International Conference on Computational Linguistics. COLING-2018, ISBN: 9781948087506, pp. 1672–1682.
2. Bowman, S.R., Vilnis, L., Vinyals, O., Dai, A.M., Jozefowicz, R., Bengio, S., 2016. Generating sentences from a continuous space. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning. CoNLL-2016, ISBN: 9781945626197, pp. 10–21.
3. Brown, T.B., Mann, B., Ryder, N., Subbiah, M., Kaplan, J., Dhariwal, P., Neelakantan, A., Shyam, P., Sastry, G., Askell, A., Agarwal, S., Herbert-Voss, A., Krueger, G., Henighan, T., Child, R., Ramesh, A., Ziegler, D.M., Wu, J., Winter, C., Hesse, C., Chen, M., Sigler, E., Litwin, M., Gray, S., Chess, B., Clark, J., Berner, C., McCandlish, S., Radford, A., Sutskever, I., Amodei, D., 2020. Language models are few-shot learners. In: Proceedings of Advances in Neural Information Processing Systems. NIPS-2020, pp. 1877–1901.
4. Calvo, R.A., Mac Kim, S., 2013. Emotions in text: Dimensional and categorical models. Comput. Intell. 29 (3), 527–543.
5. Chen, X., Duan, Y., Houthooft, R., Schulman, J., Sutskever, I., Abbeel, P., 2016. InfoGAN: Interpretable representation learning by information maximizing generative adversarial nets. In: Proceedings of Advances in Neural Information Processing Systems. NIPS-2016, pp. 2180–2188.