Funder: National Natural Science Foundation of China
Publisher: Springer Science and Business Media LLC

References (25 articles)
1. Bahdanau D, Cho K, Bengio Y (2014) Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473
2. Bahuleyan H, El Asri L (2020) Diverse keyphrase generation with neural unlikelihood training. In: Proceedings of the 28th International Conference on Computational Linguistics, pp 5271–5287
3. Bao Y, Zhou H, Huang S, Wang D, Qian L, Dai X, Chen J, Li L (2022) latent-GLAT: glancing at latent variables for parallel text generation. In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp 8398–8409
4. Clark K, Luong M-T, Le QV, Manning CD (2019) ELECTRA: pre-training text encoders as discriminators rather than generators. In: International Conference on Learning Representations
5. Geng X, Feng X, Qin B (2021) Learning to rewrite for non-autoregressive neural machine translation. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pp 3297–3308