Publisher
Springer Nature Singapore
References (33 articles)
1. Xiao, Y., et al.: A survey on non-autoregressive generation for neural machine translation and beyond. IEEE Trans. Pattern Anal. Mach. Intell. 45(10), 11407–11427 (2023)
2. Vaswani, A., et al.: Attention is all you need. In: Advances in Neural Information Processing Systems, vol. 30 (2017)
3. Gu, J., Bradbury, J., Xiong, C., Li, V., Socher, R.: Non-autoregressive neural machine translation. In: International Conference on Learning Representations (ICLR) (2018)
4. Gu, J., Kong, X.: Fully non-autoregressive neural machine translation: tricks of the trade. In: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, pp. 120–133. Association for Computational Linguistics (2021)
5. Guo, J., Tan, X., Xu, L., Qin, T., Chen, E., Liu, T.: Fine-tuning by curriculum learning for non-autoregressive neural machine translation. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, no. 5, pp. 7839–7846 (2020)