Authors:
Huang Wenxin, Dong Xiao, Wang Meng-xiang, Liu Guangya, Yu Jianxing, Zhu Huaijie, Yin Jian
Publisher:
Springer Nature Singapore
References (22 articles):
1. Cai, D., Lam, W.: Graph transformer for graph-to-sequence learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 7464–7471 (2020)
2. Cheng, X., et al.: SpellGCN: incorporating phonological and visual similarities into language models for Chinese spelling check. arXiv preprint arXiv:2004.14166 (2020)
3. Cui, X., Zhang, B.: The principles for building the "International Corpus of Learner Chinese". Appl. Ling. 2, 100–108 (2011)
4. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
5. Hong, Y., Yu, X., He, N., Liu, N., Liu, J.: FASPell: a fast, adaptable, simple, powerful Chinese spell checker based on DAE-decoder paradigm. In: Proceedings of the 5th Workshop on Noisy User-generated Text, pp. 160–169 (2019)