Affiliation
1. Singapore Management University, Singapore, Singapore
Abstract
In Chinese, Chengyu are fixed phrases consisting of four characters. As a type of idiom, their meanings usually cannot be derived from their component characters. In this article, we study the task of recommending a Chengyu given a textual context. Observing some limitations of existing work, we propose a two-stage model: in the first stage, we re-train a Chinese BERT model by masking out Chengyu from a large Chinese corpus with wide coverage of Chengyu; in the second stage, we fine-tune the re-trained, Chengyu-oriented BERT on a specific Chengyu recommendation dataset. We evaluate this method on the ChID and CCT datasets and find that it achieves the state of the art on both. Ablation studies show that both stages of training are critical to the performance gain.
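Below is a minimal sketch of the stage-one idea described above: masking out an entire Chengyu (all four characters) from the input and computing a masked-language-modeling loss only over the masked span. It assumes the HuggingFace transformers library, the character-level bert-base-chinese tokenizer, and a toy Chengyu lexicon; the function name mask_chengyu and the lexicon are illustrative, not taken from the paper.

```python
# Sketch of Chengyu-oriented whole-idiom masking for MLM re-training.
# Assumptions: HuggingFace `transformers`, character-level Chinese BERT
# tokenization, and a hypothetical toy lexicon `CHENGYU`.
import torch
from transformers import BertTokenizerFast, BertForMaskedLM

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = BertForMaskedLM.from_pretrained("bert-base-chinese")

# Toy lexicon; the paper uses a corpus with wide coverage of Chengyu.
CHENGYU = {"一帆风顺", "画蛇添足"}

def mask_chengyu(text: str):
    """Mask every character of each Chengyu occurrence and build MLM labels."""
    labels = tokenizer(text, return_tensors="pt")["input_ids"].clone()
    masked_text = text
    for idiom in CHENGYU:
        # One [MASK] per character, so sequence lengths stay aligned
        # under character-level tokenization.
        masked_text = masked_text.replace(idiom, tokenizer.mask_token * len(idiom))
    inputs = tokenizer(masked_text, return_tensors="pt")
    # Compute loss only on masked positions; -100 is ignored by the loss.
    labels[inputs["input_ids"] != tokenizer.mask_token_id] = -100
    return inputs, labels

inputs, labels = mask_chengyu("他的事业一帆风顺。")
loss = model(**inputs, labels=labels).loss  # MLM loss over the Chengyu span
```

In stage two, the re-trained model would then be fine-tuned on the recommendation dataset itself, e.g. as a cloze-style task of choosing the correct Chengyu for a blanked context; the exact fine-tuning setup follows the paper, not this sketch.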
Funder
National Research Foundation, Singapore
International Research Centres in Singapore Funding Initiative
Publisher
Association for Computing Machinery (ACM)
Cited by
6 articles.