Authors: Jiao Xiaoqi, Chang Huating, Yin Yichun, Shang Lifeng, Jiang Xin, Chen Xiao, Li Linlin, Wang Fang, Liu Qun
Funders: Fundamental Research Funds for the Central Universities; National Major Science and Technology Projects of China; National Natural Science Foundation of China
Subjects: Artificial Intelligence, Cognitive Neuroscience, Computer Science Applications
References (48 articles):
1. Bebis, G., 1997. Coupling weight elimination with genetic algorithms to reduce network size and preserve generalization. Neurocomputing.
2. Clark, K., 2020. ELECTRA: Pre-training text encoders as discriminators rather than generators. In: ICLR.
3. Cui, B., Li, Y., Chen, M., Zhang, Z., 2019a. Fine-tune BERT with sparse self-attention mechanism. In: EMNLP.
4. Cui, Y., Liu, T., Che, W., Xiao, L., Chen, Z., Ma, W., Wang, S., Hu, G., 2019b. A span-extraction dataset for Chinese machine reading comprehension. In: EMNLP.
5. Dehghani, M., Gouws, S., Vinyals, O., Uszkoreit, J., Kaiser, L., 2019. Universal Transformers. In: ICLR.
Cited by 5 articles.