Author:
Zhou Wangchunshu, Ge Tao, Wei Furu, Zhou Ming, Xu Ke
Publisher:
Association for Computational Linguistics
Cited by:
4 articles.
1. Relaxed Attention for Transformer Models. 2023 International Joint Conference on Neural Networks (IJCNN), 2023-06-18.
2. LRTD: A Low-rank Transformer with Dynamic Depth and Width for Speech Recognition. 2022 International Joint Conference on Neural Networks (IJCNN), 2022-07-18.
3. DropDim: A Regularization Method for Transformer Networks. IEEE Signal Processing Letters, 2022.
4. Regularized Contrastive Learning of Semantic Search. Natural Language Processing and Chinese Computing, 2022.