Authors:
Sasipim Srivallapanondh, Pedro J. Freire, Bernhard Spinnler, Nelson Costa, Antonio Napoli, Sergei K. Turitsyn, Jaroslaw E. Prilepsky
Abstract
To circumvent the non-parallelizability of recurrent neural network-based equalizers, we propose knowledge distillation to recast the RNN into a parallelizable feed-forward structure. The latter achieves a 38% latency reduction while degrading the Q-factor by only 0.5 dB.
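A minimal sketch of the distillation idea described in the abstract, assuming a PyTorch setting: a recurrent teacher equalizer produces soft targets, and a feed-forward student is trained to regress onto them. The biLSTM teacher, MLP student, window length, layer widths, and synthetic data below are illustrative placeholders, not the authors' actual architecture or training setup.

```python
import torch
import torch.nn as nn

# Hypothetical teacher: a recurrent (biLSTM) equalizer, sequential by nature.
class TeacherRNN(nn.Module):
    def __init__(self, n_taps=21, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(2, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 2)  # recover I/Q of the centre symbol

    def forward(self, x):                      # x: (batch, n_taps, 2)
        h, _ = self.lstm(x)
        return self.head(h[:, h.size(1) // 2])  # centre-tap output

# Hypothetical student: a feed-forward MLP, fully parallelizable in hardware.
class StudentFF(nn.Module):
    def __init__(self, n_taps=21, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_taps * 2, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, x):
        return self.net(x)

teacher, student = TeacherRNN(), StudentFF()
teacher.eval()  # teacher is frozen; only the student is trained
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
mse = nn.MSELoss()

for step in range(1000):
    x = torch.randn(256, 21, 2)            # stand-in for received symbol windows
    with torch.no_grad():
        soft_target = teacher(x)            # teacher's equalized output
    loss = mse(student(x), soft_target)     # distillation: match the teacher
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the student has no recurrent connections, every output symbol can be computed independently from its input window, which is what enables the latency reduction reported in the abstract.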
Cited by
4 articles.