Dec, 2022
Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection
Sasipim Srivallapanondh, Pedro J. Freire, Bernhard Spinnler, Nelson Costa, Antonio Napoli...
TL;DR
Knowledge distillation is used to recast a recurrent neural network into a parallelizable feedforward structure, overcoming the RNN's parallelization limits: latency is significantly reduced at the cost of only a 0.5 dB Q-factor penalty.
Abstract
To circumvent the non-parallelizability of recurrent neural network-based equalizers, we propose knowledge distillation to recast the RNN into a parallelizable feedforward structure…
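The core idea — training a parallelizable feedforward student to mimic a sequential recurrent teacher — can be illustrated with a minimal sketch. This is not the paper's implementation; the tiny one-state teacher, the window length `K`, and the linear least-squares student are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "teacher": a one-state recurrent filter. Its hidden state
# forces strictly sequential evaluation, which is the bottleneck the
# paper's distillation approach removes.
def teacher_rnn(x, w_in=0.9, w_rec=0.5):
    h, out = 0.0, []
    for xt in x:
        h = np.tanh(w_in * xt + w_rec * h)  # recurrent state update
        out.append(h)
    return np.array(out)

# Distillation data: the student is fit to the teacher's outputs,
# not to ground-truth labels.
x = rng.standard_normal(5000)
y_teacher = teacher_rnn(x)

# Feedforward "student": a linear filter over a window of K past inputs.
# Every row of X (one time step) is independent, so inference parallelizes.
K = 8
X = np.stack(
    [np.concatenate([np.zeros(k), x])[: len(x)] for k in range(K)], axis=1
)
w_student, *_ = np.linalg.lstsq(X, y_teacher, rcond=None)

y_student = X @ w_student  # all time steps computed at once
mse = np.mean((y_student - y_teacher) ** 2)
print(f"distillation MSE: {mse:.4f}")
```

The student trades the teacher's unbounded recurrent memory for a fixed receptive field of `K` samples, which is what makes batched, latency-friendly evaluation possible.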