BriefGPT.xyz
Jul, 2022
Chinese Grammatical Error Correction Based on Knowledge Distillation
Peng Xia, Yuechi Zhou, Ziyan Zhang, Zecheng Tang, Juntao Li
TL;DR
Knowledge distillation is used to compress model parameters, and an attack test set is constructed to improve the model's resistance to attacks and its robustness on Chinese grammatical error correction. Experimental results show that the compressed small model retains good performance, trains faster, and achieves the best results with notable robustness.
Abstract
In view of the poor robustness of existing Chinese grammatical error correction models on attack test sets and their large parameter counts, this paper uses the method of knowledge distillation.
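The excerpt does not give the paper's exact training objective, but knowledge distillation for model compression is typically implemented as a KL-divergence loss between the temperature-softened output distributions of a large teacher model and a smaller student model. The sketch below is a minimal, generic NumPy illustration of that loss, not the authors' implementation; the temperature value and the `T^2` scaling follow the common Hinton-style formulation.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature softens
    # the distribution, exposing the teacher's "dark knowledge".
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence from the softened teacher distribution to the
    # softened student distribution, averaged over the batch and
    # scaled by T^2 so gradients keep a comparable magnitude.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return (temperature ** 2) * kl.mean()
```

In a full setup this distillation term is usually mixed with the ordinary cross-entropy loss on the gold corrections, so the student learns both from the data and from the teacher's softened predictions.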