BriefGPT.xyz
Mar, 2022
Overcoming Catastrophic Forgetting beyond Continual Learning: Balanced Training for Neural Machine Translation
Chenze Shao, Yang Feng
TL;DR
To address the problem that neural networks gradually forget previously learned knowledge when learning multiple tasks sequentially from dynamic data distributions, this work proposes Complementary Online Knowledge Distillation (COKD), which successfully alleviates the imbalanced training problem and achieves substantial improvements on multiple machine translation tasks.
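The summary names Complementary Online Knowledge Distillation (COKD) but does not spell out its training objective. As a rough illustration only, the sketch below shows a generic online knowledge-distillation loss for a sequence model: cross-entropy on the gold tokens plus a KL term toward a teacher's soft predictions. The function name `distillation_loss` and the hyperparameters `alpha`, `temperature`, and `pad_id` are illustrative assumptions, not the paper's exact formulation; COKD's complementary teachers and online update schedule are not reproduced here.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      alpha=0.5, temperature=1.0, pad_id=0):
    """Generic online-KD objective (illustrative, not the paper's COKD):
    cross-entropy on gold tokens plus a KL term pulling the student
    toward the (detached) teacher distribution.

    student_logits, teacher_logits: (batch, seq_len, vocab)
    targets: (batch, seq_len) gold token ids
    """
    vocab = student_logits.size(-1)
    # Standard translation loss against the reference tokens, ignoring padding.
    ce = F.cross_entropy(student_logits.view(-1, vocab), targets.view(-1),
                         ignore_index=pad_id)
    # Soft targets from the teacher; detach so gradients only reach the student.
    teacher_probs = F.softmax(teacher_logits.detach() / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(student_log_probs.view(-1, vocab),
                  teacher_probs.view(-1, vocab),
                  reduction="batchmean")
    # Interpolate between the supervised loss and the distillation loss.
    return (1.0 - alpha) * ce + alpha * kd
```

In an online setup, the teacher logits would come from a model that is itself being updated during training rather than from a fixed pretrained checkpoint; how the teachers are constructed and rotated is specific to COKD and not shown here.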
Abstract
Neural networks tend to gradually forget the previously learned knowledge when learning multiple tasks sequentially from dynamic data distributions. This problem is called catastrophic forgetting, which …