BriefGPT.xyz
Dec 2013
An Empirical Investigation of Catastrophic Forgetting in Gradient-Based Neural Networks
Ian J. Goodfellow, Mehdi Mirza, Xia Da, Aaron Courville, Yoshua Bengio
TL;DR
Investigates the "catastrophic forgetting" problem in modern neural networks trained sequentially on different tasks. The study finds that gradient-based training with dropout adapts best to the new task while retaining the old one, that the relationship between the two tasks strongly affects which activation function performs best, and therefore recommends choosing the activation function by cross-validation.
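The protocol behind this finding is to train on an "old" task, continue training on a "new" task, and then re-measure old-task accuracy. Below is a minimal sketch of that two-task experiment, not the authors' code: the network, synthetic task data, hyperparameters, and the `use_dropout` toggle are all illustrative assumptions, standing in for the paper's actual benchmarks (e.g. permuted-input tasks).

```python
# Minimal sketch of a sequential two-task forgetting experiment (illustrative only).
# Tasks are synthetic random data; `use_dropout` toggles the dropout variant the
# paper reports as most robust to forgetting.
import torch
import torch.nn as nn


def make_task(n=512, dim=20, classes=5, seed=0):
    # Hypothetical stand-in for a real task (e.g. a permuted-pixel dataset).
    g = torch.Generator().manual_seed(seed)
    x = torch.randn(n, dim, generator=g)
    y = torch.randint(0, classes, (n,), generator=g)
    return x, y


def build_net(dim=20, classes=5, use_dropout=True):
    layers = [nn.Linear(dim, 64), nn.ReLU()]
    if use_dropout:
        layers.append(nn.Dropout(p=0.5))
    layers.append(nn.Linear(64, classes))
    return nn.Sequential(*layers)


def train(net, x, y, epochs=20, lr=0.1):
    opt = torch.optim.SGD(net.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    net.train()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(net(x), y)
        loss.backward()
        opt.step()


def accuracy(net, x, y):
    net.eval()
    with torch.no_grad():
        return (net(x).argmax(dim=1) == y).float().mean().item()


old_x, old_y = make_task(seed=0)
new_x, new_y = make_task(seed=1)

net = build_net(use_dropout=True)
train(net, old_x, old_y)
acc_before = accuracy(net, old_x, old_y)   # old-task accuracy before the task switch
train(net, new_x, new_y)                   # continue training on the new task
acc_after = accuracy(net, old_x, old_y)    # how much of the old task is retained?

print(f"old-task accuracy: {acc_before:.3f} -> {acc_after:.3f} "
      f"(forgetting = {acc_before - acc_after:.3f})")
```

The gap between `acc_before` and `acc_after` is the forgetting measure; rerunning with `use_dropout=False` or a different activation function reproduces, in miniature, the kind of comparison the paper performs.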
Abstract
Catastrophic forgetting is a problem faced by many machine learning models and algorithms. When trained on one task, then trained on a second task, many machine learning models "forget" how to perform the first task. …