BriefGPT.xyz
Oct, 2023
Reset It and Forget It: Relearning Last-Layer Weights Improves Continual and Transfer Learning
Lapo Frati, Neil Traft, Jeff Clune, Nick Cheney
TL;DR
Through experiments, this work finds that a simple pre-training mechanism -- repeatedly resetting the weights of the last layer (dubbed "zapping") -- improves both transfer and continual learning performance. The mechanism applies across many domains and is computationally cheap and simple.
Abstract
This work identifies a simple pre-training mechanism that leads to representations exhibiting better continual and transfer learning. This mechanism -- the repeated resetting of weights in the last layer, which we dub "zapping" -- [abstract truncated]
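The "zapping" mechanism described above amounts to periodically re-initializing only the final layer while pre-training continues normally. The following is a minimal stdlib-only sketch of that idea, not the authors' implementation: the names `zap`, `pretrain`, `head`, and `zap_every` are illustrative assumptions, and the real gradient updates are elided.

```python
import random

def zap(layer_weights, scale=0.1):
    """Re-initialize ('zap') last-layer weights to fresh random values."""
    return [random.gauss(0.0, scale) for _ in layer_weights]

def pretrain(num_steps, zap_every, dim=4):
    # Hypothetical sketch: earlier layers would be trained normally here;
    # only `head` (the last layer) is repeatedly reset during pre-training.
    head = [random.gauss(0.0, 0.1) for _ in range(dim)]
    zap_count = 0
    for step in range(1, num_steps + 1):
        # ... an ordinary gradient update on all layers would go here ...
        if step % zap_every == 0:
            head = zap(head)   # the repeated resetting = "zapping"
            zap_count += 1
    return head, zap_count
```

The intuition, per the TL;DR, is that the backbone is forced to learn features that remain useful under a freshly re-learned output layer, which is the situation encountered in transfer and continual learning.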