May, 2023
On the Usage of Continual Learning for Out-of-Distribution Generalization in Pre-trained Language Models of Code
Martin Weyssow, Xin Zhou, Kisub Kim, David Lo, Houari Sahraoui
TL;DR: This paper applies five continual learning methods to mitigate catastrophic forgetting in pre-trained language models of code operating in dynamic software environments, achieving comparable or superior performance on two downstream tasks.
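Replay-based rehearsal is one common family of continual learning methods: when fine-tuning on new data, a small buffer of past examples is mixed into each training batch so the model does not drift entirely away from earlier distributions. As a rough illustration only (not the paper's code; `ReplayBuffer`, `continual_finetune`, and `train_step` are hypothetical names), a minimal rehearsal loop might look like:

```python
import random


class ReplayBuffer:
    """Fixed-size store of past training examples (reservoir sampling)."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.items = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(example)
        else:
            # Reservoir sampling: each seen example survives with
            # probability capacity / seen, keeping the buffer unbiased.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = example

    def sample(self, k):
        return self.rng.sample(self.items, min(k, len(self.items)))


def continual_finetune(train_step, task_stream, buffer, replay_ratio=1):
    """Interleave each new example with replayed old ones per gradient step."""
    for example in task_stream:
        batch = [example] + buffer.sample(replay_ratio)
        train_step(batch)  # user-supplied: one optimizer step on the mixed batch
        buffer.add(example)
```

In a real setting `train_step` would run a gradient update of the pre-trained code model, and `task_stream` would be the out-of-distribution fine-tuning data arriving over time.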