Aug, 2023
On the Effectiveness of LayerNorm Tuning for Continual Learning in Vision Transformers
Thomas De Min, Massimiliano Mancini, Karteek Alahari, Xavier Alameda-Pineda, Elisa Ricci
TL;DR
By revisiting and extending a simple transfer learning idea, learning task-specific normalization layers, we reduce computational cost while maintaining competitive performance. In experiments on ImageNet-R and CIFAR-100, our method is computationally cheaper and performs on par with or better than the state of the art.
Abstract
State-of-the-art rehearsal-free continual learning methods exploit the peculiarities of vision transformers to learn task-specific prompts, drastically reducing catastrophic forgetting. However, there is a tradeoff …