BriefGPT.xyz
Apr, 2024
Parameter Efficient Fine-tuning of Self-supervised ViTs without Catastrophic Forgetting
Reza Akbarian Bafghi, Nidhin Harilal, Claire Monteleoni, Maziar Raissi
TL;DR
Artificial neural networks often suffer from catastrophic forgetting, a problem that is especially pronounced in Vision Transformers. We study two parameter-efficient fine-tuning strategies, Block Expansion and Low-Rank Adaptation (LoRA), and show that pre-trained ViTs fine-tuned with these strategies achieve better parameter efficiency in new domains while effectively mitigating catastrophic forgetting.
Abstract
Artificial neural networks often suffer from catastrophic forgetting, where learning new concepts leads to a complete loss of previously acquired knowledge. We observe that this issue is particularly magnified in Vision Transformers (ViTs).
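The TL;DR names Low-Rank Adaptation (LoRA) as one of the two strategies studied. As a rough illustration (not the paper's implementation, and with purely illustrative dimensions), LoRA freezes a pretrained weight matrix W and learns only a low-rank update B·A, scaled by alpha/r:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 64, 64, 4, 8  # hypothetical layer sizes and LoRA rank/scale

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable low-rank factor
B = np.zeros((d_out, r))                    # zero-initialized, so training starts from the pretrained model

def lora_forward(x):
    # y = W x + (alpha / r) * B A x ; only A and B receive gradient updates
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
y = lora_forward(x)
# with B = 0, the low-rank path contributes nothing: output matches the frozen model
assert np.allclose(y, W @ x)

# parameter efficiency: r*(d_in + d_out) trainable values instead of d_in*d_out
print(r * (d_in + d_out), "trainable vs", d_in * d_out, "full")
```

Because the frozen weights W are never modified, the original pretrained representation is preserved exactly, which is one intuition for why such adapters help against catastrophic forgetting.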