BriefGPT.xyz
Feb, 2024
DoRA: Weight-Decomposed Low-Rank Adaptation
Shih-Yang Liu, Chien-Yi Wang, Hongxu Yin, Pavlo Molchanov, Yu-Chiang Frank Wang...
TL;DR
Through a weight-decomposition analysis and by focusing LoRA on directional updates, DoRA enhances learning capacity and training stability while retaining LoRA's low fine-tuning cost, outperforming LoRA on a range of downstream tasks including commonsense reasoning, visual instruction tuning, and image/video-text understanding.
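The decomposition described above can be sketched numerically. The following is a minimal illustration (not the authors' implementation; shapes and variable names are assumptions): the pretrained weight is split into a per-column magnitude vector and a directional matrix, the direction is adapted with a LoRA-style low-rank update, and the magnitude is learned directly.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 8, 16, 2                 # output dim, input dim, low rank (assumed)

W0 = rng.standard_normal((d_out, d_in))   # frozen pretrained weight
B = np.zeros((d_out, r))                  # LoRA "up" matrix, zero-initialized
A = rng.standard_normal((r, d_in))        # LoRA "down" matrix
m = np.linalg.norm(W0, axis=0)            # trainable per-column magnitude

# Adapt the direction with the low-rank update, renormalize each column,
# then rescale by the learned magnitude.
V = W0 + B @ A
W_adapted = m * (V / np.linalg.norm(V, axis=0))

# With B zero-initialized, the adapted weight reproduces W0 exactly.
assert np.allclose(W_adapted, W0)
```

Because `B` starts at zero, training begins from the pretrained weight, matching LoRA's initialization convention; the difference is that magnitude and direction are then updated separately.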
Abstract
Among the widely used parameter-efficient finetuning (PEFT) methods, LoRA and its variants have gained considerable popularity because of avoiding additional inference costs. However, there still often exists an