BriefGPT.xyz
Jul, 2024
LoRA-Pro: Are Low-Rank Adapters Properly Optimized?
Zhengbo Wang, Jian Liang
TL;DR
This work addresses the performance gap between low-rank adaptation (LoRA) and full fine-tuning. By introducing the novel concept of an "equivalent gradient," the paper refines LoRA's optimization process to bring it closer to that of full fine-tuning. Experimental results show that the method effectively narrows the performance gap between LoRA and full fine-tuning.
Abstract
Low-Rank Adaptation, also known as LoRA, has emerged as a prominent method for parameter-efficient fine-tuning of foundation models by re-parameterizing the original matrix into the product of two low-rank matrices.
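The re-parameterization the abstract describes can be sketched in a few lines: the pre-trained weight W0 is frozen, and only two low-rank factors A and B are trained, so the effective weight is W0 + BA. This is a minimal illustrative sketch (shapes, rank, and initialization are assumptions chosen for clarity, not taken from the paper):

```python
# Minimal sketch of the LoRA re-parameterization: the frozen weight W0
# is augmented with a low-rank update B @ A, and only A and B would be
# trained. Dimensions and rank below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r = 8, 8, 2                   # rank r << min(d_out, d_in)

W0 = rng.standard_normal((d_out, d_in))    # frozen pre-trained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))                   # zero init: model starts at W0

def lora_forward(x):
    # Effective weight is W0 + B @ A; only A and B receive gradients.
    return (W0 + B @ A) @ x

x = rng.standard_normal(d_in)
# With B initialized to zero, LoRA output equals the base model output.
assert np.allclose(lora_forward(x), W0 @ x)
```

The zero initialization of B is the standard choice: it guarantees the adapted model reproduces the base model exactly at the start of training, while the B @ A product adds only r * (d_in + d_out) trainable parameters instead of d_in * d_out.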