BriefGPT.xyz
Mar, 2024
An Empirical Study of Parameter Efficient Fine-tuning on Vision-Language Pre-train Model
Yuxin Tian, Mouxing Yang, Yunfan Li, Dayiheng Liu, Xingzhang Ren...
TL;DR
Recent studies have applied parameter-efficient fine-tuning techniques (PEFTs) to efficiently narrow the performance gap between pre-training and downstream tasks. This study finds that for downstream fine-tuning tasks consistent with pre-training, data scale no longer affects performance, while the effect of tunable parameter scale is not monotonic; these observations can guide the choice of training strategy for PEFTs.
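To make the notion of "tunable parameter scale" concrete, below is a minimal sketch of one common PEFT method, a LoRA-style low-rank adapter. This is an illustration only: the abstract does not name which PEFT methods the paper studies, and all names and sizes here are hypothetical. A frozen pre-trained weight `W` is adapted by a trainable low-rank product `B @ A`, so only `r * (d_in + d_out)` parameters are tuned instead of `d_in * d_out`.

```python
import numpy as np

# Hypothetical LoRA-style sketch (not the paper's specific method).
rng = np.random.default_rng(0)
d_in, d_out, r = 64, 64, 4          # r is the low-rank adapter width

W = rng.standard_normal((d_out, d_in))     # frozen pre-trained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))                   # trainable; zero-init keeps W unchanged at start

def adapted_forward(x):
    """Forward pass with the low-rank update added to the frozen weight."""
    return (W + B @ A) @ x

x = rng.standard_normal(d_in)
# At initialization B = 0, so the adapted model matches the pre-trained one.
assert np.allclose(adapted_forward(x), W @ x)

full = W.size                # 64 * 64 = 4096 frozen parameters
tunable = A.size + B.size    # 4 * 64 + 64 * 4 = 512 trainable parameters
print(f"trainable fraction: {tunable / full:.3f}")  # → 0.125
```

Varying `r` changes the tunable parameter scale that the TL;DR refers to; the study's observation is that downstream performance does not grow monotonically with it.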
Abstract
Recent studies applied parameter-efficient fine-tuning techniques (PEFTs) to efficiently narrow the performance gap between pre-training and downstream. There are two important factors for various …