Aug, 2022
Prompt Tuning for Generative Multimodal Pretrained Models
Hao Yang, Junyang Lin, An Yang, Peng Wang, Chang Zhou, et al.
TL;DR
This paper explores applying prompt tuning to multimodal pretraining, using a unified sequence-to-sequence generative pretrained model to implement lightweight prompt tuning, and compares it with finetuning. Experiments find that prompt tuning improves robustness but also has some limitations, and the paper outlines directions for future research.
Abstract
Prompt tuning has become a new paradigm for model tuning, and it has demonstrated success in natural language pretraining and even vision pretraining. In this work, we explore the transfer of prompt tuning to multimodal pretraining, focusing on generative multimodal pretrained models.
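
To make the comparison with finetuning concrete, the sketch below shows the general idea of lightweight prompt tuning on a frozen sequence-to-sequence backbone: the pretrained weights stay fixed, and only a small set of soft prompt embeddings prepended to the encoder input is trained. This is a minimal illustration, not the paper's implementation; the class name `SoftPromptSeq2Seq`, the prompt length, and the input-level prompt placement are assumptions (the paper's method may insert prompts elsewhere, e.g., at each layer), and the backbone is assumed to follow the Hugging Face seq2seq interface.

```python
import torch
import torch.nn as nn

class SoftPromptSeq2Seq(nn.Module):
    """Hypothetical wrapper: freezes a pretrained seq2seq backbone and
    trains only a soft prompt prepended to the encoder input embeddings."""

    def __init__(self, backbone, prompt_length=64):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False  # the pretrained model stays frozen

        d_model = backbone.get_input_embeddings().embedding_dim
        # The only trainable parameters: prompt_length x d_model embeddings.
        self.prompt = nn.Parameter(0.02 * torch.randn(prompt_length, d_model))

    def forward(self, input_ids, attention_mask, labels=None):
        tok = self.backbone.get_input_embeddings()(input_ids)  # (B, T, d)
        B = tok.size(0)
        prompt = self.prompt.unsqueeze(0).expand(B, -1, -1)    # (B, P, d)
        inputs_embeds = torch.cat([prompt, tok], dim=1)
        # Extend the attention mask so the prompt positions are attended to.
        prompt_mask = attention_mask.new_ones(B, self.prompt.size(0))
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        return self.backbone(inputs_embeds=inputs_embeds,
                             attention_mask=mask, labels=labels)
```

Because only `self.prompt` requires gradients, the optimizer updates a few tens of thousands of parameters instead of the full model, which is what makes prompt tuning lightweight relative to finetuning.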