June 2024
Concentrate Attention: Towards Domain-Generalizable Prompt Optimization for Language Models
Chengzhengxu Li, Xiaoming Liu, Zhaohan Zhang, Yichen Wang, Chen Liu...
TL;DR
We propose "Concentration", a domain-generalizable prompt optimization objective that increases the attention strength on the prompt and reduces fluctuation in the attention distribution. It improves the accuracy of both soft-prompt and hard-prompt generalization optimization methods in multi-source domain generalization settings while maintaining satisfactory in-domain performance.
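To make the idea concrete, below is a minimal sketch of what a concentration-style regularizer could look like, assuming attention maps come from a Hugging Face-style transformer run with `output_attentions=True` and that the prompt tokens sit at the start of the sequence. The function name `concentration_loss`, the `alpha` weight, and the layer-wise variance penalty are illustrative assumptions, not the authors' implementation.

```python
import torch


def concentration_loss(attentions, prompt_len, alpha=1.0):
    """Hypothetical concentration-style objective (sketch, not the paper's code).

    attentions: tuple of per-layer attention maps, each [batch, heads, seq, seq]
                (e.g. model(..., output_attentions=True).attentions).
    prompt_len: number of prompt tokens at the start of the sequence (assumed).
    alpha:      weight on the fluctuation penalty (assumed hyperparameter).
    """
    per_layer = []
    for attn in attentions:
        # "Attention strength": mass that non-prompt query positions
        # place on the prompt key positions.
        to_prompt = attn[:, :, prompt_len:, :prompt_len].sum(dim=-1)  # [B, H, T]
        per_layer.append(to_prompt.mean())
    strength = torch.stack(per_layer)  # one scalar per layer

    # Maximize mean attention strength; penalize its fluctuation across layers.
    return -strength.mean() + alpha * strength.var()
```

In use, such a term would presumably be added to the task loss when tuning a soft prompt (or scoring hard-prompt candidates), so that prompts which draw strong, stable attention are preferred.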
Abstract
Recent advances in prompt optimization have notably enhanced the performance of pre-trained language models (PLMs) on downstream tasks. However, the potential of optimized prompts on …