BriefGPT.xyz
Jun, 2023
Focused Prefix Tuning for Controllable Text Generation
Congda Ma, Tianyu Zhao, Makoto Shing, Kei Sawada, Manabu Okumura
TL;DR
This paper proposes Focused Prefix Tuning (FPT), a method for suppressing irrelevant signals in controllable text generation datasets to improve model performance. Experimental results show that on single-attribute control tasks FPT achieves better control accuracy and text fluency than baseline models, and on multi-attribute control tasks it matches the control accuracy of existing methods while retaining flexibility.
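The core idea of contrasting an attribute-specific prefix against a "general" prefix so that dataset-wide, attribute-irrelevant signals cancel out can be sketched with simple logit arithmetic. This is a minimal toy illustration, not the paper's implementation: the function name `focused_logits`, the scaling factor `alpha`, and the exact combination formula are assumptions for this sketch (the paper's actual combination of the two prefix models' outputs may differ).

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def focused_logits(attr_logits, general_logits, alpha=1.0):
    # Toy sketch of the FPT intuition: amplify what the attribute-specific
    # prefix adds on top of the general prefix, so signals shared by both
    # (i.e., attribute-irrelevant ones) are de-emphasized.
    # `alpha` (an assumed hyperparameter) controls the contrast strength.
    return attr_logits + alpha * (attr_logits - general_logits)

# Tiny 3-token vocabulary example: token 1 is boosted only under the
# attribute-specific prefix, so focusing should sharpen its probability.
general = np.array([2.0, 1.0, 0.0])
attr = np.array([2.0, 2.0, 0.0])
p_attr = softmax(attr)
p_focused = softmax(focused_logits(attr, general, alpha=1.0))
```

Here `p_focused` assigns token 1 a higher probability than `p_attr` does, since the contrast with the general prefix isolates the attribute-specific boost.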
Abstract
In a controllable text generation dataset, there exist unannotated attributes that could provide irrelevant learning signals to models that use it for training and thus degrade their performance. We propose →