BriefGPT.xyz
Nov, 2023
Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization
Alexandra Chronopoulou, Jonas Pfeiffer, Joshua Maynez, Xinyi Wang, Sebastian Ruder...
TL;DR
By composing language and task parameters through element-wise arithmetic operations, we propose an improved zero-shot cross-lingual transfer method that achieves consistent gains on summarization while training only a minimal number of PEFT modules.
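The element-wise composition described above can be sketched as follows. This is a minimal illustration, assuming PEFT modules are stored as flat name-to-array dictionaries; the module names and toy values are hypothetical, not taken from the paper.

```python
import numpy as np

def compose(task_peft, lang_peft):
    # Element-wise addition of task and language PEFT parameters
    # (task arithmetic); the two modules must share the same keys.
    assert task_peft.keys() == lang_peft.keys()
    return {name: task_peft[name] + lang_peft[name] for name in task_peft}

# Hypothetical toy modules: a task adapter trained on labeled English
# summarization data, and a language adapter trained on unlabeled
# target-language text.
task_en = {"adapter.weight": np.array([[0.1, -0.2], [0.3, 0.0]])}
lang_xx = {"adapter.weight": np.array([[0.05, 0.1], [-0.1, 0.2]])}

# Combining them yields a module for zero-shot summarization in the
# target language, with no labeled target-language data.
zero_shot = compose(task_en, lang_xx)
print(zero_shot["adapter.weight"])
```

In practice the arrays would be the learned weights of adapter or LoRA layers inside an LLM, but the composition step itself is exactly this element-wise sum.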
Abstract
Parameter-efficient fine-tuning (PEFT) using labeled task data can significantly improve the performance of large language models (LLMs) on the downstream task. However, there are 7000 languages in the world and …