BriefGPT.xyz
Jan, 2024
Multilingual Instruction Tuning With Just a Pinch of Multilinguality
Uri Shaham, Jonathan Herzig, Roee Aharoni, Idan Szpektor, Reut Tsarfaty...
TL;DR
By studying the effect of multilingual instruction tuning on multilingual large language models, we find that cross-lingual transfer, as well as including multilingual examples in the instruction-tuning data, significantly improves multilingual instruction following.
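To make the data-mixing idea behind the TL;DR concrete, here is a minimal sketch (not the paper's released code) of replacing a small fraction of an otherwise English instruction-tuning set with multilingual examples before fine-tuning. The example record fields and the loader functions in the usage comment are hypothetical placeholders.

```python
import random

def mix_in_multilingual(english_examples, multilingual_examples, fraction=0.01, seed=0):
    """Return a tuning set where roughly `fraction` of examples are multilingual.

    Both inputs are lists of dicts, e.g. {"instruction": ..., "response": ..., "language": ...}.
    """
    rng = random.Random(seed)
    n_multi = max(1, int(len(english_examples) * fraction))
    sampled_multi = rng.sample(multilingual_examples, min(n_multi, len(multilingual_examples)))
    # Drop an equal number of English examples so the total set size stays fixed.
    kept_english = rng.sample(english_examples, len(english_examples) - len(sampled_multi))
    mixed = kept_english + sampled_multi
    rng.shuffle(mixed)
    return mixed

# Example: a 1% "pinch" of multilingual data in an instruction-tuning set.
# english_set = load_english_instructions()      # hypothetical loader
# multi_set = load_translated_instructions()     # hypothetical loader
# tuning_set = mix_in_multilingual(english_set, multi_set, fraction=0.01)
```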
Abstract
As instruction-tuned large language models (LLMs) gain global adoption, their ability to follow instructions in multiple languages becomes increasingly crucial. One promising approach is cross-lingual transfer, …