Nov, 2023
On the Analysis of Cross-Lingual Prompt Tuning for Decoder-based Multilingual Model
Nohil Park, Joonsuk Park, Kang Min Yoo, Sungroh Yoon
TL;DR
Comparing parameter-efficient fine-tuning with prompt tuning on a decoder-based multilingual model, prompt tuning achieves comparable or better performance across all languages and yields larger gains for low-resource languages; the authors link this effect to the multilingual model's tokenization scheme.
Abstract
An exciting advancement in the field of multilingual models is the emergence of autoregressive models with zero- and few-shot capabilities, a phenomenon widely reported in large-scale language models. To further …
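
To make the comparison in the TL;DR concrete, the snippet below is a minimal sketch of soft prompt tuning for a decoder-only language model. It assumes a Hugging Face causal LM; the model name "gpt2", the prompt length, and the toy training step are illustrative placeholders, not the paper's actual setup or hyperparameters.

```python
# Minimal sketch of soft prompt tuning for a decoder-only LM (assumption:
# a Hugging Face causal LM; "gpt2" stands in for the multilingual decoder
# studied in the paper). Only the prompt embeddings are trained; the
# backbone stays frozen.
import torch
from torch import nn
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder model, not the one used in the paper
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
for p in model.parameters():
    p.requires_grad = False  # freeze all backbone weights

n_prompt = 20  # number of soft prompt tokens (illustrative choice)
embed_dim = model.get_input_embeddings().embedding_dim
soft_prompt = nn.Parameter(torch.randn(n_prompt, embed_dim) * 0.02)

def forward_with_prompt(input_ids, labels):
    # Prepend the learnable prompt embeddings to the token embeddings.
    tok_embeds = model.get_input_embeddings()(input_ids)      # (B, T, D)
    batch = input_ids.size(0)
    prompt = soft_prompt.unsqueeze(0).expand(batch, -1, -1)   # (B, P, D)
    inputs_embeds = torch.cat([prompt, tok_embeds], dim=1)
    # Mask out the loss on prompt positions with the ignore index -100.
    prompt_labels = torch.full((batch, n_prompt), -100, dtype=labels.dtype)
    full_labels = torch.cat([prompt_labels, labels], dim=1)
    return model(inputs_embeds=inputs_embeds, labels=full_labels)

# Toy training step on a single example.
optimizer = torch.optim.AdamW([soft_prompt], lr=1e-3)
enc = tokenizer("The movie was great. Sentiment: positive", return_tensors="pt")
out = forward_with_prompt(enc.input_ids, enc.input_ids.clone())
out.loss.backward()
optimizer.step()
```

Because only the prompt matrix receives gradients, the number of trained parameters is a tiny fraction of the model, which is what makes the per-language comparison against other parameter-efficient fine-tuning methods meaningful.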