BriefGPT.xyz
Sep, 2023
Self-Augmentation Improves Zero-Shot Cross-Lingual Transfer
Fei Wang, Kuan-Hao Huang, Kai-Wei Chang, Muhao Chen
TL;DR
Proposes a simple yet effective method called SALT, which combines code-switched and embedding-mixup self-augmentation to draw cross-lingual knowledge out of a multilingual pretrained language model and strengthen its transferability to downstream tasks, improving zero-shot cross-lingual transfer without requiring any external data.
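The two augmentation ideas named in the TL;DR can be illustrated with a minimal sketch. This is not the paper's implementation: the toy bilingual dictionary, the substitution probability `p`, and the mixing weight `lam` are all illustrative assumptions (SALT derives its augmented views from the multilingual PLM itself, without external resources).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dictionary for illustration only; SALT does not
# rely on an external bilingual lexicon.
en2es = {"good": "bueno", "movie": "película", "very": "muy"}

def code_switch(tokens, table, p=0.5):
    """Randomly replace source-language tokens with target-language ones."""
    return [table[t] if t in table and rng.random() < p else t
            for t in tokens]

def embedding_mixup(e_src, e_tgt, lam=0.7):
    """Convex combination of source and target token embeddings."""
    return lam * e_src + (1.0 - lam) * e_tgt

tokens = ["a", "very", "good", "movie"]
mixed = code_switch(tokens, en2es, p=1.0)
print(mixed)  # ['a', 'muy', 'bueno', 'película']

# Pretend 4-dimensional embeddings for one aligned token position.
e_src = rng.normal(size=4)
e_tgt = rng.normal(size=4)
e_mix = embedding_mixup(e_src, e_tgt)
```

Code-switching perturbs the surface form of the training sentence, while embedding mixup interpolates in representation space; both expose the model to cross-lingual views of the same example during fine-tuning.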
Abstract
Zero-shot cross-lingual transfer is a central task in multilingual NLP, allowing models trained in languages with richer training resources to generalize to other, low-resource languages. Earlier efforts