TL;DR: We propose SALT, a simple yet effective method that combines code-switching with embedding-mixup self-augmentation to improve zero-shot cross-lingual transfer. It distills cross-lingual knowledge from multilingual pretrained language models and strengthens its transferability to downstream tasks, without requiring any external data.
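The embedding-mixup idea mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the Beta-distributed mixing coefficient, and the assumption that the two languages' token embeddings are already aligned to the same shape are all illustrative assumptions:

```python
import numpy as np

def embedding_mixup(src_emb, tgt_emb, alpha=0.2, rng=None):
    """Linearly interpolate aligned source- and target-language embeddings.

    src_emb, tgt_emb: (seq_len, dim) arrays of aligned token embeddings
                      (hypothetical inputs for illustration).
    alpha: Beta(alpha, alpha) parameter controlling interpolation strength.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)  # mixing coefficient in (0, 1)
    return lam * src_emb + (1.0 - lam) * tgt_emb

# Toy example: mix 4-token, 8-dimensional embeddings from two languages.
rng = np.random.default_rng(0)
src = rng.normal(size=(4, 8))
tgt = rng.normal(size=(4, 8))
mixed = embedding_mixup(src, tgt, alpha=0.2, rng=rng)
print(mixed.shape)  # (4, 8)
```

Each mixed embedding lies on the line segment between the two languages' embeddings, exposing the model to interpolated representations during fine-tuning.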
Abstract
Zero-shot cross-lingual transfer is a central task in multilingual NLP,
allowing models trained in languages with richer training resources to
generalize to low-resource languages. Earlier efforts