BriefGPT.xyz
Mar, 2022
Match the Script, Adapt if Multilingual: Analyzing the Effect of Multilingual Pretraining on Cross-lingual Transferability
Yoshinari Fujinuma, Jordan Boyd-Graber, Katharina Kann
TL;DR
This study examines the zero-shot learning ability of pretrained language models under varying numbers of pretraining languages and degrees of language relatedness, and finds that, with model adaptation, increasing the number of pretraining languages improves model performance.
Abstract
Pretrained multilingual models enable zero-shot learning even for unseen languages, and performance can be further improved via adaptation prior to finetuning. However, it is unclear how the number of …