BriefGPT.xyz
Apr, 2022
The Impact of Cross-Lingual Adjustment of Contextual Word Representations on Zero-Shot Transfer
Pavel Efimov, Leonid Boytsov, Elena Arslanova, Pavel Braslavski
TL;DR
This study performs zero-shot transfer from English using a pre-trained mBERT model and applies a cross-lingual adjustment procedure based on a small parallel corpus to improve performance. The results show that cross-lingual adjustment benefits NLP task performance across languages and brings the embeddings of semantically related words in the two languages closer together.
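The adjustment idea summarized above can be sketched in a few lines: given pairs of contextual embeddings for aligned words from a parallel corpus, nudge the target-language vectors toward their source-language counterparts with a simple alignment loss. This is a minimal illustrative sketch, not the authors' exact procedure; the toy random vectors and the plain MSE loss with manual gradient steps are assumptions for demonstration.

```python
import numpy as np

def adjustment_step(src, tgt, lr=0.1):
    # One gradient-descent step on a mean-squared alignment loss,
    # pulling target-language embeddings toward their aligned
    # source-language counterparts (hypothetical simplification).
    grad = 2.0 * (tgt - src) / len(tgt)
    return tgt - lr * grad

rng = np.random.default_rng(0)
src = rng.normal(size=(5, 8))  # toy "English" contextual embeddings
tgt = rng.normal(size=(5, 8))  # toy aligned embeddings in another language

before = np.linalg.norm(src - tgt)
for _ in range(50):
    tgt = adjustment_step(src, tgt)
after = np.linalg.norm(src - tgt)
print(after < before)  # aligned pairs end up closer than they started
```

In the paper's setting, `src` and `tgt` would come from mBERT representations of word-aligned sentence pairs rather than random vectors, but the mechanism, reducing the distance between embeddings of translation pairs, is the same.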
Abstract
Large pre-trained multilingual models such as mBERT and XLM-R enabled effective cross-lingual zero-shot transfer in many NLP tasks. A cross-lingu…