BriefGPT.xyz
Oct, 2021
When is BERT Multilingual? Isolating Crucial Ingredients for Cross-lingual Transfer
Ameet Deshpande, Partha Talukdar, Karthik Narasimhan
TL;DR
This paper presents a large-scale empirical study of multilingual language models and finds that the degree of word-embedding alignment is closely correlated with zero-shot transfer performance, suggesting that multilingual models should explicitly improve word-embedding alignment.
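One common way to quantify the word-embedding alignment the TL;DR refers to is to map one language's embeddings onto another's with an orthogonal (Procrustes) transform and average the cosine similarity over translation pairs. The sketch below is illustrative only, not the paper's actual metric; the function name and the synthetic data are assumptions.

```python
import numpy as np

def alignment_score(src, tgt):
    """Hypothetical alignment metric: mean cosine similarity between
    row-aligned translation pairs after an orthogonal (Procrustes)
    map from the source to the target embedding space."""
    # Orthogonal Procrustes: W = argmin ||src @ W - tgt||_F over orthogonal W,
    # solved in closed form via the SVD of src^T tgt.
    u, _, vt = np.linalg.svd(src.T @ tgt)
    mapped = src @ (u @ vt)
    # Per-pair cosine similarity, then the average over all pairs.
    num = (mapped * tgt).sum(axis=1)
    den = np.linalg.norm(mapped, axis=1) * np.linalg.norm(tgt, axis=1)
    return float((num / den).mean())

# Toy check: if the target space is an exact rotation of the source,
# the orthogonal map recovers it and alignment is perfect.
rng = np.random.default_rng(0)
src = rng.standard_normal((50, 16))
q, _ = np.linalg.qr(rng.standard_normal((16, 16)))  # random rotation
tgt = src @ q
print(round(alignment_score(src, tgt), 4))  # → 1.0
```

A score near 1.0 means the two embedding spaces differ only by a rotation, i.e. they are highly aligned; lower scores indicate structural divergence that, per the paper's finding, tends to hurt zero-shot transfer.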
Abstract
While recent work on multilingual language models has demonstrated their capacity for cross-lingual zero-shot transfer on downstream tasks, there is a lack of consensus in the community as to what shared properties between languages enable such transfer.