Jun, 2019
How multilingual is Multilingual BERT?
Telmo Pires, Eva Schlinger, Dan Garrette
TL;DR
This work shows that Multilingual BERT (M-BERT) performs surprisingly well at zero-shot cross-lingual model transfer. Extensive probing experiments demonstrate that transfer succeeds even between languages written in different scripts, but that it exhibits systematic deficiencies for certain language pairs.
Abstract
In this paper, we show that Multilingual BERT (M-BERT), released by Devlin et al. (2018) as a single language model pre-trained from monolingual corpora in 104 languages, is surprisingly good at zero-shot cross-lingual model transfer, in which task-specific annotations in one language are used to fine-tune the model for evaluation in another language.
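
To make the transfer setup concrete, here is a minimal sketch of the zero-shot recipe, assuming the Hugging Face transformers library and its bert-base-multilingual-cased checkpoint (both are assumptions of this sketch, not artifacts of the paper, whose actual experiments use NER and POS tagging rather than the sentence classification shown here): fine-tune M-BERT on labeled data in one language, then evaluate it directly on another language with no target-language annotations.

from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

# Load the 104-language M-BERT checkpoint with a fresh classification head.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=2
)

# ... fine-tune `model` here on labeled ENGLISH examples only,
# using any standard training loop ...

# Zero-shot evaluation: the English-fine-tuned model is applied
# directly to SPANISH input; no Spanish labels are ever seen.
batch = tokenizer("Me encanta este libro.", return_tensors="pt")
with torch.no_grad():
    prediction = model(**batch).logits.argmax(dim=-1)
print(prediction)

Because M-BERT shares a single WordPiece vocabulary and parameter set across all 104 pre-training languages, nothing in this pipeline changes between the fine-tuning language and the evaluation language, which is what makes the zero-shot transfer the paper measures possible at all.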