BriefGPT.xyz
May, 2020
Are All Languages Created Equal in Multilingual BERT?
Shijie Wu, Mark Dredze
TL;DR
This paper studies how Multilingual BERT performs across a wide range of languages, focusing on the quality of its representations for low-resource languages. The results show a substantial gap between monolingual BERT and mBERT, and the key to closing it lies in more efficient pretraining techniques or more data.
Abstract
Multilingual BERT (mBERT) trained on 104 languages has shown surprisingly good cross-lingual performance on several NLP tasks, even without explicit cross-lingual signals. However, these evaluations have focused on cross-lingual transfer with high-resource languages, covering only a third of the languages covered by mBERT.
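For readers who want to probe these representations directly, the sketch below shows one way to obtain mBERT token embeddings with the Hugging Face transformers library and its public bert-base-multilingual-cased checkpoint; the model name and example sentence are illustrative assumptions, not the paper's evaluation pipeline.

import torch
from transformers import AutoModel, AutoTokenizer

# Load the public 104-language mBERT checkpoint (an assumption for this
# sketch; the paper's own training/evaluation setup is not reproduced here).
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")
model.eval()

# Illustrative sentence in Swahili, one of mBERT's lower-resource languages.
sentence = "Ninapenda lugha ya Kiswahili."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional contextual vector per WordPiece token; these are the
# kind of representations whose quality the paper measures on downstream tasks.
print(outputs.last_hidden_state.shape)

Running this for sentences in both high- and low-resource languages gives a quick sense of how the same shared model is used across languages, which is the setting the paper evaluates.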