BriefGPT.xyz
Jul, 2022
How Do Multilingual Encoders Learn Cross-lingual Representation?
Shijie Wu
TL;DR
This work studies multilingual support in NLP systems, highlighting Multilingual BERT as a single alternative applicable to 104 languages. By analyzing the behavior of Multilingual BERT, its cross-lingual transfer, and the optimization behavior of these models, it aims to provide better cross-lingual models and a better understanding of cross-lingual transfer.
Abstract
NLP systems typically require support for more than one language. As different languages have different amounts of supervision, cross-lingual transfer benefits languages with little to no training data by transferring…