June 2021
Syntax-augmented Multilingual BERT for Cross-lingual Transfer
Wasi Uddin Ahmad, Haoran Li, Kai-Wei Chang, Yashar Mehdad
TL;DR
This work shows that providing language syntax and training mBERT with an auxiliary objective to encode universal dependency tree structure improves cross-lingual transfer, yielding better performance on four NLP tasks.
Abstract
In recent years, we have seen a colossal effort in pre-training multilingual text encoders using large-scale corpora in many languages to facilitate cross-lingual transfer learning. However, due to typological differences across languages, …