April 2019
Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT
Shijie Wu, Mark Dredze
TL;DR
This study explores mBERT as a zero-shot cross-lingual transfer model on five tasks: NLI, document classification, NER, POS tagging, and dependency parsing. It finds that mBERT is competitive on every task, and examines strategies for using it, the extent to which its features are language-agnostic, and the factors that affect cross-lingual transfer.
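To make the zero-shot transfer setup concrete, below is a minimal sketch using the HuggingFace Transformers library and the public bert-base-multilingual-cased checkpoint (an assumption for illustration; the paper predates this API and used the original BERT codebase). The model is fine-tuned on English NLI pairs only, then evaluated directly on another language with no target-language training data. The toy English training pair and the Spanish test pair are hypothetical stand-ins for the corpora the paper evaluates on.

# Minimal sketch of zero-shot cross-lingual transfer with mBERT.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL = "bert-base-multilingual-cased"  # the 104-language mBERT release
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=3)

# English premise/hypothesis pairs with NLI labels (hypothetical toy data).
train_pairs = [("A man is playing guitar.", "A person makes music.", 0)]

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for premise, hypothesis, label in train_pairs:
    batch = tokenizer(premise, hypothesis, return_tensors="pt")
    loss = model(**batch, labels=torch.tensor([label])).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Zero-shot evaluation on a Spanish pair: no Spanish fine-tuning occurred,
# so any transfer comes from mBERT's shared multilingual representations.
model.eval()
with torch.no_grad():
    batch = tokenizer("Un hombre toca la guitarra.",
                      "Una persona hace música.", return_tensors="pt")
    pred = model(**batch).logits.argmax(dim=-1).item()
print(pred)  # predicted NLI label for the Spanish pair

The same recipe carries over to the paper's token-level tasks (NER, POS tagging, dependency parsing) by swapping in a token-classification head: train on English annotations, evaluate directly on the target language.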
Abstract
Pretrained contextual representation models (Peters et al., 2018; Devlin et al., 2018) have pushed forward the state-of-the-art on many NLP tasks. A new release of BERT (Devlin, 2018) includes a model simultaneously pretrained on 104 languages with impressive performance for zero-shot cross-lingual transfer.