March 2020
What the [MASK]? Making Sense of Language-Specific BERT Models
Debora Nozza, Federico Bianchi, Dirk Hovy
TL;DR
This paper surveys BERT and its multilingual variant (mBERT) in natural language processing, comparing language-specific BERT models with mBERT and examining their differences and commonalities across architectures, data domains, and tasks; it offers readers an accessible overview together with an interactive companion website.
Abstract
Recently, natural language processing (NLP) has witnessed impressive progress in many areas, due to the advent of novel, pretrained contextual representation models. In particular, Devlin et al. (2019) proposed a model, called BERT (Bidirectional Encoder Representations from Transformers).
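
The contrast the paper draws between language-specific BERT models and multilingual BERT (mBERT) can be made concrete with a small sketch: both expose the same masked-language-model interface, so switching between them only requires changing the checkpoint name. The sketch below uses the Hugging Face transformers library with publicly available checkpoints chosen as illustrative examples; it is not part of the paper.

    # Sketch: the same fill-mask query against mBERT and a language-specific BERT.
    # Assumes the Hugging Face `transformers` library; the checkpoint names are
    # illustrative examples, not the specific models surveyed in the paper.
    from transformers import pipeline

    # Multilingual BERT (mBERT), pretrained on Wikipedia in ~100 languages.
    mbert = pipeline("fill-mask", model="bert-base-multilingual-cased")

    # A language-specific BERT, e.g. an Italian checkpoint.
    italian_bert = pipeline("fill-mask", model="dbmdz/bert-base-italian-cased")

    sentence = "Roma è la [MASK] d'Italia."  # "Rome is the [MASK] of Italy."

    for name, model in [("mBERT", mbert), ("Italian BERT", italian_bert)]:
        top = model(sentence)[0]  # highest-scoring completion
        print(f"{name}: {top['token_str']} (score={top['score']:.3f})")

Because the interface is identical, the practical question the survey addresses is which checkpoint to pick for a given language, domain, and task, rather than how to integrate it.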