Oct, 2020
Does Chinese BERT Encode Word Structure?
Yile Wang, Leyang Cui, Yue Zhang
TL;DR
This paper investigates Chinese BERT through attention-weight distribution statistics and probing tasks. It finds that the model does capture word information, that word-level features are concentrated mainly in the middle representation layers, and that downstream tasks such as text understanding exploit these word features in different ways.
Abstract
Contextualized representations give significantly improved results for a wide range of NLP tasks. Much work has been dedicated to analyzing the features captured by representative models such as