Oct 2020
Knowledge Distillation for BERT Unsupervised Domain Adaptation
Minho Ryu, Kichun Lee
TL;DR
Building on the BERT pre-trained language model, this work combines a domain adaptation framework with a knowledge distillation algorithm to propose a simple yet effective unsupervised domain adaptation method, called adversarial adaptation with distillation (AAD), which achieves state-of-the-art performance on cross-domain sentiment classification across 30 domain pairs.
Abstract
A pre-trained language model, BERT, has brought significant performance improvements across a range of natural language processing tasks. Since the model is trained on a large corpus of diverse topics, it shows robust performance for domain-shift problems in which data distributions at training (source data) and testing (target data) differ while sharing similarities. Despite its great improvements compared to previous models, it still suffers from performance degradation due to domain shifts. To mitigate such problems, we propose a simple but effective unsupervised domain adaptation method, adversarial adaptation with distillation (AAD), which combines the adversarial discriminative domain adaptation (ADDA) framework with knowledge distillation. We evaluate our approach in the task of cross-domain sentiment classification on 30 domain pairs, advancing the state-of-the-art performance for unsupervised domain adaptation in text sentiment classification.
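To make the recipe concrete, below is a minimal PyTorch sketch of the two losses that AAD combines: a temperature-scaled knowledge-distillation loss between a frozen source model (teacher) and the adapting target model (student), and an ADDA-style adversarial loss in which a discriminator separates source from target encoder features while the target encoder is trained to fool it. The feature dimension, hidden size, and temperature are illustrative assumptions, not the authors' released configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Temperature-softened KL divergence; the usual T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

class DomainDiscriminator(nn.Module):
    # Small MLP over sentence-level features (e.g., a BERT [CLS] vector);
    # 768/384 are assumed sizes for illustration.
    def __init__(self, feat_dim=768, hidden=384):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def adversarial_losses(disc, src_feat, tgt_feat):
    # ADDA-style objectives: the discriminator learns source=1 vs target=0;
    # the target encoder is then updated (in a separate optimizer step) with
    # inverted labels so its features become indistinguishable from source ones.
    ones = torch.ones(src_feat.size(0))
    zeros = torch.zeros(tgt_feat.size(0))
    d_loss = (
        F.binary_cross_entropy_with_logits(disc(src_feat.detach()), ones)
        + F.binary_cross_entropy_with_logits(disc(tgt_feat.detach()), zeros)
    )
    g_loss = F.binary_cross_entropy_with_logits(
        disc(tgt_feat), torch.ones(tgt_feat.size(0))
    )
    return d_loss, g_loss

In a training loop, d_loss would update only the discriminator, g_loss only the target encoder, and kd_loss would regularize the target model's predictions toward the frozen source model's softened outputs, which is what lets the method adapt without any target-domain labels.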