BriefGPT.xyz
Nov, 2020
Unsupervised Domain Adaptation of a Pretrained Cross-Lingual Language Model
Juntao Li, Ruidan He, Hai Ye, Hwee Tou Ng, Lidong Bing...
TL;DR
This paper proposes an unsupervised feature decomposition method that automatically extracts domain-specific and domain-invariant features. Using mutual information estimation, it decomposes cross-lingual representations into domain-invariant and domain-specific parts for the cross-lingual and cross-domain (CLCD) setting. Experimental results show that the proposed method achieves significant performance gains in the CLCD setting.
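The core idea above — splitting a representation so that one part carries domain information and the other does not, as measured by mutual information — can be illustrated with a toy sketch. This is not the authors' estimator (the paper uses neural mutual information estimation); the histogram-based MI, the synthetic "encoder output", and all variable names below are illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based MI estimate between two 1-D variables.
    A toy stand-in for the neural MI estimators used in the paper."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
domain = rng.integers(0, 2, size=2000).astype(float)  # two domains

# Hypothetical decomposed features: the "specific" part correlates
# with the domain label, the "invariant" part does not -- the state
# the decomposition objective is meant to reach.
specific = domain + 0.3 * rng.normal(size=2000)
invariant = rng.normal(size=2000)

mi_spec = mutual_information(specific, domain)
mi_inv = mutual_information(invariant, domain)
assert mi_spec > mi_inv  # specific part retains domain information
```

In the actual method the invariant part would be driven toward low MI with the domain (and the specific part toward high MI) via learned estimators; the histogram version here only serves to make the quantity concrete.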
Abstract
Recent research indicates that pretraining cross-lingual language models on large-scale unlabeled texts yields significant performance improvements on various cross-lingual and low-resource tasks. Through train…