Sep, 2018
Multi-Source Domain Adaptation with Mixture of Experts
Jiang Guo, Darsh J Shah, Regina Barzilay
TL;DR
This paper proposes a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The method explicitly captures the relationship between a target example and the different source domains, expressed as a point-to-set metric, and learns this metric in an unsupervised fashion via meta-training. Experiments on sentiment analysis and part-of-speech tagging show that the approach consistently outperforms multiple baselines and robustly handles negative transfer.
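The combination step described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the point-to-set metric here is assumed to be a simple mean Euclidean distance in some encoded feature space, and the expert models, domain encodings, and function names are all hypothetical stand-ins.

```python
import numpy as np

def point_to_set_distance(x, domain_examples):
    """Distance from a target example x to one source domain, taken here
    as the mean Euclidean distance to that domain's encoded examples
    (one simple instance of a point-to-set metric; the paper learns
    its metric via meta-training instead)."""
    return np.mean(np.linalg.norm(domain_examples - x, axis=1))

def mixture_of_experts_predict(x, experts, source_domains):
    """Combine per-domain expert predictions, weighting each expert by
    softmax(-distance) so experts from closer domains contribute more."""
    dists = np.array([point_to_set_distance(x, d) for d in source_domains])
    weights = np.exp(-dists) / np.exp(-dists).sum()   # softmax over -distance
    preds = np.stack([expert(x) for expert in experts])  # (n_domains, n_classes)
    return weights @ preds  # weighted average of expert outputs
```

Because the weights depend on the target example, a source domain that is far from the target contributes little to the prediction, which is how this kind of mixture can dampen negative transfer.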
Abstract
We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to-set metric, determines how to combine predictors trained on the various domains.