BriefGPT.xyz
Dec, 2020
Hypothesis Disparity Regularized Mutual Information Maximization
Qicheng Lao, Xiang Jiang, Mohammad Havaei
TL;DR
We propose a hypothesis disparity regularized mutual information maximization method to tackle unsupervised hypothesis transfer: multiple hypotheses transfer knowledge from the source domain without requiring access to source data during adaptation, achieving state-of-the-art UDA performance in the hypothesis transfer learning setting.
Abstract
We propose a hypothesis disparity regularized mutual information maximization (HDMI) approach to tackle unsupervised hypothesis transfer -- as an effort towards unifying hypothesis transfer learning (HTL) and …
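To make the two ingredients of the method concrete, here is a minimal NumPy sketch of (a) mutual information maximization over a classifier's soft predictions and (b) a hypothesis disparity term that penalizes disagreement among multiple hypotheses (classifier heads). All function names, the choice of KL divergence to the mean prediction as the disparity measure, and the weighting `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

EPS = 1e-8  # numerical floor to avoid log(0)

def mutual_info(probs):
    """I(X;Y) = H(Y) - H(Y|X) for one head's softmax outputs, shape (N, C)."""
    # conditional entropy H(Y|X): mean per-sample prediction entropy
    cond_ent = -np.mean(np.sum(probs * np.log(probs + EPS), axis=1))
    # marginal entropy H(Y): entropy of the average prediction
    marginal = probs.mean(axis=0)
    marg_ent = -np.sum(marginal * np.log(marginal + EPS))
    return marg_ent - cond_ent

def hypothesis_disparity(head_probs):
    """Average KL divergence of each head's predictions from the mean prediction.

    head_probs: list of (N, C) softmax outputs, one per hypothesis.
    """
    mean_p = np.mean(head_probs, axis=0)
    kls = [np.sum(p * (np.log(p + EPS) - np.log(mean_p + EPS)), axis=1).mean()
           for p in head_probs]
    return float(np.mean(kls))

def hdmi_objective(head_probs, lam=1.0):
    """Objective to maximize: average MI minus lam times hypothesis disparity."""
    mi = np.mean([mutual_info(p) for p in head_probs])
    return mi - lam * hypothesis_disparity(head_probs)
```

In this sketch, maximizing the objective pushes each head toward confident, class-balanced predictions on the target data (the MI term) while the disparity term keeps the hypotheses from drifting apart, which is how the method regularizes adaptation without source data.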