BriefGPT.xyz
May, 2020
Information-theoretic analysis for transfer learning
Xuetong Wu, Jonathan H. Manton, Uwe Aickelin, Jingge Zhu
TL;DR
This paper studies generalization error and excess risk in transfer learning and proposes an information-theoretic analysis of both. The results show that the Kullback-Leibler divergence between the source and target domains characterizes the generalization error well in certain settings, and the analysis is extended to a specific empirical risk minimization (ERM) algorithm. The approach also has potential applications to iterative, noisy gradient descent algorithms.
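To make the role of the KL divergence concrete, the sketch below computes D(μ′‖μ) for two one-dimensional Gaussian domains and plugs it into a generic square-root-type generalization bound. This is an illustrative sketch only: the function names, the Gaussian domain model, and the exact form of the bound are assumptions for exposition, not the paper's actual result.

```python
import numpy as np

def kl_gaussian(mu0, var0, mu1, var1):
    """KL divergence D(N(mu0, var0) || N(mu1, var1)) between two 1-D Gaussians
    (closed form; variances must be positive)."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def gen_error_bound(mutual_info, kl_shift, n, sigma=1.0):
    """Illustrative sqrt-type bound combining the algorithm's input-output
    mutual information and the source/target domain divergence.
    A generic sketch in the style of information-theoretic generalization
    bounds, NOT the paper's exact theorem."""
    return np.sqrt(2.0 * sigma ** 2 * (mutual_info / n + kl_shift))

# Source domain mu = N(0, 1), target domain mu' = N(0.5, 1):
# the domain shift contributes a fixed divergence penalty.
shift = kl_gaussian(0.5, 1.0, 0.0, 1.0)   # D(mu' || mu) = 0.125
bound = gen_error_bound(mutual_info=2.0, kl_shift=shift, n=100)
print(shift, bound)
```

The point of the sketch is the structure of the bound: the mutual-information term shrinks as the sample size n grows, while the KL term persists as an irreducible penalty for training and testing on different distributions.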
Abstract
Transfer learning, or domain adaptation, is concerned with machine learning problems in which training and testing data come from possibly different distributions (denoted as μ and μ′, respectively). In this work …