BriefGPT.xyz
Mar, 2020
A Unified Mutual Information View of Metric Learning: Cross-Entropy vs. Pairwise Losses
Metric learning: cross-entropy vs. pairwise losses
Malik Boudiaf, Jérôme Rony, Imtiaz Masud Ziko, Eric Granger, Marco Pedersoli...
TL;DR
Through theoretical analysis, this work establishes relationships between the cross-entropy loss and several well-known pairwise losses, and shows that minimizing cross-entropy can be viewed as an approximate bound-optimization algorithm for these losses, thereby avoiding complex optimization heuristics such as pairwise sample mining. The approach achieves state-of-the-art results on four standard deep metric learning benchmarks, outperforming recent and more complex DML methods.
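To make the comparison concrete, below is a minimal NumPy sketch contrasting the two loss families the paper relates: a standard cross-entropy loss computed through a linear classifier, and a simple contrastive pairwise loss computed over all sample pairs. The embeddings, labels, and classifier weights are toy placeholders, and the pairwise formulation shown is the classic contrastive loss with a margin, not the paper's specific derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy embeddings (n samples, d dims) and binary labels -- illustrative only.
z = rng.normal(size=(6, 4))
y = np.array([0, 0, 0, 1, 1, 1])

# --- Family 1: cross-entropy through a (hypothetical) linear classifier W ---
W = rng.normal(size=(4, 2))
logits = z @ W
logits -= logits.max(axis=1, keepdims=True)           # numerical stability
p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
ce = -np.log(p[np.arange(len(y)), y]).mean()

# --- Family 2: a contrastive pairwise loss over all sample pairs ---
margin = 1.0
pair_losses = []
for i in range(len(y)):
    for j in range(i + 1, len(y)):
        d = np.linalg.norm(z[i] - z[j])
        if y[i] == y[j]:
            pair_losses.append(d ** 2)                     # pull same-class pairs together
        else:
            pair_losses.append(max(0.0, margin - d) ** 2)  # push different-class pairs apart
pairwise = float(np.mean(pair_losses))

print(f"cross-entropy: {ce:.3f}  pairwise contrastive: {pairwise:.3f}")
```

Note that the pairwise loss requires iterating over all pairs (and, in practice, mining informative ones), whereas cross-entropy needs only per-sample labels; this is the implementation simplicity the paper's bound-optimization view justifies.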
Abstract
Recently, substantial research efforts in deep metric learning (DML) focused on designing complex pairwise-distance losses and convoluted sample-mining and implementation strategies to ease …