BriefGPT.xyz
Aug, 2020
A Flexible Selection Scheme for Minimum-Effort Transfer Learning
Amelie Royer, Christoph H. Lampert
TL;DR
This paper proposes a new variant of fine-tuning, called flex-tuning, for handling real-world data that differs from the pre-training source but is semantically close to it. Experiments show that, compared with the conventional practice, adapting an intermediate or early unit works better in many domain-transfer scenarios.
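A minimal sketch of the idea, assuming a PyTorch/torchvision setup (the ResNet-18 backbone, the candidate unit names, and the flex_tune_unit helper are illustrative choices, not the authors' implementation): a single selected unit of the pre-trained network is made trainable while all other weights stay frozen, and that unit can sit anywhere in the network rather than only at the end.

```python
# Sketch (assumption, not the paper's code): adapt one chosen unit of a
# pre-trained network and freeze everything else.
import torch
import torch.nn as nn
from torchvision import models

def flex_tune_unit(model: nn.Module, unit_name: str) -> list:
    """Freeze every parameter except those of the selected unit."""
    for param in model.parameters():
        param.requires_grad = False
    unit = dict(model.named_modules())[unit_name]
    trainable = []
    for param in unit.parameters():
        param.requires_grad = True
        trainable.append(param)
    return trainable

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Candidate units range from early to late parts of the network; the choice
# of an intermediate block ("layer2") here is purely illustrative.
candidate_units = ["layer1", "layer2", "layer3", "layer4", "fc"]
params = flex_tune_unit(model, "layer2")
optimizer = torch.optim.SGD(params, lr=1e-3, momentum=0.9)
```

In the full selection scheme, each candidate unit would be adapted briefly on the target data and the best-performing choice kept; the snippet only shows how one chosen unit is made trainable.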
Abstract
Fine-tuning is a popular way of exploiting knowledge contained in a pre-trained convolutional network for a new visual recognition task. However, the orthogonal setting of transferring knowledge from a pretrained …