BriefGPT.xyz
Feb 2021
AlphaNet: Improved Training of Supernet with Alpha-Divergence
Dilin Wang, Chengyue Gong, Meng Li, Qiang Liu, Vikas Chandra
TL;DR
This work proposes an improved supernet training method based on the more general alpha-divergence, which avoids both over- and under-estimation of model uncertainty. Applied to neural architecture search (NAS), it achieves strong results across a wide range of FLOPs regimes.
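As background for the summary above, the alpha-divergence family generalizes the KL divergence commonly used for knowledge distillation between the supernet and its sub-networks. A minimal sketch of the discrete Amari alpha-divergence follows; this is an illustrative formula implementation, not the paper's actual training code, and the function name and epsilon smoothing are assumptions:

```python
import numpy as np

def alpha_divergence(p, q, alpha, eps=1e-12):
    """Amari alpha-divergence D_alpha(p || q) between two discrete
    probability distributions. The limit alpha -> 1 recovers KL(p || q)
    and alpha -> 0 recovers KL(q || p); intermediate or negative alpha
    interpolates between mass-covering and mode-seeking behavior.
    Illustrative sketch only, not the AlphaNet implementation."""
    p = np.asarray(p, dtype=float) + eps  # smooth to avoid log/0 issues
    q = np.asarray(q, dtype=float) + eps
    if abs(alpha - 1.0) < 1e-6:
        return float(np.sum(p * np.log(p / q)))  # KL(p || q)
    if abs(alpha) < 1e-6:
        return float(np.sum(q * np.log(q / p)))  # KL(q || p)
    # General case: (sum_i p_i^alpha * q_i^(1-alpha) - 1) / (alpha * (alpha - 1))
    return float((np.sum(p**alpha * q**(1.0 - alpha)) - 1.0)
                 / (alpha * (alpha - 1.0)))
```

Varying alpha trades off how strongly mismatches where p > q versus p < q are penalized, which is the lever the paper uses to control over- versus under-estimation of uncertainty.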
Abstract
Weight-sharing neural architecture search (NAS) is an effective technique for automating efficient neural architecture design. Weight-sharing NAS builds a supernet that assembles all the architectures as its sub-networks.