Jan, 2022
Unifying and Boosting Gradient-Based Training-Free Neural Architecture Search
Yao Shu, Zhongxiang Dai, Zhaoxuan Wu, Bryan Kian Hsiang Low
TL;DR
This paper proposes a unified theoretical analysis framework for gradient-based training-free neural architecture search (NAS) methods, which is used to study the relationships among these methods, to provide theoretical guarantees on their generalization performance, and to develop a new framework called Hybrid NAS (HNAS) that can, in principle, consistently boost training-free NAS.
Abstract
Neural architecture search (NAS) has gained immense popularity owing to its ability to automate neural architecture design. A number of training-free metrics have recently been proposed to realize NAS without training, …
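As a concrete illustration (not taken from the paper), a gradient-based training-free metric can be as simple as scoring an untrained candidate architecture by the norm of its loss gradients on a single mini-batch, so that candidates are ranked without any gradient steps. The sketch below assumes PyTorch and a toy classification setup; the function name and candidate models are hypothetical.

```python
# Minimal sketch of a gradient-based training-free metric: score an
# untrained network by the total norm of its parameter gradients on one
# mini-batch (GradNorm-style), then pick the highest-scoring candidate.
import torch
import torch.nn as nn

def gradient_norm_score(model: nn.Module, inputs: torch.Tensor,
                        targets: torch.Tensor) -> float:
    """Sum of parameter-gradient norms at initialization (no training)."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(inputs), targets)
    loss.backward()
    return sum(p.grad.norm().item()
               for p in model.parameters() if p.grad is not None)

if __name__ == "__main__":
    # Toy usage: rank two candidate architectures on one random batch;
    # no weights are ever updated.
    x = torch.randn(8, 32)
    y = torch.randint(0, 10, (8,))
    candidates = {
        "shallow": nn.Sequential(nn.Linear(32, 10)),
        "deeper": nn.Sequential(nn.Linear(32, 64), nn.ReLU(),
                                nn.Linear(64, 10)),
    }
    scores = {name: gradient_norm_score(m, x, y)
              for name, m in candidates.items()}
    print(max(scores, key=scores.get), scores)
```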